Fundamentals of Information Theory · Communications · GATE ECE
Marks 1
GATE ECE 2024
A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is _______ .
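A quick numeric check: entropy is maximized when all symbols are equally likely, giving $H_{max} = \log_2 M$ bits for an alphabet of size $M$.

```python
from math import log2

# Maximum entropy of an M-symbol source is reached when all symbols
# are equiprobable: H_max = log2(M) bits.
M = 16
H_max = log2(M)
print(H_max)  # 4.0 bits
```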
GATE ECE 2017 Set 2
Which one of the following graphs shows the Shannon capacity (channel capacity) in bits of a memoryless binary symmetric channel with crossover proba...
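The shape of the curve follows from the standard BSC result $C(p) = 1 - H_b(p)$, where $H_b$ is the binary entropy function. A minimal sketch of the key values:

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a memoryless BSC with crossover probability p."""
    return 1.0 - h2(p)

# The curve is symmetric about p = 0.5 (where C = 0) and equals
# 1 bit/use at both p = 0 and p = 1.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, round(bsc_capacity(p), 4))
```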
GATE ECE 2017 Set 1
Let $$\left( {{X_1},\,{X_2}} \right)$$ be independent random variables, $${X_1}$$ has mean 0 and variance 1, while $${X_2}$$ has mean 1 and variance 4...
GATE ECE 2016 Set 3
An analog baseband signal, band limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur inde...
GATE ECE 2016 Set 2
A discrete memoryless source has an alphabet $$({a_1},\,{a_2},\,{a_3},\,{a_4})\,$$ with corresponding probabilities $$\left( {{1 \over 2}\,\,,{1 \over ...
GATE ECE 2012
A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of th...
GATE ECE 2012
In a base band communications link, frequencies up to 3500 Hz are used for signaling. Using a raised cosine pulse with 75% excess bandwidth and for no...
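For no-ISI (Nyquist) signaling with a raised-cosine pulse, the occupied baseband bandwidth is $B = R_s(1+\alpha)/2$, so the maximum symbol rate is $R_s = 2B/(1+\alpha)$. A numeric sketch with the question's values:

```python
# Raised-cosine signaling over a baseband link: the occupied bandwidth
# is B = Rs * (1 + alpha) / 2, so Rs = 2B / (1 + alpha).
B = 3500.0      # available baseband bandwidth, Hz
alpha = 0.75    # excess-bandwidth (roll-off) factor
Rs = 2 * B / (1 + alpha)
print(Rs)  # 4000.0 symbols per second
```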
GATE ECE 2011
An analog signal is band-limited to 4 kHz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantized levels are assumed to...
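A numeric check, assuming the usual reading that the four levels are independent and equiprobable (the stem is truncated here): each sample then carries $\log_2 4 = 2$ bits.

```python
from math import log2

# Nyquist sampling of a 4 kHz signal with 4 equiprobable, independent
# quantization levels (assumed): information rate = fs * bits/sample.
fm = 4000.0                  # signal bandwidth, Hz
fs = 2 * fm                  # Nyquist sampling rate, samples/s
bits_per_sample = log2(4)    # 2 bits per sample
info_rate = fs * bits_per_sample
print(info_rate)  # 16000.0 bits per second
```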
Marks 2
GATE ECE 2023
The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" que...
GATE ECE 2022
The transition diagram of a discrete memoryless channel with three input symbols and three output symbols is shown in the figure. The transition proba...
GATE ECE 2022
Consider communication over a memoryless binary symmetric channel using a (7, 4) Hamming code. Each transmitted bit is received correctly with probabi...
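Since a (7, 4) Hamming code corrects any single bit error in a block, the block is decoded correctly when zero or one of the seven bits flip. A sketch of the computation; the per-bit success probability below is an assumed placeholder, since the question's value is truncated:

```python
from math import comb

# (7, 4) Hamming code over a BSC: block decodes correctly iff
# 0 or 1 of the 7 transmitted bits are flipped.
p = 0.9                              # assumed per-bit success probability
q = 1 - p                            # per-bit error probability
p_block_ok = p**7 + comb(7, 1) * q * p**6
p_block_err = 1 - p_block_ok
print(round(p_block_err, 6))
```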
GATE ECE 2017 Set 2
Consider a binary memoryless channel characterized by the transition probability diagram shown in the figure.
The channel is...
GATE ECE 2016 Set 1
Consider a discrete memoryless source with alphabet $$S = \left\{ {{s_0},\,{s_1},\,{s_2},\,{s_3},\,{s_4}, \ldots } \right\}$$ and respective probabilit...
GATE ECE 2016 Set 3
A voice-grade AWGN (additive white Gaussian noise) telephone channel has a bandwidth of 4.0 kHz and two-sided noise power spectral density $${\eta \o...
GATE ECE 2016 Set 2
A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events:
$${x_0}$$ : a " ze...
GATE ECE 2014 Set 1
A fair coin is tossed repeatedly until a 'Head' appears for the first time. Let L be the number of tosses to get this first 'Head'. The entropy H (L) ...
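Here $L$ is geometric: $P(L = k) = (1/2)^k$ for $k = 1, 2, \ldots$, and since $-\log_2 (1/2)^k = k$, the entropy reduces to $H(L) = \sum_k k\,(1/2)^k = 2$ bits. A numeric check over a truncated sum:

```python
# Entropy of the geometric number of tosses for a fair coin:
# H(L) = sum over k of k * (1/2)^k = 2 bits exactly.
H = sum(k * 0.5**k for k in range(1, 200))
print(round(H, 6))  # ≈ 2.0 bits
```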
GATE ECE 2014 Set 4
Consider the Z- channel given in the figure. The input is 0 or 1 with equal probability.
If the output is 0, the probability that the input is also 0...
GATE ECE 2014 Set 2
The capacity of band-limited additive white Gaussian noise (AWGN) channel is given by $$C = \,W\,\,{\log _2}\left( {1 + {P \over {{\sigma ^2}\,W}}} \r...
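From the stated formula $C = W\log_2\!\left(1 + \tfrac{P}{\sigma^2 W}\right)$, capacity does not grow without bound as $W \to \infty$: it approaches $(P/\sigma^2)\log_2 e$. A numeric illustration (the values of $P$ and $\sigma^2$ below are assumed for the sketch):

```python
from math import log2, e

P = 1.0        # signal power (assumed for illustration)
sigma2 = 1.0   # noise PSD parameter (assumed for illustration)

def capacity(W):
    """AWGN capacity C = W * log2(1 + P / (sigma2 * W))."""
    return W * log2(1 + P / (sigma2 * W))

limit = (P / sigma2) * log2(e)   # asymptotic capacity as W -> infinity
for W in (1.0, 10.0, 1e3, 1e6):
    print(W, round(capacity(W), 6))
print("limit:", round(limit, 6))  # ≈ 1.442695
```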
GATE ECE 2009
A communication channel with AWGN operating at a signal-to-noise ratio SNR >> 1 and bandwidth B has capacity $${{C_1}}$$. If the SNR is doubl...
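For SNR >> 1, $C \approx B\log_2(\text{SNR})$, so doubling the SNR adds $B\log_2 2 = B$ bits/s: $C_2 \approx C_1 + B$. A numeric check with assumed values:

```python
from math import log2

B = 1e4        # bandwidth, Hz (assumed for illustration)
snr = 1e3      # SNR >> 1 (assumed for illustration)

# Exact capacities before and after doubling the SNR:
C1 = B * log2(1 + snr)
C2 = B * log2(1 + 2 * snr)
print(round(C2 - C1))  # ≈ B = 10000 bits/s
```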
GATE ECE 2008
A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
GATE ECE 2006
A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols,...
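A numeric check: the source entropy is $H = 1.5$ bits/symbol, so at 3000 symbols/s the information rate is 4500 bits/s.

```python
from math import log2

# Entropy of the three-symbol source and resulting information rate.
probs = [0.25, 0.25, 0.50]
H = -sum(p * log2(p) for p in probs)   # 1.5 bits/symbol
rate = 3000 * H                        # symbols/s * bits/symbol
print(H, rate)  # 1.5, 4500.0
```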
GATE ECE 2001
A video transmission system transmits 625 picture frames per second. Each frame consists of a $$400\,\, \times \,\,400$$ pixel grid with 64 intensity ...
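Assuming the 64 intensity levels are equally likely, each pixel carries $\log_2 64 = 6$ bits, and the raw rate is frames/s × pixels/frame × bits/pixel:

```python
from math import log2

# Raw data rate of the video stream with equiprobable intensity levels.
frames = 625                 # frames per second
pixels = 400 * 400           # pixels per frame
bits_per_pixel = log2(64)    # 6 bits for 64 equally likely levels
rate = frames * pixels * bits_per_pixel
print(rate)  # 600000000.0 bits/s = 600 Mbit/s
```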
GATE ECE 1991
A binary source has symbol probabilities 0.8 and 0.2. If extension coding (blocks of 4 symbols) is used, the lower and upper bounds on the average cod...
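The bounds follow from the source coding theorem applied to the $k$-th extension: $kH \le \bar{L} < kH + 1$, where $\bar{L}$ is the average codeword length per block of $k$ symbols. A numeric sketch:

```python
from math import log2

# Source coding bounds for the 4th extension of a binary source.
p = [0.8, 0.2]
H = -sum(x * log2(x) for x in p)   # ≈ 0.7219 bits/symbol
k = 4                              # block (extension) length
lower, upper = k * H, k * H + 1    # bounds on average code length per block
print(round(lower, 4), round(upper, 4))
```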
GATE ECE 1990
An image uses $$512\, \times \,512$$ picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximu...
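With 8 equally likely intensity levels, each pixel carries $\log_2 8 = 3$ bits, so a full image carries at most $512 \times 512 \times 3$ bits:

```python
from math import log2

# Maximum information content of one image.
pixels = 512 * 512            # picture elements per image
bits = pixels * log2(8)       # 3 bits per pixel for 8 levels
print(bits)  # 786432.0 bits per image
```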
GATE ECE 1989
A source produces 4 symbols with probabilities $${1 \over 2},\,{1 \over 4},\,{1 \over 8}$$ and $${1 \over 8}.$$ For this source, a practical coding ...
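Because the probabilities are dyadic, a prefix code with lengths (1, 2, 3, 3), e.g. a Huffman code, meets the entropy bound exactly. A numeric check:

```python
from math import log2

# Dyadic source: entropy equals the average length of the optimal
# prefix code with codeword lengths (1, 2, 3, 3).
probs = [1/2, 1/4, 1/8, 1/8]
lengths = [1, 2, 3, 3]
H = -sum(p * log2(p) for p in probs)
L_bar = sum(p * l for p, l in zip(probs, lengths))
print(H, L_bar)  # both 1.75 bits/symbol
```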