1
GATE ECE 2025
MCQ (Single Correct Answer)
+2
-0.67

A source transmits symbol $S$ that takes values uniformly at random from the set $\{-2,0,2\}$. The receiver obtains $Y=S+N$, where $N$ is a zero-mean Gaussian random variable independent of $S$. The receiver uses the maximum likelihood decoder to estimate the transmitted symbol $S$.

Suppose the probability of symbol estimation error $P_e$ is expressed as follows:

$$ P_e=\alpha P(N>1), $$

where $P(N>1)$ denotes the probability that $N$ exceeds 1.

What is the value of $\alpha$ ?

A
$\frac{1}{3}$
B
1
C
$\frac{2}{3}$
D
$\frac{4}{3}$
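
A minimal Monte Carlo sketch (not part of the original question) that estimates the ratio $P_e / P(N>1)$ for the nearest-symbol ML decision rule described above. The noise standard deviation `sigma`, the seed, and the trial count are arbitrary simulation choices; the ratio does not depend on `sigma`, since the decision thresholds stay at $\pm 1$ for any Gaussian noise level.

```python
import math

import numpy as np

# Monte Carlo sketch: estimate alpha = P_e / P(N > 1) for the ML decoder.
# sigma and n_trials are arbitrary simulation choices (assumptions).
rng = np.random.default_rng(0)
symbols = np.array([-2.0, 0.0, 2.0])
sigma = 1.0
n_trials = 2_000_000

s = rng.choice(symbols, size=n_trials)            # uniform source symbols
y = s + rng.normal(0.0, sigma, size=n_trials)     # received samples Y = S + N

# Equal priors + Gaussian noise: the ML decoder picks the nearest symbol,
# i.e. decision thresholds at -1 and +1.
s_hat = symbols[np.argmin(np.abs(y[:, None] - symbols[None, :]), axis=1)]

p_e = np.mean(s_hat != s)                                    # empirical P_e
p_n_gt_1 = 0.5 * math.erfc(1.0 / (sigma * math.sqrt(2.0)))   # exact P(N > 1)
print("estimated alpha:", p_e / p_n_gt_1)
```
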
2
GATE ECE 2025
MCQ (Single Correct Answer)
+2
-0.67

Consider a real-valued random process

$$ f(t)=\sum\limits_{n=1}^N a_n p(t-n T), $$

where $T>0$ and $N$ is a positive integer. Here, $p(t)=1$ for $t \in[0,0.5 T]$ and 0 otherwise. The coefficients $a_n$ are pairwise independent, zero-mean unit-variance random variables. Read the following statements about the random process and choose the correct option.

(i) The mean of the process $f(t)$ is independent of time $t$.

(ii) The autocorrelation function $E[f(t) f(t+\tau)]$ is independent of time $t$ for all $\tau$. (Here, $E[\cdot]$ is the expectation operation.)

A
(i) is TRUE and (ii) is FALSE
B
Both (i) and (ii) are TRUE
C
Both (i) and (ii) are FALSE
D
(i) is FALSE and (ii) is TRUE
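
A quick numerical sketch (not part of the original question) that estimates the ensemble mean and the second moment $E[f(t)^2]$ (the autocorrelation at $\tau = 0$) at two time instants, one falling inside a pulse and one falling in a gap between pulses. The values $T = 1$, $N = 4$, the Gaussian choice for the $a_n$, and the realization count are assumptions made only for the simulation; only the zero mean, unit variance, and pairwise independence of the $a_n$ matter for the statements (i) and (ii).

```python
import numpy as np

# Numerical sketch: sample many realizations of f(t) and compare the mean
# and E[f(t)^2] at two time instants. T, N, the Gaussian model for a_n and
# n_real are simulation assumptions.
rng = np.random.default_rng(1)
T, N, n_real = 1.0, 4, 200_000
a = rng.standard_normal((n_real, N))      # zero-mean, unit-variance, independent

def pulse_weights(t):
    """Vector [p(t - nT) for n = 1..N], with p = indicator of [0, 0.5T]."""
    n = np.arange(1, N + 1)
    return (((t - n * T) >= 0) & ((t - n * T) <= 0.5 * T)).astype(float)

for t in (1.25 * T, 1.75 * T):            # inside vs. outside the n = 1 pulse
    f_t = a @ pulse_weights(t)            # one sample of f(t) per realization
    print(f"t/T = {t / T:.2f}: mean ~ {f_t.mean():+.4f}, "
          f"E[f(t)^2] ~ {np.mean(f_t ** 2):.4f}")
```
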
3
GATE ECE 2025
MCQ (More than One Correct Answer)
+2
-0

The random variable $X$ takes values in $\{-1,0,1\}$ with probabilities $P(X=-1)=P(X=1)=\alpha$ and $P(X=0)=1-2 \alpha$, where $0<\alpha<\frac{1}{2}$.

Let $g(\alpha)$ denote the entropy of $X$ (in bits), parameterized by $\alpha$. Which of the following statements is/are TRUE?

A
$g(0.4)>g(0.3)$
B
$g(0.3)>g(0.4)$
C
$g(0.3)>g(0.25)$
D
$g(0.25)>g(0.3)$
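
A short sketch (not part of the original question) that evaluates the entropy $g(\alpha) = -2\alpha \log_2 \alpha - (1 - 2\alpha)\log_2(1 - 2\alpha)$ at the $\alpha$ values used in the options, so the inequalities can be checked numerically.

```python
import numpy as np

# Entropy of X in bits as a function of alpha, using the pmf
# P(X = -1) = P(X = 1) = alpha, P(X = 0) = 1 - 2*alpha.
def g(alpha):
    p = np.array([alpha, alpha, 1.0 - 2.0 * alpha])
    p = p[p > 0]                         # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

for alpha in (0.25, 0.3, 0.4):
    print(f"g({alpha}) = {g(alpha):.4f} bits")
```
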
4
GATE ECE 2025
Numerical
+2
-0

$X$ and $Y$ are Bernoulli random variables taking values in $\{0,1\}$. The joint probability mass function of the random variables is given by:

$$ \begin{aligned} & P(X=0, Y=0)=0.06 \\ & P(X=0, Y=1)=0.14 \\ & P(X=1, Y=0)=0.24 \\ & P(X=1, Y=1)=0.56 \end{aligned} $$

The mutual information $I(X ; Y)$ is ___________ (rounded off to two decimal places).
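
A short sketch (not part of the original question) that computes $I(X ; Y)$ in bits directly from the given joint pmf, using $I(X ; Y) = \sum_{x, y} P(x, y) \log_2 \frac{P(x, y)}{P(x) P(y)}$.

```python
import numpy as np

# Mutual information in bits from the joint pmf of (X, Y).
p_xy = np.array([[0.06, 0.14],
                 [0.24, 0.56]])          # rows: X = 0, 1; columns: Y = 0, 1
p_x = p_xy.sum(axis=1, keepdims=True)    # marginal of X, shape (2, 1)
p_y = p_xy.sum(axis=0, keepdims=True)    # marginal of Y, shape (1, 2)

mi = np.sum(p_xy * np.log2(p_xy / (p_x * p_y)))
mi = max(mi, 0.0)                        # clip tiny negative round-off; I(X;Y) >= 0
print(f"I(X;Y) = {mi:.2f} bits")
```
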
