Unit 6 - Practice Quiz

ECE180

1 What is the primary cause of thermal noise in a resistor?

A. Random emission of electrons
B. Random thermal motion of electrons
C. Improper doping of the semiconductor
D. External electromagnetic interference

2 The Mean Square Noise Voltage generated by a resistor $R$ at temperature $T$ (Kelvin) over a bandwidth $B$ is given by:

A. $\overline{v_n^2} = 2kTRB$
B. $\overline{v_n^2} = 4kTRB$
C. $\overline{v_n^2} = kTB$
D. $\overline{v_n^2} = 4kTR/B$
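
A quick numeric sanity check of option B, the standard Johnson-Nyquist result $\overline{v_n^2} = 4kTRB$; the resistance, temperature, and bandwidth values below are illustrative choices, not taken from the quiz.

```python
import math

k = 1.380649e-23          # Boltzmann constant (J/K)
T = 290.0                 # standard reference temperature (K)
R = 1e3                   # resistance in ohms (illustrative)
B = 1e6                   # bandwidth in hertz (illustrative)

v_sq = 4 * k * T * R * B  # mean square noise voltage, 4kTRB
print(f"Mean square noise voltage: {v_sq:.3e} V^2")
print(f"RMS noise voltage:         {math.sqrt(v_sq):.3e} V")  # ~4 microvolts
```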

3 Which of the following characteristics best describes thermal noise?

A. It is band-limited noise
B. It is impulsive noise
C. It is white noise (constant PSD)
D. It is strictly low-frequency noise

4 If two resistors $R_1$ and $R_2$ are connected in series, what is the effective noise resistance?

A. $R_1 R_2$
B. $R_1 + R_2$
C. $\frac{R_1 R_2}{R_1 + R_2}$
D. $\sqrt{R_1 R_2}$

5 The standard reference temperature used in noise figure calculations is typically:

A. 0 K
B. 100 K
C. 273 K
D. 290 K

6 The Noise Figure ($F$) of a network is defined as:

A. The ratio of output SNR to input SNR
B. The ratio of input SNR to output SNR
C. The difference between input SNR and output SNR
D. The product of input SNR and output SNR

7 The relationship between Noise Figure ($F$) and Effective Noise Temperature ($T_e$), with $T_0 = 290$ K the reference temperature, is given by:

A. $T_e = (F - 1)T_0$
B. $T_e = F T_0$
C. $T_e = T_0 / F$
D. $T_e = (F + 1)T_0$

8 For a cascaded system of two stages with noise figures $F_1$ and $F_2$ and gains $G_1$ and $G_2$, the Friis formula for the total noise figure is:

A. $F = F_1 + F_2$
B. $F = F_1 F_2$
C. $F = F_1 + \frac{F_2 - 1}{G_1}$
D. $F = F_1 + \frac{F_2 - 1}{G_1 G_2}$
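
A minimal sketch of option C, the two-stage Friis formula; the gains and noise figures are made-up linear values for illustration. Note how little the second stage adds when the first-stage gain is high, which is exactly the point of Question 9 below.

```python
# Two-stage Friis formula: F_total = F1 + (F2 - 1) / G1
# All quantities are linear ratios, not dB.
F1, G1 = 2.0, 100.0   # first stage: NF = 3 dB, gain = 20 dB (illustrative)
F2 = 10.0             # second stage: NF = 10 dB (illustrative)

F_total = F1 + (F2 - 1) / G1
print(f"Total noise figure: {F_total:.2f} (linear)")  # 2.09, barely above F1
```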

9 In a cascaded amplifier system, which stage contributes most significantly to the overall system noise performance?

A. The last stage
B. The intermediate stage
C. The first stage
D. All stages contribute equally

10 The Noise Equivalent Bandwidth ($B_N$) of a system with transfer function $H(f)$ and maximum gain $H_0$ is defined as:

A. $B_N = \int_0^\infty |H(f)|\,df$
B. $B_N = \frac{1}{H_0^2}\int_0^\infty |H(f)|^2\,df$
C. $B_N = \int_0^\infty |H(f)|^2\,df$
D. $B_N = H_0^2 \int_0^\infty |H(f)|^2\,df$
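
A numerical check of the definition in option B, using a single-pole RC low-pass filter with $|H(f)|^2 = 1/(1 + (f/f_c)^2)$, whose noise equivalent bandwidth is known in closed form to be $(\pi/2)f_c$; the cutoff frequency below is an arbitrary choice.

```python
import numpy as np

fc = 1e3                              # 3 dB cutoff in Hz (illustrative)
f = np.arange(0.0, 1e6, 0.5)          # frequency grid, truncated at 1 MHz
H2 = 1.0 / (1.0 + (f / fc) ** 2)      # |H(f)|^2 for a single-pole RC low-pass

# B_N = (1 / H0^2) * integral of |H(f)|^2 df, with H0 = max|H(f)| = 1 here
df = f[1] - f[0]
B_N = (H2.sum() - 0.5 * (H2[0] + H2[-1])) * df / H2.max()  # trapezoidal rule

print(f"Numerical B_N:    {B_N:.1f} Hz")             # ~1569.8 (truncation loses a little)
print(f"Analytic pi*fc/2: {np.pi * fc / 2:.1f} Hz")  # 1570.8
```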

11 Narrow band noise $n(t)$ is generally represented in quadrature form as:

A. $n(t) = n_I(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t)$
B. $n(t) = n_I(t)\cos(2\pi f_c t) + n_Q(t)\sin(2\pi f_c t)$
C. $n(t) = n_I(t)\,n_Q(t)\cos(2\pi f_c t)$
D. $n(t) = n_I(t) + n_Q(t)$

12 If a narrow band noise has zero mean, what is the mean value of its quadrature components $n_I(t)$ and $n_Q(t)$?

A. Zero
B. Unity
C. Depends on bandwidth
D. Infinite

13 If the variance of the narrow band noise $n(t)$ is $\sigma^2$, what is the variance of the in-phase component $n_I(t)$?

A. $\sigma^2$
B. $\sigma^2/2$
C. $2\sigma^2$
D. $\sigma^4$

14 The Power Spectral Density (PSD) of the in-phase component, $S_{N_I}(f)$, is related to the PSD of the narrow band noise, $S_N(f)$, by:

A. $S_{N_I}(f) = S_N(f)$
B. $S_{N_I}(f) = \frac{1}{2}\left[S_N(f - f_c) + S_N(f + f_c)\right]$
C. $S_{N_I}(f) = S_N(f - f_c) + S_N(f + f_c)$ for $|f| \le B$
D. $S_{N_I}(f) = 2 S_N(f)$

15 For a Gaussian narrow band noise, the probability density function (PDF) of the envelope follows a:

A. Gaussian distribution
B. Uniform distribution
C. Rayleigh distribution
D. Ricean distribution

16 For a Gaussian narrow band noise, the phase $\psi$ is distributed:

A. Normally between $0$ and $2\pi$
B. Uniformly between $0$ and $2\pi$
C. Rayleigh distributed
D. Exponentially

17 If a constant sine wave signal is added to narrow band Gaussian noise, the envelope of the resultant signal follows a:

A. Gaussian distribution
B. Rayleigh distribution
C. Ricean distribution
D. Poisson distribution

18 The information content of an event with probability $p_k$ is defined as:

A. $I_k = \log_2 p_k$
B. $I_k = \log_2(1/p_k)$
C. $I_k = p_k \log_2 p_k$
D. $I_k = 1/p_k$

19 The unit of information when the logarithm base is 2 is:

A. Nat
B. Decit
C. Bit
D. Hartley

20 Entropy of a discrete random variable represents:

A. The maximum amplitude of the signal
B. The average information content per symbol
C. The bandwidth of the signal
D. The transmission error rate

21 The entropy of a source is maximum when the symbol probabilities are:

A. All zero
B. All one
C. Equiprobable (Uniform)
D. Exponentially distributed

22 If a source has $M$ equiprobable symbols, its entropy is:

A. $\log_2 M$
B. $M$
C. $1/M$
D. $0$
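
Option A follows directly from the entropy definition: with $M$ equiprobable symbols each $p_i = 1/M$, so $H = \sum_{i=1}^{M} \frac{1}{M}\log_2 M = \log_2 M$. A one-line check ($M = 8$ is an arbitrary choice):

```python
import math

M = 8  # number of equiprobable symbols (illustrative)
H = sum((1 / M) * math.log2(M) for _ in range(M))
print(H, math.log2(M))  # both print 3.0 bits/symbol
```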

23 Information Rate $R$ is defined as:

A. $R = rH$, where $r$ is the symbol rate
B. $R = H/r$
C. $R = r/H$
D. $R = r + H$

24 Calculate the entropy of a source with two symbols having probabilities $0.25$ and $0.75$.

A. 1 bit/symbol
B. 0.811 bits/symbol
C. 0.5 bits/symbol
D. 0.25 bits/symbol
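
The arithmetic behind option B, assuming the intended symbol probabilities are 0.25 and 0.75 (reconstructed from the answer choices):

```python
import math

p = [0.25, 0.75]
H = -sum(pi * math.log2(pi) for pi in p)
print(f"H = {H:.3f} bits/symbol")  # 0.811
```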

25 What is the primary goal of Source Coding (e.g., Huffman)?

A. To increase redundancy for error correction
B. To reduce the average number of bits per symbol
C. To increase the bandwidth
D. To increase the signal power

26 A code is said to be a Prefix Code (or Instantaneous Code) if:

A. All codewords have the same length
B. No codeword is a prefix of another codeword
C. It contains a start bit and stop bit
D. The length of codewords increases with probability

27 Which of the following coding techniques guarantees the lowest average code length (optimal) for symbol-by-symbol coding?

A. Shannon-Fano Coding
B. Huffman Coding
C. ASCII Coding
D. Binary Coded Decimal

28 In Huffman coding, high probability symbols are assigned:

A. Longer codewords
B. Shorter codewords
C. Codes starting with 1
D. Codes starting with 0

29 The efficiency of a source code, with $H$ the source entropy and $\bar{L}$ the average codeword length, is given by:

A. $\eta = \bar{L}/H$
B. $\eta = H/\bar{L}$
C. $\eta = H\bar{L}$
D. $\eta = H - \bar{L}$
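
Questions 25 through 29 all concern source coding, so a single sketch covers them: a minimal Huffman coder (the symbol probabilities are arbitrary illustrative values) showing that high-probability symbols receive shorter codewords and that the resulting efficiency $\eta = H/\bar{L}$ is close to 1.

```python
import heapq
import math
from itertools import count

def huffman(probs):
    """Return {symbol: codeword} for a dict of symbol probabilities."""
    tiebreak = count()  # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # merge the two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}
codes = huffman(probs)
L_avg = sum(probs[s] * len(w) for s, w in codes.items())  # 1.75 bits/symbol
H = -sum(p * math.log2(p) for p in probs.values())        # ~1.743 bits/symbol
print(codes)                             # "a" gets the shortest codeword
print(f"efficiency = {H / L_avg:.3f}")   # ~0.996
```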

30 The Mutual Information $I(X;Y)$ represents:

A. The uncertainty remaining in X after observing Y
B. The information shared between variables X and Y
C. The sum of entropies of X and Y
D. The noise in the channel

31 Mutual Information can be expressed in terms of entropy as:

A. $I(X;Y) = H(X) + H(Y)$
B. $I(X;Y) = H(X) - H(X|Y)$
C. $I(X;Y) = H(X|Y) - H(X)$
D. $I(X;Y) = H(X)H(Y)$

32 If X and Y are independent random variables, their Mutual Information $I(X;Y)$ is:

A. $0$
B. $1$
C. $H(X) + H(Y)$
D. Undefined
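
A small sketch of the definition $I(X;Y) = \sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)p(y)}$; for the factorizable (independent) joint pmf below it returns zero, matching option A. Both distributions are arbitrary illustrative choices.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]        # marginal of X
    py = [sum(col) for col in zip(*joint)]  # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

independent = [[0.12, 0.28], [0.18, 0.42]]  # p(x,y) = p(x)p(y) throughout
correlated  = [[0.40, 0.10], [0.10, 0.40]]
print(mutual_information(independent))  # 0.0 (up to float rounding)
print(mutual_information(correlated))   # ~0.278 bits
```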

33 The Channel Capacity of a discrete memoryless channel is defined as:

A. The maximum of Mutual Information over all input distributions
B. The bandwidth of the channel
C. The signal-to-noise ratio
D. The minimum entropy of the source

34 For a Binary Symmetric Channel (BSC) with error probability $p$, the capacity is:

A. $C = 1 + H(p)$
B. $C = 1 - H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$
C. $C = H(p)$
D. $C = p$
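
Option B evaluated numerically; the capacity is 1 bit/use for a deterministic channel ($p = 0$ or $p = 1$) and falls to zero at $p = 0.5$, where the output is independent of the input:

```python
import math

def bsc_capacity(p):
    """C = 1 - H(p) for a BSC with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0  # deterministic channel: the full 1 bit/use gets through
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5):
    print(p, round(bsc_capacity(p), 4))  # 1.0, 0.531, 0.0
```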

35 What is the capacity of a noiseless binary channel?

A. 0 bits/use
B. 0.5 bits/use
C. 1 bit/use
D. Infinite

36 The Shannon-Hartley Law for the capacity of a Gaussian channel with bandwidth $B$ and signal-to-noise ratio $S/N$ is:

A. $C = B \log_2(S/N)$
B. $C = B \log_2(1 + S/N)$
C. $C = B(1 + S/N)$
D. $C = \log_2(1 + S/N)$
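
A direct encoding of option B, taking the SNR in dB for convenience; the same function resolves Question 44 below.

```python
import math

def capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley: C = B log2(1 + S/N), with S/N supplied in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

print(f"{capacity(3000, 30):,.0f} bits/sec")  # ~29,902 (telephone-channel example)
```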

37 According to the Shannon-Hartley law, if the Bandwidth $B$ tends to infinity, the capacity $C$:

A. Becomes infinite
B. Approaches zero
C. Approaches a finite limit proportional to Signal Power
D. Oscillates

38 The Shannon limit for reliable communication states that the minimum required $E_b/N_0$ is:

A. 0 dB
B. -1.6 dB ($\ln 2$)
C. 3 dB
D. 10 dB
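
The value in option B comes from letting $B \to \infty$ in the Shannon-Hartley law, which gives the requirement $E_b/N_0 > \ln 2 \approx 0.693$; in decibels that is $10\log_{10}(\ln 2) \approx -1.6$ dB. A one-line check:

```python
import math
print(10 * math.log10(math.log(2)))  # -1.59 dB, the Shannon limit
```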

39 The trade-off between Bandwidth and SNR in the Shannon-Hartley law implies that:

A. Bandwidth and SNR are linearly exchangeable
B. Bandwidth can be traded for SNR logarithmically
C. No trade-off is possible
D. Increasing Bandwidth always reduces Capacity

40 Kraft's Inequality is a necessary and sufficient condition for the existence of:

A. A uniquely decodable prefix code
B. A channel with zero error
C. A Gaussian noise source
D. Infinite bandwidth

41 In Shannon-Fano coding, the first step is to:

A. Arrange symbols in decreasing order of probability
B. Arrange symbols in increasing order of probability
C. Assign 0 to all symbols
D. Calculate the cumulative distribution function

42 Redundancy in a code is defined as:

A. $1 - \eta$ (where $\eta$ is the efficiency)
B. $\eta - 1$
C. $\eta$
D. $1/\eta$

43 The conditional entropy $H(Y|X)$ represents:

A. The average uncertainty about Y given X is known
B. The average uncertainty about X given Y is known
C. The joint uncertainty of X and Y
D. The information lost in the channel

44 If a channel has a bandwidth of 3000 Hz and an SNR of 30 dB, the approximate capacity is:

A. 30,000 bits/sec
B. 3,000 bits/sec
C. 9,965 bits/sec
D. 1,000 bits/sec
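
Worked check: an SNR of 30 dB is a linear ratio of $10^{30/10} = 1000$, so $C = 3000\log_2(1 + 1000) \approx 3000 \times 9.97 \approx 29{,}900$ bits/sec, closest to option A; this is the `capacity(3000, 30)` call shown after Question 36.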

45 Which noise source is associated with the discrete nature of charge carriers crossing a barrier?

A. Thermal noise
B. Shot noise
C. Flicker noise
D. Cosmic noise

46 Flicker noise, or 1/f noise, becomes dominant at:

A. Very high frequencies
B. Low frequencies
C. Microwave frequencies
D. Temperatures near absolute zero

47 The correlation between the in-phase component $n_I(t)$ and the quadrature component $n_Q(t)$ of narrow band noise at the same time instant is:

A. 0 (Uncorrelated)
B. 1 (Fully correlated)
C. Infinite
D. 0.5

48 If the Noise Figure of an amplifier is 3 dB, what is the output SNR if the input SNR is 20 dB?

A. 23 dB
B. 17 dB
C. 60 dB
D. 6.66 dB
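
Worked check: in decibels the noise figure subtracts directly from the input SNR, so $\text{SNR}_{out} = 20 - 3 = 17$ dB, option B. The same computation in linear terms:

```python
import math

f_linear = 10 ** (3 / 10)    # NF of 3 dB as a linear ratio (~2)
snr_in = 10 ** (20 / 10)     # input SNR of 20 dB -> 100
snr_out = snr_in / f_linear  # definition: F = SNR_in / SNR_out
print(f"{10 * math.log10(snr_out):.1f} dB")  # 17.0 dB
```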

49 What is the entropy of a deterministic event (Probability = 1)?

A. 1 bit
B. 0 bits
C. Infinite
D. 0.5 bits

50 The quantity $H(X,Y)$ is known as:

A. Joint Entropy
B. Conditional Entropy
C. Mutual Information
D. Differential Entropy