1. What is the primary cause of thermal noise in a resistor?
A. Random emission of electrons
B. Random thermal motion of electrons
C. Improper doping of the semiconductor
D. External electromagnetic interference
Correct Answer: Random thermal motion of electrons
Explanation: Thermal noise, also known as Johnson-Nyquist noise, is generated by the random thermal agitation of charge carriers (electrons) inside an electrical conductor at equilibrium.
2. The Mean Square Noise Voltage generated by a resistor $R$ at temperature $T$ (Kelvin) over a bandwidth $B$ is given by:
A. $\overline{v_n^2} = 2kTRB$
B. $\overline{v_n^2} = 4kTRB$
C. $\overline{v_n^2} = kTRB$
D. $\overline{v_n^2} = 8kTRB$
Correct Answer: $\overline{v_n^2} = 4kTRB$
Explanation: According to the Johnson-Nyquist noise formula, the mean square voltage is $\overline{v_n^2} = 4kTRB$, where $k$ is Boltzmann's constant.
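As a quick numeric check of the $4kTRB$ formula, here is a minimal Python sketch; the 1 kΩ resistor and 1 MHz bandwidth are illustrative values, not from the question:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def thermal_noise_vrms(R, T, B):
    """RMS Johnson-Nyquist noise voltage: sqrt(4kTRB)."""
    return math.sqrt(4 * k * T * R * B)

# Illustrative values: 1 kOhm resistor at the standard 290 K over 1 MHz
print(thermal_noise_vrms(R=1e3, T=290, B=1e6))  # ~4.0e-06 V, about 4 uV RMS
```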
3. Which of the following characteristics best describes thermal noise?
A. It is band-limited noise
B. It is impulsive noise
C. It is white noise (constant PSD)
D. It is strictly low-frequency noise
Correct Answer: It is white noise (constant PSD)
Explanation: Thermal noise has a Power Spectral Density (PSD) that is constant across a very wide range of frequencies, effectively making it white noise for most practical radio frequency applications.
4. If two resistors $R_1$ and $R_2$ are connected in series, what is the effective noise resistance?
A. $R_1 R_2 / (R_1 + R_2)$
B. $R_1 + R_2$
C. $\sqrt{R_1 R_2}$
D. $(R_1 + R_2)/2$
Correct Answer: $R_1 + R_2$
Explanation: Since the noise voltages generated by separate resistors are uncorrelated, their mean square voltages add up. This results in an equivalent resistance of $R_1 + R_2$ generating the noise.
5. The standard reference temperature used in noise figure calculations is typically:
A. 0 K
B. 100 K
C. 273 K
D. 290 K
Correct Answer: 290 K
Explanation: The standard IEEE reference temperature for noise figure measurements is 290 Kelvin (approximately room temperature).
6. The Noise Figure ($F$) of a network is defined as:
A. The ratio of output SNR to input SNR
B. The ratio of input SNR to output SNR
C. The difference between input SNR and output SNR
D. The product of input SNR and output SNR
Correct Answer: The ratio of input SNR to output SNR
Explanation: Noise Figure is defined as $F = \mathrm{SNR}_{in} / \mathrm{SNR}_{out}$. Since the network adds noise, $\mathrm{SNR}_{out} < \mathrm{SNR}_{in}$, making $F > 1$.
7. The relationship between Noise Figure ($F$) and Effective Noise Temperature ($T_e$) is given by:
A. $T_e = T_0 (F - 1)$
B. $T_e = T_0 F$
C. $T_e = T_0 / (F - 1)$
D. $T_e = T_0 (F + 1)$
Correct Answer: $T_e = T_0 (F - 1)$
Explanation: The effective noise temperature is related to the noise factor by $T_e = (F - 1) T_0$, where $T_0 = 290$ K is the standard reference temperature.
8. For a cascaded system of two stages with noise figures $F_1$ and $F_2$ and gains $G_1$ and $G_2$, the Friis formula for the total noise figure is:
A. $F = F_1 + F_2$
B. $F = F_1 + \frac{F_2 - 1}{G_1}$
C. $F = F_1 F_2$
D. $F = F_1 + \frac{F_2 - 1}{G_1 G_2}$
Correct Answer: $F = F_1 + \frac{F_2 - 1}{G_1}$
Explanation: Friis formula states that the total noise factor is dominated by the first stage: $F = F_1 + \frac{F_2 - 1}{G_1}$.
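A small sketch of the two-stage Friis formula in Python. Note that it operates on linear noise factors and gains, so the dB values chosen here (assumed for illustration) are converted first:

```python
import math

def db_to_lin(x_db):
    return 10 ** (x_db / 10)

def friis_two_stage(F1, F2, G1):
    """Two-stage Friis formula with linear quantities: F = F1 + (F2 - 1)/G1."""
    return F1 + (F2 - 1) / G1

# Illustrative cascade: LNA (NF 1 dB, gain 20 dB) followed by a mixer (NF 10 dB)
F = friis_two_stage(db_to_lin(1.0), db_to_lin(10.0), db_to_lin(20.0))
print(10 * math.log10(F))  # ~1.3 dB total: the first stage dominates
```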
9. In a cascaded amplifier system, which stage contributes most significantly to the overall system noise performance?
A. The last stage
B. The intermediate stage
C. The first stage
D. All stages contribute equally
Correct Answer: The first stage
Explanation: According to the Friis formula, the noise contribution of subsequent stages is divided by the gain of the preceding stages. Therefore, the first stage (usually the LNA) is the most critical.
10. The Noise Equivalent Bandwidth ($B_N$) of a system with transfer function $H(f)$ and maximum gain $H_{max}$ is defined as:
A. $B_N = \int_0^\infty |H(f)|^2 \, df$
B. $B_N = \frac{1}{H_{max}^2} \int_0^\infty |H(f)|^2 \, df$
C. $B_N = \frac{1}{H_{max}} \int_0^\infty |H(f)| \, df$
D. $B_N = H_{max}^2 \int_0^\infty |H(f)|^2 \, df$
Correct Answer: $B_N = \frac{1}{H_{max}^2} \int_0^\infty |H(f)|^2 \, df$
Explanation: The Noise Equivalent Bandwidth is the bandwidth of an ideal rectangular filter that would pass the same amount of white noise power as the actual filter, normalized to the maximum power gain $H_{max}^2$.
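The definition can be checked numerically. The sketch below assumes a first-order low-pass filter with unity maximum gain (an illustrative choice), integrates $|H(f)|^2$, and recovers the textbook result $B_N = (\pi/2) f_{3dB}$:

```python
import numpy as np

f3 = 1e3                            # 3 dB cutoff of a first-order low-pass, Hz
f = np.arange(0.0, 1000 * f3, 0.5)  # integrate far beyond the cutoff
H2 = 1.0 / (1.0 + (f / f3) ** 2)    # |H(f)|^2; maximum power gain is 1 here

B_N = np.sum(H2) * 0.5              # Riemann sum approximating the integral
print(B_N, np.pi / 2 * f3)          # both ~1570 Hz: B_N = (pi/2) * f_3dB
```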
11. Narrow band noise is generally represented in quadrature form as:
A. $n(t) = n_I(t)\cos(2\pi f_c t) + n_Q(t)\sin(2\pi f_c t)$
B. $n(t) = n_I(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t)$
C. $n(t) = n_I(t)\sin(2\pi f_c t) - n_Q(t)\cos(2\pi f_c t)$
D. $n(t) = n_I(t)\, n_Q(t)\cos(2\pi f_c t)$
Correct Answer: $n(t) = n_I(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t)$
Explanation: This is the standard canonical representation of narrow band noise, where $n_I(t)$ is the in-phase component and $n_Q(t)$ is the quadrature component.
12. If a narrow band noise has zero mean, what is the mean value of its quadrature components $n_I(t)$ and $n_Q(t)$?
A. Zero
B. Unity
C. Depends on bandwidth
D. Infinite
Correct Answer: Zero
Explanation: The in-phase and quadrature components of a zero-mean stationary Gaussian narrow band noise process also have zero mean.
13. If the variance of the narrow band noise $n(t)$ is $\sigma^2$, what is the variance of the in-phase component $n_I(t)$?
A. $\sigma^2 / 2$
B. $\sigma^2$
C. $2\sigma^2$
D. $\sigma^4$
Correct Answer: $\sigma^2$
Explanation: For narrow band noise, the variance (power) of the in-phase component, the quadrature component, and the total noise process are equal: $\sigma_{n_I}^2 = \sigma_{n_Q}^2 = \sigma^2$.
14. The Power Spectral Density (PSD) of the in-phase component $S_{N_I}(f)$ is related to the PSD of the narrow band noise $S_N(f)$ by:
A. $S_{N_I}(f) = S_N(f - f_c)$
B. $S_{N_I}(f) = S_N(f + f_c)$
C. $S_{N_I}(f) = S_N(f - f_c) + S_N(f + f_c)$ for $|f| \le B$
D. $S_{N_I}(f) = 2 S_N(f)$
Correct Answer: $S_{N_I}(f) = S_N(f - f_c) + S_N(f + f_c)$ for $|f| \le B$
Explanation: The PSD of the low-pass components ($n_I(t)$ and $n_Q(t)$) is obtained by shifting the bandpass PSD to the origin (both positive and negative frequency parts shift) for frequencies $|f| \le B$.
15. For a Gaussian narrow band noise, the probability density function (PDF) of the envelope follows a:
A. Gaussian distribution
B. Uniform distribution
C. Rayleigh distribution
D. Ricean distribution
Correct Answer: Rayleigh distribution
Explanation: If $n_I$ and $n_Q$ are independent Gaussian variables with zero mean and equal variance, the envelope $r = \sqrt{n_I^2 + n_Q^2}$ follows a Rayleigh distribution.
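A short Monte Carlo sketch of this fact: drawing independent zero-mean Gaussian $n_I$ and $n_Q$ and forming the envelope reproduces the known Rayleigh mean $\sigma\sqrt{\pi/2}$ (the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, N = 1.0, 1_000_000
nI = rng.normal(0.0, sigma, N)  # in-phase component
nQ = rng.normal(0.0, sigma, N)  # quadrature component
r = np.hypot(nI, nQ)            # envelope r = sqrt(nI^2 + nQ^2)

# A Rayleigh variable with scale sigma has mean sigma * sqrt(pi/2) ~ 1.2533
print(r.mean(), sigma * np.sqrt(np.pi / 2))
```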
16. For a Gaussian narrow band noise, the phase is distributed:
A. Normally between $0$ and $2\pi$
B. Uniformly between $0$ and $2\pi$
C. Rayleigh distributed
D. Exponentially
Correct Answer: Uniformly between $0$ and $2\pi$
Explanation: Due to the circular symmetry of the joint Gaussian distribution of $n_I$ and $n_Q$, the phase angle is uniformly distributed over the interval $(0, 2\pi)$.
17. If a constant sine wave signal is added to narrow band Gaussian noise, the envelope of the resultant signal follows a:
A. Gaussian distribution
B. Rayleigh distribution
C. Ricean distribution
D. Poisson distribution
Correct Answer: Ricean distribution
Explanation: When a deterministic sinusoidal signal exists along with narrowband Gaussian noise, the envelope distribution changes from Rayleigh to Ricean (or Rician).
18. The information content of an event with probability $p$ is defined as:
A. $I = \log_2 p$
B. $I = \log_2(1/p)$
C. $I = p \log_2 p$
D. $I = 1/p$
Correct Answer: $I = \log_2(1/p)$
Explanation: Information content is inversely related to probability. The definition is $I = \log_2(1/p) = -\log_2 p$. This ensures lower probability events convey more information.
19. The unit of information when the logarithm base is 2 is:
A. Nat
B. Decit
C. Bit
D. Hartley
Correct Answer: Bit
Explanation: Log base 2 corresponds to 'Bits'. Base $e$ gives 'Nats', and base 10 gives 'Hartleys' or 'Decits'.
20. Entropy of a discrete random variable represents:
A. The maximum amplitude of the signal
B. The average information content per symbol
C. The bandwidth of the signal
D. The transmission error rate
Correct Answer: The average information content per symbol
Explanation: Entropy is the expected value (average) of the information content of the possible outcomes: $H(X) = -\sum_i p_i \log_2 p_i$.
21. The entropy of a source is maximum when the symbol probabilities are:
A. All zero
B. All one
C. Equiprobable (Uniform)
D. Exponentially distributed
Correct Answer: Equiprobable (Uniform)
Explanation: Entropy is maximized when there is maximum uncertainty, which occurs when all symbols are equally likely (uniform distribution).
22. If a source has $M$ equiprobable symbols, its entropy is:
A. $\log_2 M$
B. $M$
C. $1/M$
D. $0$
Correct Answer: $\log_2 M$
Explanation: For $M$ equiprobable symbols, $p_i = 1/M$, so $H = -\sum_{i=1}^{M} \frac{1}{M} \log_2 \frac{1}{M} = \log_2 M$.
23. Information Rate $R$ is defined as:
A. $R = rH$, where $r$ is the symbol rate
B. $R = H/r$
C. $R = r/H$
D. $R = r + H$
Correct Answer: $R = rH$, where $r$ is the symbol rate
Explanation: The information rate is the product of the rate at which symbols are generated ($r$ symbols/sec) and the average information per symbol ($H$ bits/symbol).
24. Calculate the entropy of a source with two symbols having probabilities $0.25$ and $0.75$.
A. 1 bit/symbol
B. 0.811 bits/symbol
C. 0.5 bits/symbol
D. 0.25 bits/symbol
Correct Answer: 0.811 bits/symbol
Explanation: $H = -0.25 \log_2 0.25 - 0.75 \log_2 0.75$. The first term is $0.25 \times 2 = 0.5$; the second is $0.75 \times 0.415 \approx 0.311$. Total $H \approx 0.811$ bits/symbol.
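The arithmetic in questions 22 and 24 can be reproduced with a small entropy helper, sketched here in Python:

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H = -sum p*log2(p) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.75]))  # ~0.811 bits/symbol (this question)
print(entropy([0.5, 0.5]))    # 1.0 bit/symbol: the equiprobable maximum for M = 2
```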
25. What is the primary goal of Source Coding (e.g., Huffman)?
A. To increase redundancy for error correction
B. To reduce the average number of bits per symbol
C. To increase the bandwidth
D. To increase the signal power
Correct Answer: To reduce the average number of bits per symbol
Explanation: Source coding aims to compress data by removing redundancy, thereby minimizing the average code length required to represent the source symbols.
26. A code is said to be a Prefix Code (or Instantaneous Code) if:
A. All codewords have the same length
B. No codeword is a prefix of another codeword
C. It contains a start bit and stop bit
D. The length of codewords increases with probability
Correct Answer: No codeword is a prefix of another codeword
Explanation: The prefix property ensures that a codeword can be decoded as soon as its bits are received, without waiting for subsequent bits, ensuring instantaneous decoding.
27. Which of the following coding techniques guarantees the lowest average code length (optimal) for symbol-by-symbol coding?
A. Shannon-Fano Coding
B. Huffman Coding
C. ASCII Coding
D. Binary Coded Decimal
Correct Answer: Huffman Coding
Explanation: Huffman coding is proven to be optimal for symbol-by-symbol coding, producing the shortest average code length for a given probability distribution.
28. In Huffman coding, high probability symbols are assigned:
A. Longer codewords
B. Shorter codewords
C. Codes starting with 1
D. Codes starting with 0
Correct Answer: Shorter codewords
Explanation: To minimize the average length, Huffman coding assigns shorter bit sequences to more frequent symbols and longer sequences to less frequent ones.
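As an illustration of questions 25-28, here is a compact Huffman coder, a sketch built on Python's heapq with illustrative symbol probabilities. It repeatedly merges the two least probable groups, so frequent symbols end up with short codewords:

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # the two least probable groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

L = sum(probs[s] * len(w) for s, w in code.items())
print(L)     # average length 1.75 bits/symbol, equal to the entropy here
```

Since the average length equals the entropy for this dyadic source, the efficiency $\eta = H/L$ of question 29 is exactly 1.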
29. The efficiency of a source code is given by:
A. $\eta = L/H$
B. $\eta = H/L$
C. $\eta = H \cdot L$
D. $\eta = H - L$
Correct Answer: $\eta = H/L$
Explanation: Efficiency is the ratio of the source entropy $H$ (the theoretical minimum length) to the actual average code length $L$ produced by the coding scheme.
30. The Mutual Information $I(X; Y)$ represents:
A. The uncertainty remaining in X after observing Y
B. The information shared between variables X and Y
C. The sum of entropies of X and Y
D. The noise in the channel
Correct Answer: The information shared between variables X and Y
Explanation: Mutual information quantifies the reduction in uncertainty about one variable given knowledge of the other: $I(X; Y) = H(X) - H(X|Y)$.
31. Mutual Information can be expressed in terms of entropy as:
A. $I(X; Y) = H(X) - H(X|Y)$
B. $I(X; Y) = H(X) + H(X|Y)$
C. $I(X; Y) = H(X|Y) - H(X)$
D. $I(X; Y) = H(X) \cdot H(X|Y)$
Correct Answer: $I(X; Y) = H(X) - H(X|Y)$
Explanation: Mutual information is the initial uncertainty of X minus the uncertainty of X remaining after observing Y.
32. If X and Y are independent random variables, their Mutual Information is:
A. $H(X)$
B. $H(X) + H(Y)$
C. $0$
D. $1$
Correct Answer: $0$
Explanation: If variables are independent, knowing Y gives no information about X. Thus, the reduction in uncertainty is zero.
33. The Channel Capacity of a discrete memoryless channel is defined as:
A. The maximum of Mutual Information over all input distributions
B. The bandwidth of the channel
C. The signal-to-noise ratio
D. The minimum entropy of the source
Correct Answer: The maximum of Mutual Information over all input distributions
Explanation: Channel Capacity is the maximum rate at which information can be reliably transmitted, defined mathematically as $C = \max_{p(x)} I(X; Y)$.
34. For a Binary Symmetric Channel (BSC) with error probability $p$, the capacity is:
A. $C = 1 + H(p)$
B. $C = 1 - H(p)$
C. $C = H(p)$
D. $C = 1 - p$
Correct Answer: $C = 1 - H(p)$
Explanation: The capacity of a BSC is 1 (bit) minus the binary entropy of the error probability, $H(p) = -p \log_2 p - (1 - p) \log_2(1 - p)$.
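A sketch of the BSC capacity as a function, which also confirms the noiseless case of question 35:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H(p)."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 bit/use: noiseless (question 35)
print(bsc_capacity(0.11))  # ~0.5 bits/use
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```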
35. What is the capacity of a noiseless binary channel?
A. 0 bits/use
B. 0.5 bits/use
C. 1 bit/use
D. Infinite
Correct Answer: 1 bit/use
Explanation: For a noiseless binary channel, the error probability $p = 0$, so $C = 1 - H(0) = 1$ bit per channel use.
36. The Shannon-Hartley Law for the capacity of a Gaussian channel with bandwidth $B$ and signal-to-noise ratio $S/N$ is:
A. $C = B \log_2(S/N)$
B. $C = B \log_2(1 + S/N)$
C. $C = B \log_{10}(1 + S/N)$
D. $C = 2B \log_2(1 + S/N)$
Correct Answer: $C = B \log_2(1 + S/N)$
Explanation: This is the fundamental limit theorem for the capacity of a continuous-time Additive White Gaussian Noise (AWGN) channel.
37. According to the Shannon-Hartley law, if the Bandwidth $B$ tends to infinity, the capacity $C$:
A. Becomes infinite
B. Approaches zero
C. Approaches a finite limit proportional to Signal Power
D. Oscillates
Correct Answer: Approaches a finite limit proportional to Signal Power
Explanation: As $B \to \infty$, $C \to \frac{S}{N_0} \log_2 e \approx 1.44 \frac{S}{N_0}$, where $N_0$ is the noise power spectral density. It does not go to infinity because the noise power $N = N_0 B$ also increases with bandwidth.
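The finite limit is easy to see numerically. With noise power $N = N_0 B$, the sketch below (using an illustrative $S/N_0$ value) shows $C(B)$ saturating at $\frac{S}{N_0}\log_2 e$:

```python
import math

S_over_N0 = 1000.0  # S / N0, illustrative value

def capacity(B):
    """Shannon-Hartley with noise power N = N0 * B."""
    return B * math.log2(1 + S_over_N0 / B)

for B in (1e2, 1e3, 1e5, 1e8):
    print(f"B = {B:>12.0f} Hz  ->  C = {capacity(B):8.1f} bits/s")

print(S_over_N0 * math.log2(math.e))  # the finite limit, ~1442.7 bits/s
```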
38. The Shannon limit for reliable communication states that the minimum required $E_b/N_0$ is:
A. 0 dB
B. -1.6 dB ($\ln 2$)
C. 3 dB
D. 10 dB
Correct Answer: -1.6 dB ($\ln 2$)
Explanation: The absolute minimum energy per bit to noise power spectral density ratio is $\ln 2$ (approx 0.693), which corresponds to -1.59 dB (often rounded to -1.6 dB).
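The -1.6 dB figure follows directly from $\ln 2$; a one-line check:

```python
import math

# Minimum Eb/N0 for reliable communication is ln 2 ~ 0.693 (linear)
print(10 * math.log10(math.log(2)))  # ~ -1.59 dB, usually quoted as -1.6 dB
```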
39. The trade-off between Bandwidth and SNR in the Shannon-Hartley law implies that:
A. Bandwidth and SNR are linearly exchangeable
B. Bandwidth can be traded for SNR logarithmically
C. No trade-off is possible
D. Increasing Bandwidth always reduces Capacity
Correct Answer: Bandwidth can be traded for SNR logarithmically
Explanation: To maintain the same capacity, a reduction in SNR (power) requires a much larger (exponential) increase in Bandwidth; conversely, a small increase in bandwidth allows a significant reduction in power.
40. Kraft's Inequality is a necessary and sufficient condition for the existence of:
A. A uniquely decodable prefix code
B. A channel with zero error
C. A Gaussian noise source
D. Infinite bandwidth
Correct Answer: A uniquely decodable prefix code
Explanation: Kraft's inequality relates the lengths of the codewords in a prefix code. If $\sum_i 2^{-l_i} \le 1$, a binary prefix code exists with lengths $l_i$.
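A minimal check of Kraft's inequality for candidate codeword lengths (the length sets are illustrative):

```python
def kraft_sum(lengths, D=2):
    """Kraft sum for codeword lengths over a D-ary alphabet."""
    return sum(D ** -l for l in lengths)

# <= 1 means a prefix code with these lengths exists
print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> exists (e.g. 0, 10, 110, 111)
print(kraft_sum([1, 1, 2]))     # 1.25 -> no binary prefix code possible
```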
41. In Shannon-Fano coding, the first step is to:
A. Arrange symbols in decreasing order of probability
B. Arrange symbols in increasing order of probability
C. Assign 0 to all symbols
D. Calculate the cumulative distribution function
Correct Answer: Arrange symbols in decreasing order of probability
Explanation: Shannon-Fano coding starts by sorting symbols by probability (high to low) and then recursively splitting the list into two sets with approximately equal total probabilities.
42. Redundancy in a code is defined as:
A. $1 - \eta$ (where $\eta$ is efficiency)
B. $\eta - 1$
C. $1/\eta$
D. $\eta^2$
Correct Answer: $1 - \eta$ (where $\eta$ is efficiency)
Explanation: Redundancy measures the fraction of the code length that is not carrying unique information. It is the complement of efficiency.
43. The conditional entropy $H(Y|X)$ represents:
A. The average uncertainty about Y given X is known
B. The average uncertainty about X given Y is known
C. The joint uncertainty of X and Y
D. The information lost in the channel
Correct Answer: The average uncertainty about Y given X is known
Explanation: This is the definition of conditional entropy. It quantifies the remaining uncertainty of Y when X is observed.
44. If a channel has a bandwidth of 3000 Hz and an SNR of 30 dB, the approximate capacity is:
A. 30,000 bits/sec
B. 3,000 bits/sec
C. 9,965 bits/sec
D. 1,000 bits/sec
Correct Answer: 30,000 bits/sec
Explanation: Use Shannon-Hartley: $C = B \log_2(1 + S/N)$. An SNR of 30 dB is a power ratio of 1000, so $C = 3000 \log_2(1001) \approx 3000 \times 9.97 \approx 29{,}900$ bps, i.e. approximately 30,000 bps.
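The arithmetic of this question, as a short sketch:

```python
import math

B = 3000.0               # Hz
snr = 10 ** (30.0 / 10)  # 30 dB -> power ratio of 1000

C = B * math.log2(1 + snr)
print(C)                 # ~29,902 bits/sec, i.e. roughly 30,000 bps
```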
45. Which noise source is associated with the discrete nature of charge carriers crossing a barrier?
A. Thermal noise
B. Shot noise
C. Flicker noise
D. Cosmic noise
Correct Answer: Shot noise
Explanation: Shot noise arises in devices like diodes and transistors due to the random fluctuations in the arrival time of discrete electrons crossing a potential barrier.
46. Flicker noise, or 1/f noise, becomes dominant at:
A. Very high frequencies
B. Low frequencies
C. Microwave frequencies
D. Temperatures near absolute zero
Correct Answer: Low frequencies
Explanation: Flicker noise power spectral density is inversely proportional to frequency ($S(f) \propto 1/f$), making it significant primarily at low frequencies.
47. The correlation between the in-phase component $n_I(t)$ and the quadrature component $n_Q(t)$ of narrow band noise at the same time instant is:
A. 0 (Uncorrelated)
B. 1 (Fully correlated)
C. Infinite
D. 0.5
Correct Answer: 0 (Uncorrelated)
Explanation: For stationary narrow band noise symmetric about the center frequency, the in-phase and quadrature components are uncorrelated at the same time instant ($E[n_I(t)\, n_Q(t)] = 0$).
48. If the Noise Figure of an amplifier is 3 dB, what is the output SNR if the input SNR is 20 dB?
A. 23 dB
B. 17 dB
C. 60 dB
D. 6.66 dB
Correct Answer: 17 dB
Explanation: In dB, $\mathrm{SNR}_{out} = \mathrm{SNR}_{in} - NF$. So $\mathrm{SNR}_{out} = 20 - 3 = 17$ dB.
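The dB bookkeeping from this question, cross-checked in linear units (a minimal sketch):

```python
import math

snr_in_db, nf_db = 20.0, 3.0
print(snr_in_db - nf_db)            # 17.0 dB: dB ratios subtract

# Linear cross-check: SNR_out = SNR_in / F
snr_in = 10 ** (snr_in_db / 10)     # 100
F = 10 ** (nf_db / 10)              # ~1.995 (3 dB)
print(10 * math.log10(snr_in / F))  # ~17.0 dB again
```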
49. What is the entropy of a deterministic event (Probability = 1)?
A. 1 bit
B. 0 bits
C. Infinite
D. 0.5 bits
Correct Answer: 0 bits
Explanation: A deterministic event has no uncertainty: $H = -1 \cdot \log_2 1 = 0$.
50. The quantity $H(X, Y)$ is known as:
A. Joint Entropy
B. Conditional Entropy
C. Mutual Information
D. Differential Entropy
Correct Answer: Joint Entropy
Explanation: $H(X, Y)$ measures the average uncertainty associated with the pair of random variables $(X, Y)$: $H(X, Y) = -\sum_{x,y} p(x, y) \log_2 p(x, y)$.