Unit 6 - Notes

ECE180

Unit 6: Noise Sources & Information Theory

1. Noise Sources and Characterization

Noise is defined as any unwanted electrical signal that interferes with the transmission, processing, or reception of information signals.

1.1 Resistive / Thermal Noise (Johnson-Nyquist Noise)

Thermal noise is generated by the random thermal motion of electrons inside a resistor. It exists in all resistive components regardless of applied voltage.

  • Origin: Random collision of electrons with atoms due to thermal energy.

  • Characteristics: It is White Noise (constant power spectral density) up to extremely high frequencies (near optical).

  • Mean Value: The average noise voltage is zero, E[v_n(t)] = 0.

  • Variance (Mean Square Value):
    For a resistor R at temperature T (Kelvin) over a bandwidth B (Hz):

    E[v_n²] = 4kTRB  (V²)

    Where:

    • k: Boltzmann’s constant (1.38 × 10⁻²³ J/K)
    • T: Absolute temperature in Kelvin
    • R: Resistance in Ohms
    • B: Bandwidth in Hz
  • Circuit Models:

    1. Thevenin Equivalent: A noise voltage source v_n (E[v_n²] = 4kTRB) in series with a noiseless resistor R.
    2. Norton Equivalent: A noise current source i_n = v_n/R (E[i_n²] = 4kTB/R) in parallel with a noiseless resistor R.
  • Maximum Power Transfer:
    The maximum noise power N_max delivered to a matched load (R_L = R) is:

    N_max = kTB  (W)

  • Power Spectral Density (PSD):

    • Two-sided PSD: S_v(f) = 2kTR  (V²/Hz)
    • One-sided PSD: S_v(f) = 4kTR  (V²/Hz) for f ≥ 0
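
As a quick numerical check of the formulas above, here is a minimal sketch (Python; the resistor value, temperature, and bandwidth are assumed for illustration):

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

# Illustrative values (assumed): 10 kOhm resistor at 290 K over a 10 kHz bandwidth
R = 10e3    # resistance, ohms
T = 290.0   # absolute temperature, K
B = 10e3    # bandwidth, Hz

v_n_sq = 4 * k * T * R * B      # mean-square noise voltage, V^2
v_n_rms = math.sqrt(v_n_sq)     # RMS noise voltage, V
N_max = k * T * B               # available noise power into a matched load, W
psd_one_sided = 4 * k * T * R   # one-sided PSD, V^2/Hz

print(f"RMS noise voltage : {v_n_rms:.3e} V")
print(f"Available power   : {N_max:.3e} W")
print(f"One-sided PSD     : {psd_one_sided:.3e} V^2/Hz")
```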

1.2 Arbitrary Noise Sources

Apart from thermal noise, other common noise sources include:

  1. Shot Noise:
    • Arises in active devices (diodes, transistors) due to the discrete nature of charge carriers (electrons/holes) crossing a potential barrier.
    • Formula: E[i_n²] = 2qI_DC·B  (A²)
    • Where q is the electron charge and I_DC is the DC current.
  2. Flicker Noise (1/f Noise):
    • Dominant at low frequencies.
    • PSD is inversely proportional to frequency (S(f) ∝ 1/f).
  3. Burst Noise (Popcorn Noise):
    • Sudden step-like transitions in current/voltage levels in semiconductors.
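
The same kind of quick evaluation works for shot noise; a minimal sketch (the DC current and bandwidth are assumed for illustration):

```python
import math

q = 1.602e-19   # electron charge, C

# Illustrative values (assumed): 1 mA DC current, 10 kHz bandwidth
I_dc = 1e-3     # DC current, A
B = 10e3        # bandwidth, Hz

i_n_sq = 2 * q * I_dc * B      # mean-square shot-noise current, A^2
i_n_rms = math.sqrt(i_n_sq)    # RMS shot-noise current, A

print(f"RMS shot-noise current: {i_n_rms:.3e} A")
```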

1.3 Effective Noise Temperature (T_e)

A convenient way to characterize the noise performance of a device or system.

  • Definition: The effective noise temperature of a device is the temperature at which a hypothetical resistor (equal to the input resistance of the device) would generate the same thermal noise power as the device adds internally.
  • Formula:
    The noise power N_a added by the device (referred to its output) is N_a = GkT_eB, so:

    T_e = N_a / (GkB)

    Usually expressed as:

    T_e = (F − 1) T_0

    Where G is the power gain, F is the average noise figure (defined in 2.1), and T_0 = 290 K is the standard reference temperature.
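
A minimal sketch relating these quantities (the gain, bandwidth, and noise figure of the device are assumed for illustration):

```python
k = 1.380649e-23   # Boltzmann's constant, J/K
T0 = 290.0         # standard reference temperature, K

# Illustrative device (assumed): 20 dB gain, 1 MHz bandwidth, 3 dB noise figure
G_db, B, F_db = 20.0, 1e6, 3.0
G = 10 ** (G_db / 10)   # linear power gain
F = 10 ** (F_db / 10)   # linear noise figure

T_e = (F - 1) * T0      # effective noise temperature, K
N_a = G * k * T_e * B   # noise power added by the device (referred to its output), W

print(f"T_e = {T_e:.1f} K, added noise power N_a = {N_a:.3e} W")
```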

1.4 Noise Equivalent Bandwidth (B_N)

Actual filters do not have perfectly rectangular frequency responses. B_N equates an actual filter to an ideal rectangular filter.

  • Definition: The bandwidth of an ideal rectangular filter that passes the same noise power as the actual filter, assuming the maximum gain of both filters is the same.
  • Formula:

    B_N = (1 / |H(f)|²_max) ∫₀^∞ |H(f)|² df

    • H(f): Transfer function of the system.
    • |H(f)|_max: Maximum magnitude of the transfer function.
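
For a concrete case, the sketch below numerically evaluates B_N for a first-order (single-pole) low-pass filter, where the closed-form result is B_N = (π/2)·f_3dB; the cutoff frequency is assumed.

```python
import numpy as np

# Illustrative filter (assumed): first-order low-pass with a 1 kHz cutoff
f_3db = 1e3
f = np.linspace(0, 1000 * f_3db, 1_000_001)   # frequency grid, Hz (finite stand-in for infinity)

H_sq = 1.0 / (1.0 + (f / f_3db) ** 2)   # |H(f)|^2 of a single-pole filter
H_sq_max = H_sq.max()                   # maximum of |H(f)|^2 (equals 1 at DC here)

B_N = np.trapz(H_sq, f) / H_sq_max      # noise equivalent bandwidth, Hz

print(f"Numerical B_N : {B_N:.1f} Hz")
print(f"(pi/2)*f_3dB  : {np.pi / 2 * f_3db:.1f} Hz")
```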

2. Noise Figure Characterization

2.1 Average Noise Figure (F)

A figure of merit measuring the degradation of the Signal-to-Noise Ratio (SNR) as a signal passes through a system.

  • Definition: The ratio of the input SNR to the output SNR:

    F = (S_i/N_i) / (S_o/N_o)

  • Since S_o = G·S_i and N_o = G·N_i + N_a, where G is the power gain:

    F = N_o / (G·N_i) = 1 + N_a / (G·N_i)

  • Ideally: F = 1 (0 dB) for a noiseless system.
  • Relation to Noise Temperature:

    F = 1 + T_e / T_0

    Where T_0 is the standard reference temperature (usually 290 K).

2.2 Average Noise Figure of Cascaded Networks (Friis Formula)

When amplifiers are connected in series (cascade), the total noise figure depends heavily on the first stage.

  • Scenario: Stage 1 (Gain G_1, Noise Figure F_1) followed by Stage 2 (Gain G_2, Noise Figure F_2), etc.
  • Formula:

    F_total = F_1 + (F_2 − 1)/G_1 + (F_3 − 1)/(G_1·G_2) + ...

  • Key Insight: The noise contribution of downstream stages is divided by the total gain of the previous stages. Therefore, the first stage needs low noise (small F_1) and high gain (large G_1) to minimize the overall system noise figure (see the sketch below).
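
A minimal numerical sketch of the Friis formula (the stage gains and noise figures are assumed for illustration), comparing a low-noise first stage against the reverse ordering:

```python
import math

def from_db(x_db):
    return 10 ** (x_db / 10)

def to_db(x):
    return 10 * math.log10(x)

def cascaded_noise_figure(stages):
    """Friis formula: 'stages' is a list of (linear gain, linear noise figure) pairs."""
    f_total = 0.0
    gain_so_far = 1.0
    for i, (gain, nf) in enumerate(stages):
        f_total += nf if i == 0 else (nf - 1) / gain_so_far
        gain_so_far *= gain
    return f_total

# Illustrative stages (assumed): LNA (G = 20 dB, F = 1 dB) and mixer (G = 10 dB, F = 10 dB)
lna = (from_db(20), from_db(1))
mixer = (from_db(10), from_db(10))

print(f"LNA first  : F = {to_db(cascaded_noise_figure([lna, mixer])):.2f} dB")
print(f"Mixer first: F = {to_db(cascaded_noise_figure([mixer, lna])):.2f} dB")
```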

3. Narrow Band Noise

3.1 Definition

Narrow band noise is noise whose spectral content is concentrated within a narrow bandwidth B around a center frequency f_c, such that B << f_c. This is typical of bandpass filter outputs in communication receivers.

3.2 Quadrature Representation

Narrow band noise can be represented in terms of its in-phase and quadrature components:

n(t) = n_I(t) cos(2πf_c t) − n_Q(t) sin(2πf_c t)

  • n_I(t): In-phase component (varies slowly).
  • n_Q(t): Quadrature component (varies slowly).

3.3 Properties of Narrow Band Noise

If n(t) is a zero-mean, stationary, Gaussian random process with PSD S_N(f):

  1. Mean: E[n_I(t)] = E[n_Q(t)] = 0.
  2. Variance: The power in the components equals the total noise power: E[n_I²] = E[n_Q²] = E[n²] = σ².
  3. PSD Relationship: The PSD of n_I(t) and n_Q(t) is the low-pass equivalent of the bandpass PSD: S_I(f) = S_Q(f) = S_N(f − f_c) + S_N(f + f_c) for |f| ≤ B/2 (zero otherwise).
  4. Statistical Independence: At the same time instant t, n_I(t) and n_Q(t) are uncorrelated (and independent, since they are Gaussian).
  5. Envelope and Phase:
    Representing n(t) in polar form: n(t) = r(t) cos(2πf_c t + φ(t)), with r(t) = √(n_I² + n_Q²) and φ(t) = tan⁻¹(n_Q/n_I).
    • Envelope r(t): Follows a Rayleigh Distribution.
    • Phase φ(t): Follows a Uniform Distribution over [0, 2π).
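
A short Monte-Carlo sketch of these properties (it models n_I and n_Q directly as independent zero-mean Gaussian samples of equal variance, consistent with properties 1-4; the sample count and variance are assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 1.0        # common standard deviation of n_I and n_Q (assumed)
N = 1_000_000      # number of samples (assumed)

n_i = rng.normal(0.0, sigma, N)   # in-phase component samples
n_q = rng.normal(0.0, sigma, N)   # quadrature component samples

r = np.hypot(n_i, n_q)            # envelope samples
phi = np.arctan2(n_q, n_i)        # phase samples in (-pi, pi]

# Rayleigh envelope: E[r] = sigma*sqrt(pi/2); uniform phase: variance = (2*pi)^2 / 12
print(f"mean envelope : {r.mean():.4f}  (theory {sigma * np.sqrt(np.pi / 2):.4f})")
print(f"phase variance: {phi.var():.4f}  (theory {np.pi**2 / 3:.4f})")
```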

4. Information Theory Fundamentals

4.1 Information Content (Self-Information)

The amount of information gained by the occurrence of an event is inversely related to its probability.

  • Formula: If an event x has probability P(x):

    I(x) = log₂(1 / P(x)) = −log₂ P(x)  (bits)

  • Properties:
    • If P(x) = 1 (certainty), I(x) = 0 (no information gained).
    • If P(x) → 0 (rare event), I(x) → ∞ (high information).

4.2 Entropy (H)

Entropy is the average information content per symbol of a source. It represents the uncertainty of the source.

  • Formula: For a source with M symbols having probabilities p_i:

    H = −Σ p_i log₂ p_i  (bits/symbol)

  • Properties:
    • 0 ≤ H ≤ log₂ M.
    • Max Entropy: Occurs when all symbols are equiprobable (p_i = 1/M), giving H_max = log₂ M.
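
A minimal sketch computing self-information and entropy for an assumed four-symbol source:

```python
import math

# Illustrative source (assumed): four symbols with these probabilities
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

def self_information(p):
    """I = -log2(p), in bits."""
    return -math.log2(p)

def entropy(p_values):
    """H = -sum(p * log2 p), in bits/symbol (zero-probability terms contribute nothing)."""
    return -sum(p * math.log2(p) for p in p_values if p > 0)

for sym, p in probs.items():
    print(f"I({sym}) = {self_information(p):.3f} bits")

H = entropy(probs.values())
print(f"H = {H:.3f} bits/symbol (maximum possible: {math.log2(len(probs)):.3f})")
```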

4.3 Information Rate (R)

The rate at which information is generated by the source:

R = r·H  (bits/second)

Where r is the signaling rate (symbols per second) and H is the source entropy (bits/symbol).
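
Continuing the entropy sketch above with an assumed signaling rate:

```python
r = 1000      # signaling rate, symbols/second (assumed)
H = 1.75      # entropy of the assumed source from the sketch above, bits/symbol
R = r * H     # information rate, bits/second
print(f"R = {R:.0f} bits/second")
```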


5. Source Coding

The objective of source coding is to represent the source output with the minimum number of bits (Data Compression).

  • Code Length (L_i): The number of bits assigned to symbol i.
  • Average Code Length (L_avg): L_avg = Σ p_i L_i  (bits/symbol)
  • Coding Efficiency (η): η = H / L_avg  (the entropy H is the minimum achievable average length)
  • Redundancy: 1 − η
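
A small sketch of these metrics for an assumed source and an assumed set of code lengths (the lengths used here are those a Huffman code would produce for this source):

```python
import math

# Assumed source probabilities and per-symbol code lengths
probs = [0.4, 0.3, 0.2, 0.1]
lengths = [1, 2, 3, 3]   # bits assigned to each symbol

H = -sum(p * math.log2(p) for p in probs)           # source entropy, bits/symbol
L_avg = sum(p * L for p, L in zip(probs, lengths))  # average code length, bits/symbol

efficiency = H / L_avg
redundancy = 1 - efficiency

print(f"L_avg = {L_avg:.2f} bits/symbol")
print(f"efficiency = {efficiency:.3f}, redundancy = {redundancy:.3f}")
```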

5.1 Shannon-Fano Coding

A top-down approach to coding.

  1. List source symbols in decreasing order of probability.
  2. Partition the set into two subsets with approximately equal total probabilities.
  3. Assign '0' to the upper set and '1' to the lower set.
  4. Repeat until subsets contain only one symbol.

5.2 Huffman Coding

A bottom-up approach (Optimal prefix code).

  1. List source symbols in decreasing order of probability.
  2. Combine the two symbols with the lowest probabilities into a new composite symbol with probability equal to their sum.
  3. Reorder the list.
  4. Repeat until a single composite symbol with probability 1.0 remains.
  5. Trace back the tree to assign bits (0 for upper branch, 1 for lower branch).
    • Note: Huffman coding guarantees the lowest possible average code length for symbol-by-symbol coding.
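
A compact sketch of the Huffman procedure using a min-heap (the source probabilities are assumed; tie-breaking may differ from a hand construction, so individual codewords can vary while the average length remains optimal):

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
    # Heap entries: (probability, unique tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)   # two lowest-probability entries
        p2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}        # prefix one branch with '0'
        merged.update({s: "1" + c for s, c in codes2.items()})  # and the other with '1'
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Assumed source probabilities for illustration
probs = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}
codes = huffman_code(probs)
L_avg = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)
print(f"Average code length = {L_avg:.2f} bits/symbol")
```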

6. Channel Capacity and Shannon-Hartley Law

6.1 Mutual Information

Measures the amount of information the output Y contains about the input X.

  • Formula:

    I(X; Y) = H(X) − H(X|Y)

    • H(X): Source Entropy (Uncertainty before reception).
    • H(X|Y): Equivocation (Uncertainty remaining after reception).
  • Channel Capacity (C): The maximum of the mutual information over all possible input probability distributions: C = max I(X; Y)  (bits/channel use).

6.2 Capacity of Discrete Channels

  1. Lossless Channel: H(X|Y) = 0, so I(X; Y) = H(X). Capacity: C = log₂ m (m = number of input symbols).
  2. Deterministic Channel: H(Y|X) = 0, so I(X; Y) = H(Y). Capacity: C = log₂ n (n = number of output symbols).
  3. Binary Symmetric Channel (BSC):
    With error probability p:

    C = 1 − H(p) = 1 + p log₂ p + (1 − p) log₂(1 − p)  (bits/channel use)
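
A small sketch evaluating the BSC capacity for a few assumed error probabilities:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):   # assumed error probabilities
    print(f"p = {p:<4}: C = {bsc_capacity(p):.4f} bits/channel use")
```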

6.3 Shannon-Hartley Law (Continuous Channel Capacity)

Defines the maximum data rate (capacity) that can be transmitted over a continuous analog channel with bandwidth B affected by Additive White Gaussian Noise (AWGN).

  • Formula:

    C = B log₂(1 + S/N)  (bits/second)

    Where:
    • B: Channel Bandwidth (Hz)
    • S: Signal Power (Watts)
    • N: Noise Power (Watts) within bandwidth B (N = N_0·B)
    • S/N: Signal-to-Noise Ratio (linear scale, not dB).
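
A minimal sketch of the formula (the bandwidth and SNR, roughly a telephone-grade channel, are assumed):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """C = B * log2(1 + S/N); the SNR is given in dB and converted to linear."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values (assumed): 3.1 kHz bandwidth, 30 dB SNR
B, snr_db = 3100.0, 30.0
print(f"C = {shannon_capacity(B, snr_db):.0f} bits/second")
```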

6.4 Trade-off between Bandwidth and SNR

The Shannon-Hartley law implies that bandwidth and signal power can be exchanged to maintain a constant channel capacity.

  1. Bandwidth Limited: If B is small, a very high SNR is required to achieve high capacity.
  2. Power Limited: If SNR is low, capacity can be maintained by increasing bandwidth B.
  3. Limiting Case (B → ∞):
    As bandwidth approaches infinity, capacity does not become infinite; it reaches a limit, C_∞ = (S/N_0) log₂ e ≈ 1.44 S/N_0, determined by the signal energy per bit (E_b) and the noise spectral density (N_0).

    The Shannon Limit: Reliable communication is impossible if E_b/N_0 is below ln 2 (approx 0.693), or −1.6 dB.
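
A short sketch of the trade-off: capacity is evaluated for increasing bandwidth at a fixed signal power and noise density (both assumed), approaching the C_∞ limit, and the −1.6 dB Shannon limit is printed for reference.

```python
import math

# Illustrative values (assumed): fixed signal power and one-sided noise PSD
S = 1e-9     # signal power, W
N0 = 1e-12   # noise power spectral density, W/Hz

def capacity(bandwidth_hz):
    """Shannon-Hartley capacity with N = N0 * B."""
    return bandwidth_hz * math.log2(1 + S / (N0 * bandwidth_hz))

for B in (1e2, 1e3, 1e4, 1e5, 1e6):
    print(f"B = {B:9.0f} Hz -> C = {capacity(B):8.1f} bits/s")

c_inf = (S / N0) * math.log2(math.e)              # limiting capacity as B -> infinity
shannon_limit_db = 10 * math.log10(math.log(2))   # minimum E_b/N0 = ln 2, in dB
print(f"C_inf = {c_inf:.1f} bits/s, Shannon limit = {shannon_limit_db:.2f} dB")
```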