Unit 4 - Notes

ECE180

Unit 4: Multiple Random Variables

1. Vector Random Variables

A Vector Random Variable (or a multidimensional random variable) is a vector function $\mathbf{X}$ that maps outcomes $s$ from a sample space $S$ into an $n$-dimensional Euclidean space $\mathbb{R}^n$.

  • Notation: $\mathbf{X} = (X_1, X_2, \dots, X_n)$, where each $X_i$ is a random variable.
  • Significance: In many engineering problems, a single number is insufficient to describe the outcome of an experiment (e.g., measuring both the magnitude and phase of a signal, or the coordinates of a particle).
  • Events: An event $A$ is defined by a region $R_A$ in the $n$-dimensional space. The probability of the event is $P(A) = P(\mathbf{X} \in R_A)$.

2. Joint Distribution Function (CDF)

For two random variables $X$ and $Y$, the Joint Cumulative Distribution Function (CDF) is defined as the probability that $X \le x$ and $Y \le y$ simultaneously: $F_{XY}(x, y) = P(X \le x, Y \le y)$.

Properties of Joint CDF

  1. Non-decreasing: $F_{XY}(x, y)$ is a non-decreasing function of both $x$ and $y$.
  2. Bounds: $F_{XY}(-\infty, y) = F_{XY}(x, -\infty) = 0$ and $F_{XY}(\infty, \infty) = 1$; also $F_{XY}(x, \infty) = F_X(x)$ and $F_{XY}(\infty, y) = F_Y(y)$.
  3. Right-Continuous: The function is continuous from the right in each variable.
  4. Probability of a Rectangle: $P(x_1 < X \le x_2,\; y_1 < Y \le y_2) = F_{XY}(x_2, y_2) - F_{XY}(x_1, y_2) - F_{XY}(x_2, y_1) + F_{XY}(x_1, y_1)$ (see the numerical sketch below).
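
As a quick numerical check of the rectangle formula, the sketch below assumes (purely for illustration) the joint CDF of two independent Exponential(1) variables, $F_{XY}(x, y) = (1 - e^{-x})(1 - e^{-y})$ for $x, y > 0$:

```python
import numpy as np

# Assumed joint CDF for illustration: two independent Exponential(1) variables,
# F_XY(x, y) = (1 - e^{-x})(1 - e^{-y}) for x, y > 0 and 0 otherwise.
def F(x, y):
    if x <= 0 or y <= 0:
        return 0.0
    return (1 - np.exp(-x)) * (1 - np.exp(-y))

x1, x2, y1, y2 = 0.5, 1.5, 0.2, 1.0

# Rectangle formula from property 4
p_rect = F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# Because X and Y are independent here, the probability also factors,
# giving a direct cross-check
direct = (np.exp(-x1) - np.exp(-x2)) * (np.exp(-y1) - np.exp(-y2))

print(p_rect, direct)   # both ≈ 0.173
```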

3. Joint Density Function (PDF)

For continuous random variables, the Joint Probability Density Function (PDF) is the second mixed partial derivative of the joint CDF: $f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x \, \partial y}$.

Properties of Joint PDF

  1. Non-negative: $f_{XY}(x, y) \ge 0$ for all $(x, y)$.
  2. Normalization: The volume under the density surface is unity: $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1$.
  3. CDF from PDF: $F_{XY}(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{XY}(u, v) \, du \, dv$.
  4. Probability of a Region: The probability that the point $(X, Y)$ falls within a specific region $R$ in the $xy$-plane is $P((X, Y) \in R) = \iint_{R} f_{XY}(x, y) \, dx \, dy$ (see the sketch below).
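
A minimal numerical sketch of properties 2 and 4, assuming (for illustration only) the joint density $f_{XY}(x, y) = x + y$ on the unit square; grid summation stands in for the double integrals:

```python
import numpy as np

# Assumed joint PDF for illustration: f_XY(x, y) = x + y on the unit square,
# zero elsewhere.  It integrates to 1, so it is a valid joint density.
def f_xy(x, y):
    return np.where((x >= 0) & (x <= 1) & (y >= 0) & (y <= 1), x + y, 0.0)

n = 2000
grid = (np.arange(n) + 0.5) / n            # midpoint grid on [0, 1]
X, Y = np.meshgrid(grid, grid)
cell = (1.0 / n) ** 2                      # area of each grid cell

volume = np.sum(f_xy(X, Y)) * cell                                # property 2
p_region = np.sum(f_xy(X, Y) * (X <= 0.5) * (Y <= 0.5)) * cell    # property 4

print(volume)     # ≈ 1.0
print(p_region)   # exact value is 0.125
```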

4. Marginal Distribution and Density Functions

Marginal functions allow us to obtain the properties of a single random variable from the joint probability model by "integrating out" or "summing out" the other variables.

Marginal CDF

  • Marginal CDF of X: $F_X(x) = F_{XY}(x, \infty)$
  • Marginal CDF of Y: $F_Y(y) = F_{XY}(\infty, y)$

Marginal PDF

To find the density of $X$ alone, we integrate the joint density over the entire range of $Y$ (a numerical sketch follows the formulas below).

  • Marginal PDF of X: $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy$
  • Marginal PDF of Y: $f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx$
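
Continuing the same assumed example ($f_{XY}(x, y) = x + y$ on the unit square), the sketch below "integrates out" $y$ numerically and compares the result with the analytic marginal $f_X(x) = x + \tfrac{1}{2}$:

```python
import numpy as np

# Continuing the assumed example f_XY(x, y) = x + y on the unit square:
# the marginal of X is f_X(x) = ∫_0^1 (x + y) dy = x + 1/2.
n = 2000
ys = (np.arange(n) + 0.5) / n              # midpoint grid over y

def marginal_x(x):
    # "integrate out" y with a Riemann sum over the support of Y
    return np.sum(x + ys) / n

for x in (0.0, 0.25, 0.5, 1.0):
    print(x, marginal_x(x), x + 0.5)       # numerical vs. analytic marginal
```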

5. Conditional Distribution and Density Functions

These functions describe the behavior of one random variable assuming the other has taken on a specific value.

Conditional PDF

The conditional PDF of $X$ given $Y = y$ is defined as $f_{X|Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}$, provided $f_Y(y) > 0$.

Similarly, for $Y$ given $X = x$: $f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}$, provided $f_X(x) > 0$.

Properties

  1. Normalization: $\int_{-\infty}^{\infty} f_{X|Y}(x \mid y) \, dx = 1$. It behaves exactly like a standard single-variable PDF (checked numerically in the sketch below).
  2. Chain Rule of Probability: $f_{XY}(x, y) = f_{X|Y}(x \mid y) \, f_Y(y) = f_{Y|X}(y \mid x) \, f_X(x)$.
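
A quick check of the normalization property for the same assumed example, where $f_Y(y) = y + \tfrac{1}{2}$ and therefore $f_{X|Y}(x \mid y) = (x + y)/(y + \tfrac{1}{2})$ on $0 \le x \le 1$:

```python
import numpy as np

# Continuing the assumed example f_XY(x, y) = x + y on the unit square:
# f_Y(y) = y + 1/2, so f_{X|Y}(x|y) = (x + y) / (y + 1/2) for 0 <= x <= 1.
n = 2000
xs = (np.arange(n) + 0.5) / n

def f_x_given_y(x, y):
    return (x + y) / (y + 0.5)

# The conditional density should integrate to 1 in x for every fixed y
for y in (0.1, 0.5, 0.9):
    print(y, np.sum(f_x_given_y(xs, y)) / n)   # ≈ 1.0 each time
```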

Conditional CDF

The conditional CDF of $X$ given $Y = y$ is obtained by integrating the conditional PDF: $F_{X|Y}(x \mid y) = P(X \le x \mid Y = y) = \int_{-\infty}^{x} f_{X|Y}(u \mid y) \, du$.

6. Statistical Independence

Two random variables and are statistically independent if the occurrence of an event associated with one variable has no influence on the probability of an event associated with the other.

Conditions for Independence

If $X$ and $Y$ are independent, the following equivalent conditions hold:

  1. PDF Factorization: $f_{XY}(x, y) = f_X(x) \, f_Y(y)$
  2. CDF Factorization: $F_{XY}(x, y) = F_X(x) \, F_Y(y)$
  3. Conditional Equals Marginal: $f_{X|Y}(x \mid y) = f_X(x)$ and $f_{Y|X}(y \mid x) = f_Y(y)$

Note: If the joint support (the region where $f_{XY}(x, y) > 0$) is not a rectangle (e.g., a triangle or circle), the variables are dependent, even if the expression for $f_{XY}(x, y)$ looks separable; the sketch below illustrates this with a triangular support.
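
A small sketch of this point, assuming the joint density $f_{XY}(x, y) = 2$ on the triangle $0 \le y \le x \le 1$: the expression is constant (as separable as it gets), yet the joint density does not equal the product of the marginals.

```python
import numpy as np

# Assumed example: f_XY(x, y) = 2 on the triangle 0 <= y <= x <= 1, 0 elsewhere.
# The constant 2 "looks separable", but the triangular support forces dependence.
def f_xy(x, y):
    return np.where((y >= 0) & (y <= x) & (x <= 1), 2.0, 0.0)

n = 4000
grid = (np.arange(n) + 0.5) / n

def f_x(x):               # marginal of X; analytic answer is 2x
    return np.sum(f_xy(x, grid)) / n

def f_y(y):               # marginal of Y; analytic answer is 2(1 - y)
    return np.sum(f_xy(grid, y)) / n

# At (x, y) = (0.75, 0.25) the joint density is 2, but the product of the
# marginals is about 1.5 * 1.5 = 2.25, so X and Y are dependent.
print(f_xy(0.75, 0.25), f_x(0.75) * f_y(0.25))
```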


7. Sum of Two Random Variables

Let $W = X + Y$. We seek the statistics of $W$.

Using the PDF (Convolution)

If $X$ and $Y$ are independent continuous random variables, the PDF of their sum is the convolution of their individual PDFs:

$f_W(w) = \int_{-\infty}^{\infty} f_X(x) \, f_Y(w - x) \, dx = (f_X * f_Y)(w)$. Alternatively: $f_W(w) = \int_{-\infty}^{\infty} f_Y(y) \, f_X(w - y) \, dy$.
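
A minimal numerical sketch of the convolution result: two Uniform(0, 1) densities (chosen only as an illustration) convolve into the triangular density of their sum on $[0, 2]$.

```python
import numpy as np

# Sketch: numerically convolve two Uniform(0, 1) densities.  The sum of two
# independent Uniform(0, 1) variables has the triangular density on [0, 2].
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f_x = np.ones_like(x)                    # Uniform(0, 1) PDF
f_y = np.ones_like(x)                    # Uniform(0, 1) PDF

f_w = np.convolve(f_x, f_y) * dx         # discrete version of (f_X * f_Y)(w)
w = np.arange(f_w.size) * dx

# The triangular density peaks at w = 1 with height 1
print(w[np.argmax(f_w)], f_w.max())      # ≈ 1.0, ≈ 1.0
```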

Sum of Several Random Variables

If $Y = X_1 + X_2 + \dots + X_N$ where all $X_i$ are independent:

  1. The PDF of $Y$ is the convolution of the PDFs of all $X_i$: $f_Y = f_{X_1} * f_{X_2} * \dots * f_{X_N}$.
  2. Mean: $E[Y] = \sum_{i=1}^{N} E[X_i]$
  3. Variance: $\mathrm{Var}(Y) = \sum_{i=1}^{N} \mathrm{Var}(X_i)$ (see the Monte Carlo check below)
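
A Monte Carlo sketch of the mean and variance rules, assuming three independent Exponential terms (the particular distributions and parameters are arbitrary):

```python
import numpy as np

# Monte Carlo check (assumed setup): Y = X1 + X2 + X3 with independent
# Exponential terms.  E[Xi] = scale_i and Var(Xi) = scale_i**2, so the
# means and variances of the terms should simply add.
rng = np.random.default_rng(0)
scales = [1.0, 2.0, 0.5]
samples = sum(rng.exponential(s, size=1_000_000) for s in scales)

print(samples.mean(), sum(scales))                  # both ≈ 3.5
print(samples.var(), sum(s**2 for s in scales))     # both ≈ 5.25
```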

8. Central Limit Theorem (CLT)

The CLT states that the probability distribution of the sum (or average) of a large number of independent, identically distributed (i.i.d.) random variables approaches a Gaussian (Normal) distribution, regardless of the original distribution shape.

Statement

Let $X_1, X_2, \dots, X_N$ be independent random variables with means $\mu_i$ and variances $\sigma_i^2$. Let $Y = X_1 + X_2 + \dots + X_N$.

As $N \to \infty$:

  1. $Y$ approaches a Normal distribution.
  2. Mean of Sum: $\mu_Y = \sum_{i=1}^{N} \mu_i$
  3. Variance of Sum: $\sigma_Y^2 = \sum_{i=1}^{N} \sigma_i^2$

The Normalized Form

The random variable

$Z_N = \dfrac{Y - \mu_Y}{\sigma_Y} = \dfrac{\sum_{i=1}^{N} (X_i - \mu_i)}{\sqrt{\sum_{i=1}^{N} \sigma_i^2}}$

approaches the Standard Normal Distribution $N(0, 1)$ as $N \to \infty$ (a simulation sketch follows).
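
A simulation sketch of the CLT, assuming i.i.d. Uniform(0, 1) terms and a moderate $N$; the normalized sum is compared against the standard normal 68%/95% rule.

```python
import numpy as np

# Sketch: sum N i.i.d. Uniform(0, 1) variables and normalize.  Even though the
# terms are far from Gaussian, the normalized sum is close to standard normal.
rng = np.random.default_rng(1)
N, trials = 30, 200_000
mu, var = 0.5, 1.0 / 12.0                 # mean and variance of Uniform(0, 1)

Y = rng.uniform(size=(trials, N)).sum(axis=1)
Z = (Y - N * mu) / np.sqrt(N * var)       # normalized sum Z_N

# Compare with the standard normal 68% / 95% rule
print(np.mean(np.abs(Z) < 1))             # ≈ 0.683
print(np.mean(np.abs(Z) < 2))             # ≈ 0.954
```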


9. Expected Value of a Function of Random Variables

If $g(X, Y)$ is a function of two random variables $X$ and $Y$, the expected value is $E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{XY}(x, y) \, dx \, dy$.

Linearity of Expectation

The expectation operator is linear:

$E[aX + bY] = a \, E[X] + b \, E[Y]$ for any constants $a$ and $b$.

This holds regardless of whether $X$ and $Y$ are independent.


10. Joint Moments

Joint moments describe the statistical relationship between two variables.

Joint Moments about the Origin ($m_{nk}$)

The joint moment of order $n + k$ is defined as $m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^k \, f_{XY}(x, y) \, dx \, dy$.

  • $m_{10} = E[X]$ (Mean of X, $\bar{X}$)
  • $m_{01} = E[Y]$ (Mean of Y, $\bar{Y}$)
  • $m_{11} = E[XY] = R_{XY}$: This is the Correlation of $X$ and $Y$.
    • If $R_{XY} = 0$, $X$ and $Y$ are orthogonal.

Joint Central Moments ($\mu_{nk}$)

These are moments taken about the respective means ($\bar{X}$ and $\bar{Y}$): $\mu_{nk} = E[(X - \bar{X})^n (Y - \bar{Y})^k]$.

  • $\mu_{20} = E[(X - \bar{X})^2] = \sigma_X^2$ (Variance of X)
  • $\mu_{02} = E[(Y - \bar{Y})^2] = \sigma_Y^2$ (Variance of Y)
  • $\mu_{11} = E[(X - \bar{X})(Y - \bar{Y})]$: This is the Covariance, denoted $C_{XY}$ or $\mathrm{Cov}(X, Y)$.

Covariance Formulas

$C_{XY} = E[XY] - E[X] \, E[Y] = R_{XY} - \bar{X} \, \bar{Y}$

  • If $C_{XY} = 0$, the variables are Uncorrelated.
  • Relationship to Independence:
    • Independence $\Rightarrow$ Uncorrelated.
    • Uncorrelated $\nRightarrow$ Independence (except for Jointly Gaussian RVs).

Correlation Coefficient ($\rho$)

A normalized measure of linear dependence between $X$ and $Y$: $\rho = \dfrac{C_{XY}}{\sigma_X \sigma_Y}$, with $-1 \le \rho \le 1$ (a numerical sketch follows the list).

  • $\rho = +1$: Perfect positive linear relationship.
  • $\rho = -1$: Perfect negative linear relationship.
  • $\rho = 0$: Uncorrelated.
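
A sampling-based sketch of these quantities, assuming a simple linear model $Y = 2X + \text{noise}$ with standard normal terms (so the exact values are $C_{XY} = 2$ and $\rho = 2/\sqrt{5} \approx 0.894$):

```python
import numpy as np

# Sketch: estimate correlation, covariance, and the correlation coefficient from
# samples of an assumed linear model Y = 2X + noise (zero-mean, unit-variance terms).
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=500_000)
y = 2.0 * x + rng.normal(0.0, 1.0, size=500_000)

R_xy = np.mean(x * y)                               # correlation m11 = E[XY]
C_xy = np.mean((x - x.mean()) * (y - y.mean()))     # covariance C_XY
rho = C_xy / (x.std() * y.std())                    # correlation coefficient

print(R_xy, C_xy, rho)                  # ≈ 2.0, ≈ 2.0, ≈ 0.894
print(np.corrcoef(x, y)[0, 1])          # library cross-check of rho
```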