Unit 3 - Notes
ECE180
Unit 3: Operations on One Random Variable
1. Expected Value of a Random Variable
The expected value (or expectation) is a fundamental operation in probability and signal processing. It represents the "center of gravity" of a distribution: the weighted average of the values a random variable takes.
1.1 Definition
If $X$ is a random variable, the expected value, denoted by $E[X]$, $\bar{X}$, or $m_X$, is defined as:
- Discrete Case:
$$E[X] = \sum_{i} x_i \, P(x_i)$$
(where $P(x_i)$ is the Probability Mass Function)
- Continuous Case:
$$E[X] = \int_{-\infty}^{\infty} x \, f_X(x) \, dx$$
(where $f_X(x)$ is the Probability Density Function)
- Condition for Existence: The expectation exists only if the integral (or sum) converges absolutely: $\int_{-\infty}^{\infty} |x| \, f_X(x) \, dx < \infty$.
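As a quick numerical check of the continuous-case definition, the integral can be evaluated directly. A minimal sketch using `scipy`, assuming an exponential PDF $f_X(x) = \lambda e^{-\lambda x}$ with $\lambda = 2$ as the example (true mean $1/\lambda = 0.5$):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0  # rate parameter of an exponential PDF (assumed example)

def f_X(x):
    """Exponential PDF: f_X(x) = lam * exp(-lam * x) for x >= 0."""
    return lam * np.exp(-lam * x)

# E[X] = integral of x * f_X(x) over the support [0, inf)
mean, _ = quad(lambda x: x * f_X(x), 0, np.inf)
print(mean)  # ~0.5, matching the theoretical mean 1/lam
```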
1.2 Properties of Expectation
- Constant: $E[c] = c$, where $c$ is a constant.
- Linearity: $E[aX + b] = a\,E[X] + b$, for constants $a$ and $b$.
- Additivity: $E[g_1(X) + g_2(X)] = E[g_1(X)] + E[g_2(X)]$.
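These properties are easy to confirm empirically. A minimal Monte Carlo sketch of linearity, assuming exponential samples with mean $0.5$ (an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=0.5, size=1_000_000)  # E[X] = 0.5 (assumed example)

a, b = 3.0, 1.0
# Linearity: E[aX + b] should equal a*E[X] + b
print(np.mean(a * x + b))   # ~2.5
print(a * np.mean(x) + b)   # ~2.5
```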
2. Function of a Random Variable
Often, we deal with a random variable $Y$ that is a function of another random variable $X$, such that $Y = g(X)$. To find the expected value of $Y$, we do not necessarily need to find the PDF of $Y$ first. We can use the Law of the Unconscious Statistician (LOTUS).
2.1 The LOTUS Theorem
- Discrete Case:
$$E[g(X)] = \sum_{i} g(x_i) \, P(x_i)$$
- Continuous Case:
$$E[g(X)] = \int_{-\infty}^{\infty} g(x) \, f_X(x) \, dx$$
This theorem is powerful because it allows calculations entirely within the domain of $X$.
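To see LOTUS in action, the sketch below computes $E[X^2]$ both ways: by integrating $g(x) f_X(x)$ directly, and by sampling $X$ and averaging $g(X)$. The exponential PDF with $\lambda = 2$ is an assumed example (true value $2/\lambda^2 = 0.5$):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f_X = lambda x: lam * np.exp(-lam * x)  # exponential PDF (assumed example)
g = lambda x: x**2                      # Y = g(X) = X^2

# LOTUS: E[g(X)] = integral of g(x) * f_X(x) dx -- no PDF of Y required
lotus, _ = quad(lambda x: g(x) * f_X(x), 0, np.inf)

# Monte Carlo cross-check: sample X, transform, average
rng = np.random.default_rng(0)
mc = np.mean(g(rng.exponential(scale=1/lam, size=1_000_000)))

print(lotus, mc)  # both ~0.5 = 2/lam**2
```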
3. Moments about the Origin
Moments describe the shape characteristics of a probability distribution.
3.1 Definition
The $n$-th moment of a random variable $X$ about the origin is denoted by $m_n$ and is defined as the expected value of the $n$-th power of $X$:
$$m_n = E[X^n] = \int_{-\infty}^{\infty} x^n \, f_X(x) \, dx$$
3.2 Notable Moments
- Zeroth Moment ($n = 0$):
$$m_0 = E[X^0] = 1$$
(the area under the PDF curve).
- First Moment ($n = 1$):
$$m_1 = E[X] = \bar{X}$$
This is the Mean or DC component of the signal.
- Second Moment ($n = 2$):
$$m_2 = E[X^2]$$
This represents the Mean Square Value or the average power of the random variable.
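A short sketch evaluating the first three raw moments numerically, assuming a uniform PDF on $[0, 1]$ as the example (so $m_0 = 1$, $m_1 = 1/2$, $m_2 = 1/3$):

```python
from scipy.integrate import quad

f_X = lambda x: 1.0  # uniform PDF on [0, 1] (assumed example)

for n in range(3):
    # n-th raw moment: integral of x^n * f_X(x) over the support
    m_n, _ = quad(lambda x: x**n * f_X(x), 0, 1)
    print(n, m_n)  # m0 = 1 (area), m1 = 0.5 (mean), m2 = 1/3 (mean square)
```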
4. Central Moments
Central moments are moments calculated about the mean ($\bar{X}$) rather than the origin (0). They describe the spread and shape of the distribution relative to its center.
4.1 Definition
The $n$-th central moment, denoted by $\mu_n$, is defined as:
$$\mu_n = E\left[(X - \bar{X})^n\right] = \int_{-\infty}^{\infty} (x - \bar{X})^n \, f_X(x) \, dx$$
4.2 Notable Central Moments
- Zeroth Central Moment ($n = 0$):
$$\mu_0 = E\left[(X - \bar{X})^0\right] = 1$$
- First Central Moment ($n = 1$):
$$\mu_1 = E[X - \bar{X}] = \bar{X} - \bar{X} = 0$$
(The first central moment is always zero.)
- Second Central Moment ($n = 2$):
$$\mu_2 = E\left[(X - \bar{X})^2\right] = \sigma_X^2$$
This is the Variance of $X$.
4.3 Relationship between Central and Raw Moments
Using the binomial expansion of $(X - \bar{X})^n$, we can relate $\mu_n$ to the raw moments $m_k$:
$$\mu_n = \sum_{k=0}^{n} \binom{n}{k} (-\bar{X})^{n-k} \, m_k$$
For the second moment (Variance):
$$\mu_2 = m_2 - m_1^2 = E[X^2] - \bar{X}^2$$
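The identity $\mu_2 = m_2 - m_1^2$ can be checked directly with samples. A minimal sketch, assuming uniform samples on $[0, 1]$ as the example (true variance $1/12 \approx 0.0833$):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=1_000_000)

m1 = np.mean(x)              # first raw moment
m2 = np.mean(x**2)           # second raw moment
mu2 = np.mean((x - m1)**2)   # second central moment (variance)

print(mu2, m2 - m1**2)  # both ~1/12
```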
5. Variance, Skewness, and Kurtosis
5.1 Variance ($\sigma_X^2$)
Variance measures the dispersion or "spread" of the random variable around the mean.
- Definition:
$$\sigma_X^2 = \mu_2 = E\left[(X - \bar{X})^2\right] = E[X^2] - \bar{X}^2$$
- Standard Deviation ($\sigma_X$): The square root of the variance. It has the same units as $X$.
- Properties: $\mathrm{Var}(c) = 0$ for a constant $c$, and $\mathrm{Var}(aX + b) = a^2 \, \mathrm{Var}(X)$: a shift does not change the spread, while a scale factor enters squared.
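A quick sample-based check of $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$, assuming Gaussian samples with $\sigma = 2$ as the example:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1_000_000)  # Var(X) = 4 (assumed example)

a, b = 3.0, 7.0
# The shift b drops out; the scale a enters squared
print(np.var(a * x + b))   # ~36
print(a**2 * np.var(x))    # ~36
```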
5.2 Skewness (Third Central Moment)
Skewness measures the asymmetry of the PDF around the mean.
- Definition:
$$\mu_3 = E\left[(X - \bar{X})^3\right]$$
- Coefficient of Skewness ($\gamma_1$): the third central moment normalized by the cube of the standard deviation:
$$\gamma_1 = \frac{\mu_3}{\sigma_X^3}$$
- Interpretation:
- $\gamma_1 = 0$: Symmetric distribution (e.g., Gaussian).
- $\gamma_1 > 0$: Positively skewed (long tail to the right).
- $\gamma_1 < 0$: Negatively skewed (long tail to the left).
5.3 Kurtosis (Fourth Central Moment)
Kurtosis measures the "tailedness" or "peakedness" of the distribution.
- Definition:
$$\mu_4 = E\left[(X - \bar{X})^4\right]$$
- Coefficient of Kurtosis ($\gamma_2$):
$$\gamma_2 = \frac{\mu_4}{\sigma_X^4}$$
- Interpretation:
- Mesokurtic: $\gamma_2 = 3$ (Normal/Gaussian distribution).
- Leptokurtic: $\gamma_2 > 3$ (sharper peak, heavy tails).
- Platykurtic: $\gamma_2 < 3$ (flatter peak, light tails).
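Both coefficients can be estimated from data with `scipy.stats`. Note that `scipy.stats.kurtosis` returns *excess* kurtosis ($\gamma_2 - 3$) by default, so `fisher=False` is passed to match the convention above. The Gaussian and exponential samples are assumed examples (exponential: $\gamma_1 = 2$, $\gamma_2 = 9$):

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
gauss = rng.normal(size=1_000_000)       # symmetric, mesokurtic
expo = rng.exponential(size=1_000_000)   # positively skewed, leptokurtic

print(skew(gauss), kurtosis(gauss, fisher=False))  # ~0 and ~3
print(skew(expo),  kurtosis(expo,  fisher=False))  # ~2 and ~9
```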
6. Chebyshev’s Inequality
Chebyshev’s inequality provides a bound on the probability that a random variable deviates from its mean by a certain amount. It applies to any distribution (discrete or continuous) as long as the variance is finite.
6.1 Statement
For a random variable $X$ with mean $\bar{X}$ and variance $\sigma_X^2$:
$$P\left(|X - \bar{X}| \geq \epsilon\right) \leq \frac{\sigma_X^2}{\epsilon^2}$$
where $\epsilon > 0$ is a real number.
6.2 Alternate Form
Setting $\epsilon = k\sigma_X$ gives the equivalent form:
$$P\left(|X - \bar{X}| < k\sigma_X\right) \geq 1 - \frac{1}{k^2}$$
Interpretation: The probability that $X$ lies within $k$ standard deviations of the mean is at least $1 - 1/k^2$.
- For $k = 2$: At least $75\%$ of the data lies within $2\sigma_X$ of the mean.
- For $k = 3$: At least $88.9\%$ of the data lies within $3\sigma_X$ of the mean.
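A sketch comparing the bound with the actual probability, assuming a standard Gaussian as the example (for a Gaussian the true probabilities are much higher than the bound, which must hold for *any* finite-variance distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)  # mean 0, sigma 1 (assumed example)

for k in (2, 3):
    actual = np.mean(np.abs(x) < k)   # empirical P(|X - mean| < k*sigma)
    bound = 1 - 1 / k**2              # Chebyshev lower bound
    print(k, actual, bound)  # k=2: ~0.954 >= 0.75; k=3: ~0.997 >= 0.889
```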
7. Characteristic Function
The characteristic function is the Fourier Transform of the PDF (with a sign reversal in the exponent). It always exists for any random variable.
7.1 Definition
Denoted by $\Phi_X(\omega)$:
$$\Phi_X(\omega) = E\left[e^{j\omega X}\right] = \int_{-\infty}^{\infty} f_X(x) \, e^{j\omega x} \, dx$$
7.2 Properties
- Maximum Value: $|\Phi_X(\omega)| \leq \Phi_X(0) = 1$.
- Origin: $\Phi_X(0) = \int_{-\infty}^{\infty} f_X(x) \, dx = 1$.
- Inversion Formula: We can recover the PDF from the characteristic function:
$$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \Phi_X(\omega) \, e^{-j\omega x} \, d\omega$$
7.3 Moment Generation via Characteristic Function
We can find the $n$-th moment ($m_n$) by differentiating $\Phi_X(\omega)$ at the origin:
$$m_n = (-j)^n \left. \frac{d^n \Phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}$$
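A symbolic sketch of this moment-generation rule using `sympy`, assuming the characteristic function of an exponential random variable, $\Phi_X(\omega) = \lambda/(\lambda - j\omega)$, as the example (expected results: $m_1 = 1/\lambda$, $m_2 = 2/\lambda^2$):

```python
import sympy as sp

w = sp.symbols('omega', real=True)
lam = sp.symbols('lambda', positive=True)

# Characteristic function of an exponential RV (assumed example)
Phi = lam / (lam - sp.I * w)

for n in (1, 2):
    # m_n = (-j)^n * d^n Phi / d omega^n, evaluated at omega = 0
    m_n = sp.simplify(((-sp.I)**n * sp.diff(Phi, w, n)).subs(w, 0))
    print(n, m_n)  # m1 = 1/lambda, m2 = 2/lambda**2
```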
8. Moment Generating Function (MGF)
The MGF is closely related to the Laplace transform. Unlike the characteristic function, the MGF does not exist for all distributions (the integral may diverge).
8.1 Definition
Denoted by $M_X(v)$:
$$M_X(v) = E\left[e^{vX}\right] = \int_{-\infty}^{\infty} f_X(x) \, e^{vx} \, dx$$
8.2 Moment Generation Property
The primary utility of the MGF is to calculate moments easily. The $n$-th derivative of $M_X(v)$ evaluated at $v = 0$ yields the $n$-th moment about the origin:
$$m_n = \left. \frac{d^n M_X(v)}{dv^n} \right|_{v = 0}$$
8.3 Relationship to Characteristic Function
$$\Phi_X(\omega) = M_X(j\omega)$$
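A symbolic sketch of the MGF route, assuming the Gaussian MGF $M_X(v) = e^{\mu v + \sigma^2 v^2 / 2}$ as the example (expected: $m_1 = \mu$, $m_2 = \mu^2 + \sigma^2$):

```python
import sympy as sp

v = sp.symbols('v', real=True)
mu, sigma = sp.symbols('mu sigma', positive=True)

# MGF of a Gaussian RV (assumed example)
M = sp.exp(mu * v + sigma**2 * v**2 / 2)

for n in (1, 2):
    # m_n = d^n M / d v^n, evaluated at v = 0
    m_n = sp.simplify(sp.diff(M, v, n).subs(v, 0))
    print(n, m_n)  # m1 = mu, m2 = mu**2 + sigma**2
```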
9. Transformations of a Random Variable
If we know the PDF of a random variable $X$ ($f_X(x)$) and we define a new variable $Y = g(X)$, we aim to find the PDF of $Y$ ($f_Y(y)$).
9.1 Method 1: Monotonic Transformations
If $y = g(x)$ is a one-to-one function (monotonic increasing or decreasing), there is a unique solution $x = g^{-1}(y)$ for each $y$.
Formula:
$$f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right|$$
where $x$ must be replaced by $g^{-1}(y)$.
Steps:
- Find the inverse relation $x = g^{-1}(y)$.
- Calculate the derivative $\frac{dx}{dy}$.
- Substitute $x = g^{-1}(y)$ and the derivative into the formula.
- Determine the valid range of $y$.
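These steps can be verified numerically. A sketch, assuming $Y = 2X + 3$ with $X$ exponential ($\lambda = 1$) as the example: the inverse is $x = (y-3)/2$, $|dx/dy| = 1/2$, so $f_Y(y) = \frac{\lambda}{2} e^{-\lambda (y-3)/2}$ for $y > 3$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0
x = rng.exponential(scale=1/lam, size=1_000_000)
y = 2 * x + 3  # monotonic increasing transformation (assumed example)

# Steps 1-3: inverse x = (y - 3)/2, |dx/dy| = 1/2, valid for y > 3
f_Y = lambda t: (lam / 2) * np.exp(-lam * (t - 3) / 2)

# Compare the derived PDF with a normalized histogram of the samples
hist, edges = np.histogram(y, bins=100, density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - f_Y(centers))))  # small: binning + sampling noise
```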
9.2 Method 2: Non-Monotonic Transformations
If $y = g(x)$ is not one-to-one (e.g., $y = x^2$), a single value of $y$ may correspond to multiple values of $x$ (roots). Let the roots be $x_1, x_2, \ldots, x_N$.
Formula:
$$f_Y(y) = \sum_{n=1}^{N} \frac{f_X(x_n)}{|g'(x_n)|}$$
Alternatively expressed as:
$$f_Y(y) = \sum_{n=1}^{N} f_X(x_n) \left| \frac{dx_n}{dy} \right|$$
Example: Square Law Transformation ($Y = X^2$)
For $y > 0$, the roots are $x_1 = \sqrt{y}$ and $x_2 = -\sqrt{y}$, and $|g'(x_n)| = 2\sqrt{y}$, so
$$f_Y(y) = \frac{1}{2\sqrt{y}} \left[ f_X(\sqrt{y}) + f_X(-\sqrt{y}) \right]$$
For $y < 0$, $f_Y(y) = 0$.
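A numerical sketch of the square-law result, assuming a standard Gaussian input as the example (so $f_Y$ is the chi-square density with one degree of freedom):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)  # standard Gaussian input (assumed example)
y = x**2                        # square-law device: two roots +-sqrt(y)

f_X = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)
# Each root weighted by |dx/dy| = 1/(2*sqrt(y))
f_Y = lambda t: (f_X(np.sqrt(t)) + f_X(-np.sqrt(t))) / (2 * np.sqrt(t))

# Density estimate normalized by the FULL sample size, so samples outside
# the compared range do not distort the comparison
counts, edges = np.histogram(y, bins=50, range=(0.1, 5.0))
hist = counts / (len(y) * np.diff(edges))
centers = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - f_Y(centers))))  # small residual (binning + noise)
```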