Unit 1 - Notes

ECE180

Unit 1: Probability and Random Variables

1. Fundamentals of Set Theory

Probability theory relies heavily on the mathematical framework of set theory.

Basic Definitions

  • Set: A collection of distinct objects.
  • Element: An object belonging to a set. Notation: x ∈ A (element x belongs to set A).
  • Subset: If every element of A is also in B, then A ⊆ B.
  • Universal Set (S or U): The set containing all objects under consideration. In probability, this corresponds to the Sample Space.
  • Empty (Null) Set (∅): A set containing no elements.

Set Operations

  • Union (A ∪ B): The set of elements in A, or in B, or in both. (Logical OR).
  • Intersection (A ∩ B): The set of elements common to both A and B. (Logical AND).
  • Complement (Aᶜ or Ā): The set of elements in S that are not in A.
  • Difference (A − B): Elements in A but not in B. Equivalent to A ∩ Bᶜ.
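These operations map directly onto Python's built-in set type; a minimal sketch, using an arbitrary die-roll universal set and two example events:

```python
# Universal set S and two example events (arbitrary choices for illustration).
S = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3}
B = {2, 4, 6}

union = A | B           # A ∪ B  (logical OR)
intersection = A & B    # A ∩ B  (logical AND)
complement_A = S - A    # Aᶜ, the complement relative to S
difference = A - B      # A − B, elements in A but not in B

print(union)         # {1, 2, 3, 4, 6}
print(intersection)  # {2}
print(complement_A)  # {4, 5, 6}
print(difference)    # {1, 3}
```

The identity A − B = A ∩ Bᶜ can be checked directly: `(A - B) == (A & (S - B))` evaluates to `True`.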

Algebraic Laws of Sets

  1. Commutative: A ∪ B = B ∪ A; A ∩ B = B ∩ A
  2. Associative: (A ∪ B) ∪ C = A ∪ (B ∪ C); (A ∩ B) ∩ C = A ∩ (B ∩ C)
  3. Distributive: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C); A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
  4. De Morgan’s Laws: (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ; (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ
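These laws can be verified exhaustively for a small universal set by brute force over all pairs of subsets; a sketch (the choice of S is arbitrary):

```python
from itertools import combinations

S = {1, 2, 3, 4}

def subsets(s):
    """All subsets of s as frozensets (the power set)."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

# Check De Morgan's laws for every pair of subsets of S.
for A in subsets(S):
    for B in subsets(S):
        assert S - (A | B) == (S - A) & (S - B)  # (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
        assert S - (A & B) == (S - A) | (S - B)  # (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ
print("De Morgan's laws verified on all", len(subsets(S)) ** 2, "pairs")
```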

2. Experiments and Sample Spaces

Experiments

An experiment is a process that produces an outcome.

  • Deterministic Experiment: An experiment where the outcome can be predicted with certainty (e.g., Ohm’s law verification).
  • Random (Stochastic) Experiment: An experiment where the outcome cannot be predicted with certainty, even if the experiment is repeated under identical conditions (e.g., tossing a coin, measuring noise voltage).

Sample Space (S)

The set of all possible outcomes of a random experiment.

  • Each outcome is a sample point.
  • Example: Rolling a die, S = {1, 2, 3, 4, 5, 6}.

Types of Sample Spaces

  1. Discrete Sample Space: Contains a finite or countably infinite number of sample points.
    • Example: Tossing a coin (S = {H, T}) or counting the number of calls arriving at a switch (0, 1, 2, ...).
  2. Continuous Sample Space: Contains an uncountably infinite number of sample points (a continuum).
    • Example: Measuring the exact arrival time of a signal (S = {t : t ≥ 0}) or measuring temperature.

3. Events

An Event is a subset of the sample space S.

  • Simple Event: An event containing only one sample point.
  • Compound Event: An event containing more than one sample point.
  • Sure Event: The sample space S itself (probability = 1).
  • Impossible (Null) Event: The empty set ∅ (probability = 0).

Relationships Between Events

  • Mutually Exclusive (Disjoint) Events: Two events A and B are mutually exclusive if A ∩ B = ∅, i.e., they cannot occur simultaneously.
  • Exhaustive Events: A collection of events B₁, B₂, ..., Bₙ is exhaustive if their union equals the sample space: B₁ ∪ B₂ ∪ ... ∪ Bₙ = S.

4. Probability Definitions and Axioms

1. Classical (A Priori) Definition

Based on the assumption of equally likely outcomes. If an experiment has N total outcomes and N_A outcomes favorable to event A:

P(A) = N_A / N

Limitation: Only valid for finite sample spaces with equally likely outcomes.
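With equally likely outcomes, the classical ratio P(A) = N_A / N is just counting; a quick sketch with the fair-die example:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # equally likely outcomes of a fair die
A = {s for s in S if s % 2 == 0}  # event: "even number"

P_A = Fraction(len(A), len(S))    # P(A) = N_A / N
print(P_A)  # 1/2
```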

2. Relative Frequency (Empirical) Definition

Based on repeating an experiment n times. If event A occurs n_A times:

P(A) = lim (n → ∞) n_A / n

Limitation: An experiment cannot be performed an infinite number of times; statistical regularity is assumed.
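The relative-frequency definition can be illustrated by simulation: estimate P(heads) for a fair coin from n repetitions (the seed below is fixed only to make the demo reproducible):

```python
import random

random.seed(0)  # reproducibility only; any seed illustrates the same point

n = 100_000
n_heads = sum(random.random() < 0.5 for _ in range(n))  # n_A
estimate = n_heads / n                                  # n_A / n ≈ P(A)
print(estimate)  # close to 0.5 for large n
```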

3. Axiomatic Definition (Kolmogorov's Axioms)

Let S be a sample space. Probability is a function P that assigns a real number P(A) to every event A, satisfying:

  • Axiom 1 (Non-negativity): P(A) ≥ 0 for any event A.
  • Axiom 2 (Normalization): P(S) = 1.
  • Axiom 3 (Additivity): If A and B are mutually exclusive (A ∩ B = ∅), then:

    P(A ∪ B) = P(A) + P(B)

    (This extends to countably infinite sequences of mutually exclusive events.)

5. Mathematical Model of Experiments

A rigorous mathematical model for a random experiment consists of a triplet (S, F, P), often called a Probability Space.

  1. Sample Space (S): The set of all outcomes.
  2. Sigma-Algebra / Field (F): A collection of subsets (events) of S that is closed under complementation and countable unions. This defines the "measurable" events.
  3. Probability Measure (P): A function P : F → [0, 1] satisfying the three axioms listed above.

6. Joint, Conditional, and Total Probability

Joint Probability

The probability that two events A and B occur simultaneously, denoted P(A ∩ B) (or P(AB)).

Conditional Probability

The probability of event B occurring given that event A has already occurred (where P(A) > 0):

P(B|A) = P(A ∩ B) / P(A)

Property: P(A ∩ B) = P(A) P(B|A) = P(B) P(A|B).
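The definition can be checked by direct counting on the fair-die sample space; a sketch with A = "outcome greater than 3" and B = "outcome is even":

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {4, 5, 6}   # outcome greater than 3
B = {2, 4, 6}   # outcome is even

def P(E):
    """Classical probability on the fair die."""
    return Fraction(len(E), len(S))

P_B_given_A = P(A & B) / P(A)   # P(B|A) = P(A ∩ B) / P(A)
print(P_B_given_A)  # 2/3
assert P(A & B) == P(A) * P_B_given_A  # multiplication rule
```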

Total Probability Theorem

If B₁, B₂, ..., Bₙ is a set of mutually exclusive and exhaustive events (a partition of S), then for any event A:

P(A) = Σᵢ P(A ∩ Bᵢ)

Using conditional probability:

P(A) = Σᵢ P(A|Bᵢ) P(Bᵢ)

Concept: This breaks a complex probability into smaller, manageable conditional parts based on the partition {Bᵢ}.
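A worked example (the figures are made up for illustration): three machines B₁, B₂, B₃ produce 50%, 30%, 20% of all parts, with defect rates 1%, 2%, 3%. The total probability that a randomly chosen part is defective:

```python
from fractions import Fraction

# Hypothetical partition {B1, B2, B3} and event A = "part is defective".
P_B = [Fraction(50, 100), Fraction(30, 100), Fraction(20, 100)]       # P(Bi)
P_A_given_B = [Fraction(1, 100), Fraction(2, 100), Fraction(3, 100)]  # P(A|Bi)

# Total probability: P(A) = Σ P(A|Bi) P(Bi)
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
print(P_A)  # 17/1000
```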


7. Bayes' Theorem and Independence

Bayes' Theorem

Used to find the "reverse" probability. If we know P(A|Bᵢ), Bayes' theorem allows us to find P(Bᵢ|A). Using the partition from the Total Probability Theorem:

P(Bᵢ|A) = P(A|Bᵢ) P(Bᵢ) / P(A)

Expanded form:

P(Bᵢ|A) = P(A|Bᵢ) P(Bᵢ) / Σⱼ P(A|Bⱼ) P(Bⱼ)

  • P(Bᵢ) is the Prior probability.
  • P(Bᵢ|A) is the Posterior probability.
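A sketch with hypothetical numbers (three machines with production shares 50%, 30%, 20% and defect rates 1%, 2%, 3%): given a defective part, the posterior probability that each machine produced it is:

```python
from fractions import Fraction

P_B = [Fraction(50, 100), Fraction(30, 100), Fraction(20, 100)]       # priors
P_A_given_B = [Fraction(1, 100), Fraction(2, 100), Fraction(3, 100)]  # P(A|Bi)

P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))  # denominator (total prob.)
posteriors = [pa * pb / P_A for pa, pb in zip(P_A_given_B, P_B)]
print(posteriors)  # [Fraction(5, 17), Fraction(6, 17), Fraction(6, 17)]
assert sum(posteriors) == 1  # the posteriors form a distribution
```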

Independent Events

Two events and are statistically independent if the occurrence of one does not affect the probability of the other.

  • Condition: P(A ∩ B) = P(A) P(B)
  • Equivalent: P(A|B) = P(A) and P(B|A) = P(B)

Note: Do not confuse "Mutually Exclusive" with "Independent". Mutually exclusive events are highly dependent (if one happens, the other cannot).
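Both the product condition and the contrast with mutual exclusivity can be checked on the fair die; a sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def P(E):
    """Classical probability on the fair die."""
    return Fraction(len(E), len(S))

A = {2, 4, 6}  # "even"
B = {1, 2}     # "at most 2"
# Independent: P(A ∩ B) = P(A) P(B)  →  1/6 = (1/2)(1/3)
assert P(A & B) == P(A) * P(B)

C = {1, 3, 5}  # "odd": mutually exclusive with A, since A ∩ C = ∅
assert P(A & C) == 0
assert P(A & C) != P(A) * P(C)  # disjoint but NOT independent
print("independence and exclusivity checks pass")
```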


8. Bernoulli's Trials

A Bernoulli trial is a random experiment with exactly two possible outcomes: "Success" and "Failure".

Properties:

  1. There are only two outcomes.
  2. The probability of success is p, constant across trials.
  3. The probability of failure is q = 1 − p.
  4. Trials are independent.

Binomial Probability Law:
The probability of obtaining exactly k successes in n independent Bernoulli trials is:

P(k successes in n trials) = C(n, k) p^k q^(n−k)

Where C(n, k) = n! / (k! (n − k)!).
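The binomial law is straightforward to compute with `math.comb`; a sketch for n = 4 fair coin tosses:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent Bernoulli trials)."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# Probability of exactly 2 heads in 4 fair tosses: C(4,2) (1/2)^2 (1/2)^2
print(binomial_pmf(2, 4, 0.5))  # 0.375

# Sanity check: the probabilities over k = 0..n sum to 1.
assert abs(sum(binomial_pmf(k, 4, 0.5) for k in range(5)) - 1) < 1e-12
```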


9. The Random Variable (RV)

Definition

A Random Variable X is a real-valued function that maps every outcome ω in the sample space S to a real number:

X : S → ℝ, ω ↦ X(ω)

Example: Tossing a coin twice. S = {HH, HT, TH, TT}. Let X be the number of heads.
X(HH) = 2, X(HT) = X(TH) = 1, X(TT) = 0.
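The two-toss mapping can be written out explicitly as a dictionary from outcomes to values, with the induced PMF read off from the equally likely outcomes:

```python
# X = "number of heads" as an explicit map from outcomes to reals.
S = ["HH", "HT", "TH", "TT"]
X = {omega: omega.count("H") for omega in S}
print(X)  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}

# Induced PMF for a fair coin: each outcome has probability 1/4.
pmf = {}
for omega in S:
    pmf[X[omega]] = pmf.get(X[omega], 0) + 0.25
print(pmf)  # {2: 0.25, 1: 0.5, 0: 0.25}
```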

Conditions for a Function to be a Random Variable

For a function X to be a valid random variable:

  1. It must be single-valued (each outcome maps to exactly one number).
  2. For every real number x, the set {X ≤ x} must be an event (i.e., it must belong to the field F so that we can calculate its probability).
  3. P(X = ∞) = 0 and P(X = −∞) = 0.

10. Types of Random Variables

1. Discrete Random Variable

  • Definition: An RV that takes on a countable number of distinct values (finite or countably infinite).
  • Range: A countable set, e.g., {x₁, x₂, x₃, ...}
  • Description: Described by a Probability Mass Function (PMF), denoted P(X = xᵢ).
  • Example: Outcome of a die roll, number of defective items in a batch.

2. Continuous Random Variable

  • Definition: An RV that takes on an infinite number of values within a continuous interval (uncountable).
  • Range: An interval on the real line, e.g., [a, b] or (−∞, ∞).
  • Description: Described by a Probability Density Function (PDF), denoted f_X(x).
    • Probability at a specific point is 0: P(X = x) = 0.
    • Probability is calculated over intervals: P(a ≤ X ≤ b) = ∫ₐᵇ f_X(x) dx.
  • Example: Noise voltage, temperature, height of students.
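Interval probabilities for a continuous RV come from integrating the PDF. A sketch using an assumed exponential density f_X(x) = e^(−x) for x ≥ 0, whose integral has the closed form F(x) = 1 − e^(−x):

```python
from math import exp

def F(x):
    """CDF of the assumed exponential density f_X(x) = exp(-x), x ≥ 0."""
    return 1 - exp(-x)

a, b = 1.0, 2.0
prob = F(b) - F(a)  # P(a ≤ X ≤ b) = ∫ f_X(x) dx over [a, b]
print(round(prob, 4))  # 0.2325

# A single point carries zero probability: P(X = a) = F(a) - F(a) = 0.
assert F(a) - F(a) == 0.0
```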

3. Mixed Random Variable

  • Definition: An RV for which the Cumulative Distribution Function (CDF) exhibits both jump discontinuities (discrete behavior) and continuous growth intervals.
  • It contains outcomes that occur with non-zero probability (discrete points) and ranges where probability is spread continuously.
  • Representation: The density function involves Dirac delta functions (δ(x − xᵢ)) at the discrete points xᵢ.
  • Example: The waiting time at a traffic light. The probability of waiting 0 seconds is non-zero (arriving on green), while for a red light the waiting time is spread continuously over an interval.