Unit 1: Linear Algebra
1. Review of Matrices
Definition
A matrix of order $m \times n$ is a rectangular array of numbers (real or complex) arranged in rows and columns, enclosed by brackets:
$$A = [a_{ij}]_{m \times n}$$
where $m$ is the number of rows and $n$ is the number of columns.
Important Types of Matrices
- Square Matrix: Number of rows equals number of columns ($m = n$).
- Diagonal Matrix: A square matrix where all non-diagonal elements are zero ($a_{ij} = 0$ for $i \neq j$).
- Identity (Unit) Matrix ($I$): A diagonal matrix where all diagonal elements are 1.
- Null (Zero) Matrix ($O$): All elements are zero.
- Upper Triangular: All elements below the main diagonal are zero.
- Lower Triangular: All elements above the main diagonal are zero.
- Symmetric Matrix: $A^T = A$ (Elements are symmetric about the main diagonal).
- Skew-Symmetric Matrix: $A^T = -A$ (Diagonal elements must be zero).
- Orthogonal Matrix: $A A^T = A^T A = I$ (Implying $A^{-1} = A^T$).
2. Elementary Operations of Matrices
Elementary operations are fundamental manipulations used to find the rank, inverse, or solve linear systems. They do not alter the order of the matrix.
Elementary Row Operations (ERO)
There are three valid operations:
- Interchange: Interchanging any two rows ($R_i \leftrightarrow R_j$).
- Scaling: Multiplying a row by a non-zero constant ($R_i \to k R_i$, $k \neq 0$).
- Addition: Adding a multiple of one row to another ($R_i \to R_i + k R_j$).
Note: Similar operations apply to columns (Column Operations), but row operations are standard for Gaussian elimination.
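The three row operations can be applied directly to a numpy array. The $2 \times 2$ matrix below is a made-up example; the fancy-indexing swap is one common numpy idiom:

```python
import numpy as np

A = np.array([[2., 4.],
              [1., 3.]])

A[[0, 1]] = A[[1, 0]]        # Interchange: R1 <-> R2
A[0] = 3 * A[0]              # Scaling: R1 -> 3*R1 (k must be non-zero)
A[1] = A[1] + (-2) * A[0]    # Addition: R2 -> R2 + (-2)*R1

# Interchange flips the sign of the determinant, scaling multiplies it
# by k, and addition leaves it unchanged -- but the rank never changes.
print(A)
```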
3. Rank of a Matrix
Definition
The rank of a matrix $A$, denoted as $\rho(A)$ or $r(A)$, is the order of the largest non-vanishing (non-zero) minor of the matrix.
- Minor: The determinant of a square sub-matrix.
- If $\rho(A) = r$, then at least one minor of order $r$ is non-zero, and every minor of order $r + 1$ is zero.
Methods to Find Rank
1. Echelon Form Method (Recommended)
Transform the matrix using EROs to an Upper Triangular (Echelon) form.
- All zero rows must be at the bottom.
- Each non-zero row has more leading zeros than the row above it.
- Result: $\rho(A)$ = Number of non-zero rows in the Echelon form.
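The echelon-form count can be checked numerically. Note that `numpy.linalg.matrix_rank` uses an SVD internally rather than row reduction, but it returns the same rank; the matrix below is an illustrative example:

```python
import numpy as np

# The third row is the sum of the first two, so row reduction leaves
# exactly two non-zero rows in echelon form.
A = np.array([[1, 2, 3],
              [2, 4, 7],
              [3, 6, 10]])

rank = np.linalg.matrix_rank(A)
print(rank)  # 2
```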
2. Normal Form (Canonical Form)
Reduce the matrix using both row and column operations to the form:
$$\begin{bmatrix} I_r & O \\ O & O \end{bmatrix}$$
Here, $I_r$ is the identity matrix of order $r$.
- Result: $\rho(A) = r$.
4. Linear Dependence and Independence of Vectors
Consider a set of $n$ vectors $x_1, x_2, \dots, x_n$.
Consider the linear combination:
$$c_1 x_1 + c_2 x_2 + \dots + c_n x_n = 0$$
where $c_1, c_2, \dots, c_n$ are scalars.
Linear Independence
The vectors are Linearly Independent if the equation above holds only when all scalars are zero: $c_1 = c_2 = \dots = c_n = 0$.
Linear Dependence
The vectors are Linearly Dependent if there exists a set of scalars, not all zero, such that the relation holds. One vector can be expressed as a linear combination of the others.
Testing Dependence via Matrices
Form a matrix where the vectors are rows (or columns).
- Calculate the determinant (if square). If $\det = 0$, the vectors are dependent. If $\det \neq 0$, the vectors are independent.
- Find the Rank.
  - If Rank $=$ number of vectors, they are Independent.
  - If Rank $<$ number of vectors, they are Dependent.
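Both tests can be sketched in a few lines of numpy; the three vectors below are made up so that one is a scalar multiple of another:

```python
import numpy as np

v1 = [1, 2, 3]
v2 = [2, 4, 6]   # v2 = 2*v1, so the set {v1, v2, v3} is dependent
v3 = [0, 1, 1]
M = np.array([v1, v2, v3])   # vectors as rows

det = np.linalg.det(M)           # 0 for a dependent square set
rank = np.linalg.matrix_rank(M)  # rank < 3 (number of vectors) => dependent
```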
5. Solution of Linear System of Equations
A system of linear equations can be written in matrix form as:
$$AX = B$$
- $A$: Coefficient matrix
- $X$: Column matrix of variables (unknowns)
- $B$: Column matrix of constants
Case 1: Non-Homogeneous System ($B \neq 0$)
Form the Augmented Matrix $[A \mid B]$.
Let $n$ be the number of unknowns.
Rouche-Capelli Theorem:
- Inconsistent (No Solution): $\rho(A) \neq \rho([A \mid B])$
- Consistent (Solution exists): $\rho(A) = \rho([A \mid B])$
  - Unique Solution: If $\rho(A) = \rho([A \mid B]) = n$ (Rank equals number of variables).
  - Infinite Solutions: If $\rho(A) = \rho([A \mid B]) < n$. (The system has free variables/parameters.)
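The three cases can be sketched as a small classifier (the function name `classify_system` is my own, not standard terminology):

```python
import numpy as np

def classify_system(A, B):
    """Classify AX = B by comparing ranks of A and the augmented matrix."""
    n = A.shape[1]                          # number of unknowns
    aug = np.hstack([A, B.reshape(-1, 1)])  # the augmented matrix [A | B]
    rA = np.linalg.matrix_rank(A)
    rAug = np.linalg.matrix_rank(aug)
    if rA != rAug:
        return "inconsistent"
    return "unique" if rA == n else "infinite"

A = np.array([[1., 1.],
              [1., -1.]])
B = np.array([3., 1.])
print(classify_system(A, B))   # "unique" (the solution is x = 2, y = 1)
```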
Case 2: Homogeneous System ($B = 0$)
The system is $AX = 0$. This system is always consistent (it has at least the trivial solution $X = 0$).
- Trivial Solution (Zero solution only): If $\rho(A) = n$ (or $\det A \neq 0$).
- Non-Trivial Solution (Infinite non-zero solutions): If $\rho(A) < n$ (or $\det A = 0$).
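For the homogeneous case, a non-trivial solution can be read off from the null space of $A$ (computed here via SVD; the example matrix is singular by construction):

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.]])   # det = 0, rank = 1 < n = 2

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
has_nontrivial = rank < n    # True: infinitely many non-zero solutions

# The last right-singular vector spans the null space of A,
# giving one concrete non-trivial solution X.
_, _, Vt = np.linalg.svd(A)
X = Vt[-1]
print(A @ X)                 # approximately [0, 0]
```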
6. Inverse of Matrices
The inverse of a square matrix $A$ exists only if $A$ is non-singular ($\det A \neq 0$). It is denoted $A^{-1}$, such that $A A^{-1} = A^{-1} A = I$.
Gauss-Jordan Method (Elementary Operation Method)
This is an algorithmic approach suitable for engineering computations.
- Write the augmented matrix $[A \mid I]$.
- Apply Row Operations to reduce $A$ to the Identity matrix $I$. The same operations apply to $I$ on the right side.
- The matrix obtained on the right side is the inverse $A^{-1}$.
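The steps above can be sketched as code. This is a minimal version with partial pivoting added for numerical stability (pivoting is not part of the hand procedure, but it is the same interchange operation):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # Interchange: bring the largest available pivot into place.
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot, col], 0):
            raise ValueError("matrix is singular")
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]              # Scaling: make the pivot 1
        for row in range(n):
            if row != col:                     # Addition: clear the column
                aug[row] -= aug[row, col] * aug[col]
    return aug[:, n:]

A = np.array([[2., 1.],
              [1., 1.]])
print(gauss_jordan_inverse(A))   # [[ 1. -1.], [-1.  2.]]
```

In practice `numpy.linalg.inv` is preferred; the function above only mirrors the hand method.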
7. Eigenvalues and Eigenvectors
Definitions
Let $A$ be a square matrix of order $n$.
If there exists a scalar $\lambda$ and a non-zero column vector $X$ such that:
$$AX = \lambda X \quad \text{or} \quad (A - \lambda I)X = 0$$
Then:
- $\lambda$ is called the Eigenvalue (Characteristic root / Latent root).
- $X$ is called the Eigenvector corresponding to $\lambda$.
Characteristic Equation
To find eigenvalues, the system $(A - \lambda I)X = 0$ must have a non-trivial solution (since $X \neq 0$). This implies the determinant of the coefficient matrix must be zero:
$$\det(A - \lambda I) = 0$$
This polynomial equation in $\lambda$ is called the Characteristic Equation.
Procedure
- Form the Characteristic Equation $\det(A - \lambda I) = 0$.
- Solve the equation to find the roots $\lambda_1, \lambda_2, \dots, \lambda_n$. These are the eigenvalues.
- For each eigenvalue $\lambda_i$, substitute it back into $(A - \lambda_i I)X = 0$ and solve for $X$ using matrix row reduction.
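A worked check of the procedure on an illustrative $2 \times 2$ matrix; `numpy.linalg.eig` returns all eigenvalues and eigenvectors at once:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

# Characteristic equation: det(A - lambda*I) = lambda^2 - 7*lambda + 10 = 0,
# with roots lambda = 5 and lambda = 2.
eigvals, eigvecs = np.linalg.eig(A)

# Each column of eigvecs satisfies A X = lambda X.
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)
print(np.sort(eigvals))   # [2. 5.]
```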
8. Properties of Eigenvalues
These properties are essential for simplifying calculations. Let $A$ be a square matrix with eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$.
- Sum Property: The sum of the eigenvalues equals the Trace of $A$ (the sum of the main diagonal elements): $\sum \lambda_i = \operatorname{tr}(A)$.
- Product Property: The product of the eigenvalues equals the Determinant of $A$: $\prod \lambda_i = \det A$.
- Triangular/Diagonal Matrices: The eigenvalues of a triangular or diagonal matrix are exactly the diagonal elements themselves.
- Inverse: If $\lambda$ is an eigenvalue of $A$, then $1/\lambda$ is an eigenvalue of $A^{-1}$ (provided $\lambda \neq 0$).
- Power: If $\lambda$ is an eigenvalue of $A$, then $\lambda^k$ is an eigenvalue of $A^k$ (where $k$ is a positive integer).
- Scalar Multiplication: If $\lambda$ is an eigenvalue of $A$, then $k\lambda$ is an eigenvalue of $kA$.
- Symmetric Matrices: The eigenvalues of a real symmetric matrix are always Real.
- Similar Matrices: If and are similar matrices, they have the same eigenvalues.
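Several of these properties can be spot-checked numerically on a small symmetric example (`numpy.linalg.eigvalsh` is used because it guarantees real, sorted eigenvalues for symmetric matrices):

```python
import numpy as np

A = np.array([[2., 0., 1.],
              [0., 3., 0.],
              [1., 0., 2.]])   # real symmetric, so eigenvalues are real

eig = np.linalg.eigvalsh(A)    # ascending eigenvalues: [1, 3, 3]

sum_ok = np.isclose(eig.sum(), np.trace(A))          # sum = trace = 7
prod_ok = np.isclose(eig.prod(), np.linalg.det(A))   # product = det = 9
# Eigenvalues of A^-1 are the reciprocals 1/lambda.
inv_ok = np.allclose(np.sort(1 / eig),
                     np.linalg.eigvalsh(np.linalg.inv(A)))
```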
9. Cayley-Hamilton Theorem
Statement
"Every square matrix satisfies its own characteristic equation."
If the characteristic equation of a square matrix $A$ is:
$$\lambda^n + c_1 \lambda^{n-1} + \dots + c_{n-1} \lambda + c_n = 0$$
Then, replacing $\lambda$ with the matrix $A$ and the constant term $c_n$ with $c_n I$:
$$A^n + c_1 A^{n-1} + \dots + c_{n-1} A + c_n I = 0$$
(Where $0$ represents the Null Matrix.)
Applications of Cayley-Hamilton Theorem
- To Find the Inverse ($A^{-1}$):
  Multiply the matrix equation $A^n + c_1 A^{n-1} + \dots + c_{n-1} A + c_n I = 0$ by $A^{-1}$ and solve:
  $$A^{-1} = -\frac{1}{c_n}\left(A^{n-1} + c_1 A^{n-2} + \dots + c_{n-1} I\right)$$
- To Find Higher Powers ($A^k$):
  If you need to calculate a high power (e.g., $A^8$) without multiplying eight times, you can use the characteristic equation to express higher powers in terms of lower powers of $A$.
Verification Steps
- Find the Characteristic Equation in terms of $\lambda$.
- Substitute $A$ for $\lambda$ in the equation.
- Calculate powers of $A$ ($A^2$, $A^3$, etc.).
- Perform matrix addition/subtraction according to the equation.
- Verify the result is the Null Matrix.
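The verification steps above, together with the inverse application, sketched for an illustrative $2 \times 2$ matrix:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
I = np.eye(2)

# Characteristic equation: lambda^2 - 5*lambda - 2 = 0
# (coefficients come from trace(A) = 5 and det(A) = -2).
residual = A @ A - 5 * A - 2 * I
print(np.allclose(residual, 0))   # True: A satisfies its own equation

# Rearranging A^2 - 5A - 2I = 0 gives A^-1 = (A - 5I) / 2.
A_inv = (A - 5 * I) / 2
print(np.allclose(A @ A_inv, I))  # True
```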