Unit 2 - Subjective Questions

INT255 • Practice Questions with Detailed Answers

1

Define Eigen decomposition. Explain the geometric interpretation of eigenvectors and eigenvalues in the context of linear transformations.
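A minimal NumPy sketch (the matrix is an arbitrary toy example) illustrating the geometric fact this question asks about: an eigenvector keeps its direction under the transformation, being only scaled by its eigenvalue, and the matrix can be rebuilt from its eigendecomposition.

```python
import numpy as np

# Toy diagonalizable matrix chosen for illustration.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Each eigenvector column v satisfies A v = lambda v: the transformation
# only stretches v, it does not rotate it.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# Eigendecomposition: A = V diag(lambda) V^{-1}.
A_rebuilt = eigvecs @ np.diag(eigvals) @ np.linalg.inv(eigvecs)
```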

2

Discuss the primary limitations of Eigen decomposition when applied to general machine learning problems, particularly concerning matrix properties.

3

Define Singular Value Decomposition (SVD). List and briefly describe the components it decomposes a matrix into.
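A short NumPy sketch (rectangular toy matrix chosen for illustration) showing the three SVD components and their key properties: `U` and `Vt` are orthogonal, the singular values come back in descending order, and the product recovers the original matrix.

```python
import numpy as np

# A 2 x 3 (rectangular) matrix: SVD works where eigendecomposition does not.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

U, S, Vt = np.linalg.svd(A)   # U: 2x2, S: singular values, Vt: 3x3

# Rebuild A = U @ Sigma @ Vt, with Sigma padded to the shape of A.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, S)
A_rebuilt = U @ Sigma @ Vt
```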

4

Explain the geometric intuition behind Singular Value Decomposition (SVD). How can it be visualized?

5

Derive the relationship between Singular Value Decomposition (SVD) and Eigen decomposition. Specifically, how can singular values and singular vectors be found through Eigen decomposition?
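The relationship the question asks you to derive can be checked numerically: the eigenvalues of A^T A are the squared singular values of A. A small sketch (toy matrix chosen for illustration):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

U, S, Vt = np.linalg.svd(A)

# A^T A is symmetric; its eigenvalues are the squared singular values of A,
# and its eigenvectors are the right singular vectors of A.
eigvals = np.linalg.eigvalsh(A.T @ A)      # ascending order
sing_from_eig = np.sqrt(eigvals[::-1])     # reversed to match descending S
```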

6

Explain the concept of low-rank approximation using SVD. How is it beneficial for data compression and noise reduction?
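A hedged NumPy sketch of low-rank approximation: a true rank-2 matrix is corrupted with small noise, and truncating the SVD to its two largest singular values recovers a rank-2 matrix that discards most of the noise (the Eckart-Young result the question alludes to). The sizes and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 8))  # true rank 2
A_noisy = A + 0.01 * rng.standard_normal(A.shape)              # add small noise

U, S, Vt = np.linalg.svd(A_noisy, full_matrices=False)

# Keep only the k largest singular values/vectors: best rank-k approximation
# in the Frobenius norm; also far cheaper to store than the full matrix.
k = 2
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]
```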

7

Describe the objective of Principal Component Analysis (PCA) from a geometric perspective. How does it achieve dimensionality reduction?

8

Derive the first principal component using an optimization approach, specifically by maximizing variance. Assume the data is centered.
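The variance-maximization derivation can be verified numerically: on centred data, the unit vector maximizing the variance of the projections is the top eigenvector of the sample covariance matrix, and the maximum variance equals the largest eigenvalue. A sketch on synthetic data (shapes and scaling are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
# Anisotropic 2-D data: std 3 along x, std 0.5 along y.
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
X = X - X.mean(axis=0)                 # centre, as the question assumes

C = X.T @ X / (X.shape[0] - 1)         # sample covariance
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
w1 = eigvecs[:, -1]                    # first principal component (unit norm)

# Variance of the projections onto w1 equals the largest eigenvalue.
proj_var = np.var(X @ w1, ddof=1)
```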

9

Explain the relationship between PCA and SVD. How can SVD be used to perform PCA?
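A small sketch demonstrating the PCA-SVD link this question targets: the squared singular values of the centred data matrix, divided by n - 1, equal the eigenvalues of the covariance matrix, and the rows of Vt are the principal directions (up to sign). The data here is random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))
Xc = X - X.mean(axis=0)                         # PCA requires centred data

# Route 1: eigendecomposition of the covariance matrix.
C = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(C)            # ascending order

# Route 2: SVD of the centred data matrix itself.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_from_svd = S**2 / (Xc.shape[0] - 1)         # descending order
```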

10

Discuss the criteria and methods for choosing the optimal number of principal components (k) in PCA for dimensionality reduction.
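One standard criterion the answer should cover, the cumulative explained-variance ratio, can be sketched as follows (synthetic data with two strong directions embedded in five dimensions; the 95% threshold is a common but arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
# Rank-2 signal in 5 dimensions, plus small isotropic noise.
X = rng.standard_normal((300, 2)) @ rng.standard_normal((2, 5))
X = X + 0.05 * rng.standard_normal(X.shape)
Xc = X - X.mean(axis=0)

_, S, _ = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)        # explained-variance ratio per component
cumulative = np.cumsum(explained)

# Smallest k whose components explain at least 95% of total variance.
k = int(np.searchsorted(cumulative, 0.95) + 1)
```

The same `cumulative` curve plotted against the component index gives the "elbow"/scree plot, the other common selection method.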

11

Compare and contrast Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), highlighting their primary objectives and when each is preferred.

12

Explain the core idea behind Linear Discriminant Analysis (LDA) and describe the concepts of the within-class scatter matrix (S_W) and the between-class scatter matrix (S_B).

13

Describe the steps involved in performing Linear Discriminant Analysis (LDA) for a classification task.
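The steps an answer should describe can be sketched for the two-class case, where the Fisher direction has the closed form w ∝ S_W^{-1}(μ1 - μ0). The toy data, class means, and midpoint threshold below are illustrative choices, not part of the question:

```python
import numpy as np

# Step 0: toy two-class data in 2-D, classes separated along (4, 4).
rng = np.random.default_rng(4)
X0 = rng.standard_normal((50, 2))
X1 = rng.standard_normal((50, 2)) + np.array([4.0, 4.0])

# Step 1: class means.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)

# Step 2: within-class scatter S_W (sum of per-class scatter matrices).
Sw = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)

# Step 3: discriminant direction w ∝ S_W^{-1} (mu1 - mu0).
w = np.linalg.solve(Sw, mu1 - mu0)
w = w / np.linalg.norm(w)

# Step 4: project onto w and classify with a threshold at the midpoint
# of the projected class means.
proj0, proj1 = X0 @ w, X1 @ w
threshold = (proj0.mean() + proj1.mean()) / 2
accuracy = (np.sum(proj0 < threshold) + np.sum(proj1 > threshold)) / 100
```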

14

Discuss a significant limitation of Linear Discriminant Analysis (LDA) and explain why it can be problematic in certain real-world datasets.

15

Explain how matrix factorization is utilized in collaborative filtering for recommendation systems. Provide a high-level overview of the process.
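The high-level process can be sketched with a tiny factorization R ≈ P Qᵀ fitted by gradient descent on the observed ratings only. All names (`P` for user factors, `Q` for item factors) and hyperparameters here are illustrative assumptions, and `0` marks an unrated cell:

```python
import numpy as np

# Toy user-item ratings matrix; 0 = unrated.
R = np.array([[5.0, 3.0, 0.0, 1.0],
              [4.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 5.0],
              [0.0, 1.0, 5.0, 4.0]])
mask = R > 0
n_users, n_items, k = R.shape[0], R.shape[1], 2   # k latent features

rng = np.random.default_rng(5)
P = 0.1 * rng.standard_normal((n_users, k))       # user latent factors
Q = 0.1 * rng.standard_normal((n_items, k))       # item latent factors

lr, reg = 0.01, 0.02                              # illustrative hyperparameters
for _ in range(2000):
    E = mask * (R - P @ Q.T)                      # error on observed cells only
    P += lr * (E @ Q - reg * P)
    Q += lr * (E.T @ P - reg * Q)

R_hat = P @ Q.T        # dense predictions: unrated cells are now filled in
rmse = np.sqrt(np.mean((R - R_hat)[mask] ** 2))
```

The filled-in entries of `R_hat` are the predicted ratings used to rank candidate items for each user.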

16

Describe the advantages of using SVD-based matrix factorization for personalized recommendations, specifically mentioning the role of latent features.

17

Define the concept of an orthogonal matrix. Explain its significance in the context of Eigen decomposition and SVD.
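The defining property (Qᵀ Q = I, so Q preserves lengths and represents a pure rotation/reflection) can be checked directly on the orthogonal factors that SVD produces. The matrix below is an arbitrary example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
U, S, Vt = np.linalg.svd(A)

for Q in (U, Vt):
    n = Q.shape[0]
    # Orthogonal: transpose is the inverse.
    assert np.allclose(Q.T @ Q, np.eye(n))
    # Length-preserving: ||Q x|| = ||x|| for any x.
    x = np.ones(n)
    assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```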

18

Explain why Eigen decomposition is generally not suitable for dimensionality reduction of rectangular data matrices, and how SVD addresses this limitation.

19

Describe two real-world applications of matrix factorization techniques (beyond recommendation systems) in machine learning or data analysis.

20

When might PCA fail or provide suboptimal results, and what are its inherent limitations?