11 - Principal Component Analysis and Autoencoders

Class: CSCE-421


Notes:

We will use linear algebra only!

Matrix times Vector - 2 Ways

image-43.png

At first, you learn the row picture (Mv1): each entry of Ax is the dot product of a row of A with x. Once you get used to the column picture (Mv2), you can read Ax as a linear combination of the columns of A. Those products fill the column space of A, denoted C(A). The solution space of Ax = 0 is the nullspace of A, denoted N(A).
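A minimal NumPy sketch of the two views, assuming a small 2x2 example; both views build the same vector Ax:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Row picture (Mv1): each entry of Ax is a row of A dotted with x.
row_view = np.array([A[0] @ x, A[1] @ x])

# Column picture (Mv2): Ax is a linear combination of the columns of A,
# weighted by the entries of x -- a vector in the column space C(A).
col_view = x[0] * A[:, 0] + x[1] * A[:, 1]

print(row_view)                      # [17. 39.]
print(np.allclose(col_view, A @ x))  # True
```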

Notes:

The Art of Linear Algebra - Vector times matrix - 2 Ways

image-44.png

A row vector y is multiplied by each of the two column vectors of A, and the two dot products become the elements of yA.

image-45.png

The product yA is a linear combination of the row vectors of A.
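The same two views hold on the left side. A small sketch, assuming a 2x2 example, checking that the dot-product view and the row-combination view of yA agree:

```python
import numpy as np

y = np.array([2.0, 3.0])
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Dot-product view: each entry of yA is y dotted with a column of A.
dot_view = np.array([y @ A[:, 0], y @ A[:, 1]])

# Row view: yA is a linear combination of the rows of A,
# weighted by the entries of y.
row_comb = y[0] * A[0] + y[1] * A[1]

print(np.allclose(dot_view, y @ A))   # True
print(np.allclose(row_comb, y @ A))   # True
```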

Notes:

Matrix times Matrix - 4 ways

image-46.png
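A sketch of two of the four views, assuming the standard presentation (dot products, column combinations, row combinations, and sums of rank-1 outer products); the outer-product view is the one PCA will lean on:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Outer-product view: AB = sum over k of (column k of A)(row k of B),
# i.e., a sum of rank-1 matrices.
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

# Column view: column j of AB is A times column j of B.
col_view = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

print(np.allclose(outer_sum, A @ B))  # True
print(np.allclose(col_view, A @ B))   # True
```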

Notes:

Practical Patterns

image-47.png

Burn this into your memory and you can see ...

Notes:

image-48.png

Notes:

Orthogonal Matrices

(1) An orthogonal matrix is a square matrix whose columns and rows are orthogonal unit vectors, i.e., orthonormal vectors. That is, if a matrix Q is an orthogonal matrix, we have

Q^T Q = Q Q^T = I.

(2) It leads to Q^-1 = Q^T, which is a very useful property as it provides an easy way to compute the inverse.

(3) For an orthogonal n×n matrix Q = [q_1, q_2, ..., q_n], where q_i ∈ R^n, i = 1, 2, ..., n, it is easy to see that q_i^T q_j = 0 when i ≠ j and q_i^T q_i = 1.

(4) Furthermore, suppose Q_1 = [q_1, q_2, ..., q_i] and Q_2 = [q_{i+1}, q_{i+2}, ..., q_n]; we have Q_1^T Q_1 = I, Q_2^T Q_2 = I, but Q_1 Q_1^T ≠ I, Q_2 Q_2^T ≠ I.
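Properties (1)-(4) can be checked numerically. A sketch, assuming we build an orthogonal Q from the QR factorization of a random matrix:

```python
import numpy as np

# Build a random orthogonal 4x4 matrix via QR factorization.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))

I = np.eye(4)
print(np.allclose(Q.T @ Q, I))              # True: orthonormal columns
print(np.allclose(Q @ Q.T, I))              # True: orthonormal rows
print(np.allclose(np.linalg.inv(Q), Q.T))   # True: Q^-1 = Q^T

# Split Q into its first two and last two columns.
Q1, Q2 = Q[:, :2], Q[:, 2:]
print(np.allclose(Q1.T @ Q1, np.eye(2)))    # True:  Q_1^T Q_1 = I
print(np.allclose(Q1 @ Q1.T, I))            # False: Q_1 Q_1^T != I
```

The last line fails because Q_1 Q_1^T is only a projection onto the span of the first two columns, not the full identity.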

Notes:


Eigen-Decomposition

(1) A square n×n matrix S with n linearly independent eigenvectors can be factorized as

S = QΛQ^-1

where Q is the square n×n matrix whose columns are eigenvectors of S, and Λ is the diagonal matrix whose diagonal elements are the corresponding eigenvalues.

(2) Note that only diagonalizable matrices can be factorized in this way.

(3) If S is a symmetric matrix, its eigenvectors are orthogonal. Thus Q is an orthogonal matrix and we have

S = QΛQ^T.
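A minimal sketch of the symmetric case, assuming a small 2x2 example; NumPy's `eigh` routine is designed for symmetric matrices and returns orthonormal eigenvectors:

```python
import numpy as np

# A symmetric matrix and its eigen-decomposition S = Q Λ Q^T.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, Q = np.linalg.eigh(S)   # eigh: symmetric eigensolver
Lam = np.diag(eigvals)

print(eigvals)                          # [1. 3.]
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal
print(np.allclose(Q @ Lam @ Q.T, S))    # True: S is reconstructed
```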

Notes:

Question: