Adam Panagos / Engineer / Lecturer

###### Applied Linear Algebra: Eigenvalues and Eigenvectors

###### An eigenvector of the matrix A is a vector v such that Av = Lv. We provide several examples of computing eigenvalues and eigenvectors, as well as how they can be used to diagonalize a matrix.

#### Checking for an Eigenvector

9/5/2016

Running Time: 4:51

An eigenvector of the matrix A is a vector v such that Av = Lv. In other words, when the matrix A "acts on" the vector v, the result is the same vector v just scaled by the number L. The number L is the eigenvalue associated with the eigenvector v.

In this video we are given a matrix A and vectors v1 and v2. We compute Av1 and Av2 to see whether the results have the form Lv1 and Lv2, i.e., to check whether v1 and v2 are eigenvectors.

#### Basis for an Eigenspace

9/5/2016

Running Time: 7:48

An eigenvector of a matrix is a vector v that satisfies Av = Lv. In other words, after A "acts on" the vector v, the result is the vector v just scaled by the constant number L.

In this example, we find the eigenvectors of a given 3x3 matrix. This is done by finding the null space of the matrix A-LI. The equation (A-LI)x = 0 always has infinitely many solutions for the vector x, so we describe the solution set by finding a basis for it; hence the "basis for an eigenspace" terminology.
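One way to find such a basis numerically is via the SVD of A - LI, whose right-singular vectors with (near-)zero singular values span the null space. A sketch, using a hypothetical 3x3 matrix rather than the one from the video:

```python
import numpy as np

# Hypothetical 3x3 matrix with the eigenvalue L = 2 repeated twice
# (not the matrix from the video)
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
L = 2.0

# The eigenspace for L is the null space of A - L*I.
M = A - L * np.eye(3)
_, s, Vt = np.linalg.svd(M)
basis = Vt[s <= 1e-10]        # rows of Vt paired with ~zero singular values
print(basis)                  # each row is a basis vector of the eigenspace
print("eigenspace dimension:", basis.shape[0])
```

Here the eigenvalue L = 2 has multiplicity two, so the basis contains two vectors.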

#### Eigenvalue and Eigenvector Computations Example

12/23/2013

Running Time: 16:39

Eigenvalues and eigenvectors play an important role in many signal processing applications. This example demonstrates the mechanics of computing the eigenvalues and eigenvectors of a specific 3x3 matrix. Eigenvalues are always the roots of the matrix characteristic equation det(A-LI) = 0, while the eigenvectors are found by computing the null space of the matrix A-LI for each eigenvalue L.
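In practice, numpy performs both computations in one call. A short sketch with a hypothetical symmetric 3x3 matrix (not the matrix worked in the video):

```python
import numpy as np

# Hypothetical 3x3 matrix (not the one from the video)
A = np.array([[4.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 4.0]])

# np.linalg.eig returns all eigenvalues and eigenvectors at once
vals, vecs = np.linalg.eig(A)
for L, v in zip(vals, vecs.T):         # eigenvectors are the COLUMNS of vecs
    assert np.allclose(A @ v, L * v)   # verify Av = Lv for each pair
print(np.sort(vals))                   # [3. 3. 5.]
```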

#### Eigenvalue Computation #1

9/5/2016

Running Time: 4:13

This video provides another simple example of computing the eigenvalues of a matrix. First, we compute the characteristic polynomial of the matrix, det(A-LI). Then we find the roots of the characteristic polynomial by solving det(A-LI) = 0. The solutions to this equation are, by definition, the eigenvalues of the matrix.
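The two-step procedure (characteristic polynomial, then roots) maps directly onto numpy's `np.poly` and `np.roots`. A sketch with a hypothetical 2x2 matrix, not the one from the video:

```python
import numpy as np

# Hypothetical 2x2 matrix (not the one from the video)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly(A) returns the coefficients of det(A - L*I), highest power first
coeffs = np.poly(A)           # [1. -4.  3.]  ->  L^2 - 4L + 3
eigenvalues = np.roots(coeffs)
print(coeffs)
print(np.sort(eigenvalues))   # [1. 3.]
```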

#### Eigenvalue Computation #2

10/25/2016

Running Time: 6:14

This video works another simple example of computing the eigenvalues of a matrix, following the same procedure: compute the characteristic polynomial det(A-LI), then solve det(A-LI) = 0. The roots of this equation are, by definition, the eigenvalues of the matrix.

#### Eigenvalue Computation #3

10/25/2016

Running Time: 4:32

This video provides an example of computing the eigenvalues of an upper triangular matrix. Since the determinant of a triangular matrix is just the product of its diagonal elements, computing the characteristic polynomial and finding its roots for the 7x7 matrix in this problem is actually quite simple.
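The diagonal shortcut is easy to confirm numerically. A sketch using a hypothetical 3x3 upper-triangular matrix (the video's example is 7x7):

```python
import numpy as np

# Hypothetical upper-triangular matrix (the video works a 7x7 example)
A = np.array([[3.0, 1.0, 4.0],
              [0.0, 5.0, 9.0],
              [0.0, 0.0, 2.0]])

# For a triangular matrix the eigenvalues are just the diagonal entries,
# since det(A - L*I) is the product (3-L)(5-L)(2-L).
print(np.diag(A))                      # [3. 5. 2.]
print(np.sort(np.linalg.eigvals(A)))   # [2. 3. 5.] -- same values
```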

#### Computing a Matrix to a Power

10/25/2016

Running Time: 7:36

In this example we compute A^5, where A is a 2x2 matrix. Instead of performing a "brute force" computation, we use the matrix diagonalization A = PDP^-1, so that A^5 = PD^5P^-1. In general, diagonalization can greatly simplify computing large powers of a matrix, since raising the diagonal matrix D to a power only requires raising its diagonal entries to that power.
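A numpy sketch of the same idea, using a hypothetical diagonalizable 2x2 matrix rather than the video's:

```python
import numpy as np

# Hypothetical diagonalizable 2x2 matrix (not the one from the video)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Diagonalize: columns of P are eigenvectors, D holds the eigenvalues
vals, P = np.linalg.eig(A)

# A^5 = P D^5 P^-1; D^5 is just the eigenvalues raised to the 5th power
A5 = P @ np.diag(vals**5) @ np.linalg.inv(P)
print(np.round(A5))
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))  # True
```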

#### Diagonalizing a Matrix

10/25/2016

Running Time: 4:41

The previous video on this playlist used the matrix diagonalization A = PDP^-1 to simplify a computation. In this video we show how to diagonalize A by explicitly constructing the matrices P and D. If A has distinct eigenvalues, D is a diagonal matrix with the eigenvalues along its diagonal, while P has the corresponding eigenvectors as its columns.
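Constructing P and D and verifying the factorization takes only a few lines of numpy. A sketch with a hypothetical 2x2 matrix (not the video's):

```python
import numpy as np

# Hypothetical 2x2 matrix with distinct eigenvalues (not the video's matrix)
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

vals, vecs = np.linalg.eig(A)
D = np.diag(vals)   # eigenvalues on the diagonal
P = vecs            # eigenvectors as columns, in the same order as D

# Verify the diagonalization A = P D P^-1
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```

Note that the order matters: the i-th column of P must be an eigenvector for the i-th diagonal entry of D.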

#### The Eigenvalue Power Method Example #1

7/2/2018

Running Time: 9:00

The "power method" is a numerical algorithm for approximating the largest-magnitude eigenvalue of a matrix. The algorithm works best when the matrix has a "dominant" eigenvalue, i.e., one strictly larger in magnitude than all the others. After making an initial guess, the algorithm performs an iterative computation that produces a sequence of values converging to the dominant eigenvalue. The vector updated in each iteration of the algorithm also converges to the eigenvector associated with the dominant eigenvalue.

This video outlines the steps of this algorithm and then works two specific examples of the eigenvalue power method to demonstrate how the algorithm works.
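The iteration described above can be sketched in a few lines of numpy. This is a generic power-iteration sketch with a hypothetical test matrix, not the specific examples worked in the video:

```python
import numpy as np

def power_method(A, num_iters=50):
    """Approximate the dominant eigenvalue/eigenvector of A by power iteration."""
    x = np.ones(A.shape[0])           # initial guess
    for _ in range(num_iters):
        x = A @ x                     # multiply by A ...
        x = x / np.linalg.norm(x)     # ... and normalize to avoid overflow
    L = x @ A @ x                     # Rayleigh quotient: eigenvalue estimate
    return L, x

# Hypothetical matrix whose eigenvalues are 5 and 2, so 5 dominates
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
L, v = power_method(A)
print(L)                              # ~5.0, the dominant eigenvalue
print(np.allclose(A @ v, L * v))      # v is the associated eigenvector
```

The convergence rate depends on the ratio of the second-largest to largest eigenvalue magnitude (here 2/5), which is why a clearly dominant eigenvalue helps.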