Applied Linear Algebra: Matrices
We now examine matrices. We first examine the basics of matrix multiplication and the row-column rule. Then, we examine how to invert a matrix, compute the determinant of a matrix, and explore a variety of determinant properties. Finally, we discuss a way of solving systems of linear equations using Cramer's Rule.

Matrix Multiplication #1

3/30/2015

Running Time: 5:16

This video works a simple example where the 2x3 matrix A and the 3x2 matrix B are multiplied to yield the 2x2 matrix AB. This example uses the standard "row-column" rule for computing each entry of the matrix product.
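The row-column rule can be sketched in a few lines of NumPy. The matrices below are illustrative placeholders, not the entries used in the video:

```python
import numpy as np

# Hypothetical 2x3 and 3x2 matrices (illustrative values only).
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
B = np.array([[7., 8.],
              [9., 10.],
              [11., 12.]])

def matmul_row_column(A, B):
    """Multiply A (m x n) by B (n x p) using the row-column rule:
    entry (i, j) is the dot product of row i of A with column j of B."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must match"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = A[i, :] @ B[:, j]   # dot product of row i and column j
    return C
```

The result agrees entry-by-entry with NumPy's built-in `A @ B`.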

Matrix Multiplication #2

3/31/2015

Running Time: 4:40

We consider the same matrix multiplication problem as in the previous video, but we think about the row-column product rule for multiplying matrices in a slightly different way. Consider the mxn matrix A and the nxp matrix B. The matrix product AB is an mxp matrix given by:

AB = [Ab1 Ab2 ... Abp]

where b1, b2, ..., bp are the nx1 column vectors of the matrix B. To compute the matrix product AB, one simply has to compute p matrix-vector products to form each column of the final matrix product.
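The column-at-a-time view can be sketched directly: each column of AB is the matrix-vector product A @ b_j. The matrices here are illustrative, not the video's:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])        # m x n (illustrative values)
B = np.array([[7., 8.],
              [9., 10.],
              [11., 12.]])         # n x p

def matmul_by_columns(A, B):
    """Form AB one column at a time: column j of AB is the
    matrix-vector product A @ b_j, where b_j is column j of B."""
    cols = [A @ B[:, j] for j in range(B.shape[1])]
    return np.column_stack(cols)
```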

Matrix Inversion #1

4/1/2015

Running Time: 5:05

This video works an example of computing the inverse of a 2x2 matrix. We use a general approach that can be used to compute the inverse of any square matrix. We form the augmented matrix [A I], where I is the appropriately sized identity matrix. Then, we perform row operations to manipulate the augmented matrix into the form [I B], i.e. we want an identity matrix in the first half. By definition then, B = inv(A), the inverse that we are looking for.
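The [A I] row-reduction procedure can be sketched in NumPy. This version adds partial pivoting for numerical robustness, which the hand computation in the video does not need:

```python
import numpy as np

def invert_gauss_jordan(A):
    """Invert a square matrix by row-reducing [A | I] to [I | inv(A)]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])    # augmented matrix [A I]
    for col in range(n):
        # Swap in the largest-magnitude pivot (partial pivoting).
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                       # scale pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]      # clear the rest of the column
    return M[:, n:]                                 # right half is inv(A)
```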

Matrix Inversion #2

4/3/2015

Running Time: 4:35

This video works a simple example of computing the inverse of a 2x2 matrix. In the previous video we used a general algorithm for computing this inverse. The algorithm formed the augmented matrix [A I], then used row reduction to manipulate it into the form [I inv(A)]. In this video we work with the same matrix A, but we use a specific formula that directly computes the inverse of a 2x2 matrix.
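The standard closed-form 2x2 inverse can be sketched as follows; for A = [[a, b], [c, d]], the inverse is (1/(ad - bc)) * [[d, -b], [-c, a]]:

```python
import numpy as np

def inverse_2x2(A):
    """Closed-form inverse of a 2x2 matrix [[a, b], [c, d]]:
    inv(A) = (1 / (ad - bc)) * [[d, -b], [-c, a]]."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c          # the 2x2 determinant
    if np.isclose(det, 0.0):
        raise ValueError("matrix is singular")
    return np.array([[d, -b], [-c, a]]) / det
```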

Matrix Determinant Computation

4/6/2015

Running Time: 8:24

We compute the determinant of a 3x3 matrix using cofactor expansion about the first row (part a) and also the second column (part b). We use the general cofactor expansion equation for the matrix determinant. Since we're working with a 3x3 matrix in this example, the equation says our determinant can be written as a linear combination of weighted 2x2 determinants. The matrix determinant will be the same regardless of the row or column chosen for the expansion. However, the computation can sometimes be simplified by choosing a "nice" row or column. Often, the row or column with the most zeros is chosen since this eliminates many of the required computations (i.e. they're just zero). One just needs to be careful to get the alternating signs correct by carefully evaluating the (-1)^(i+j) term within the determinant equation.
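Cofactor expansion along a row can be sketched recursively; note the early skip of zero entries, which is exactly the "nice row" simplification described above:

```python
import numpy as np

def det_cofactor(A, row=0):
    """Determinant by cofactor expansion along a chosen row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        if A[row, j] == 0:
            continue                       # zero entries contribute nothing
        minor = np.delete(np.delete(A, row, axis=0), j, axis=1)
        total += (-1) ** (row + j) * A[row, j] * det_cofactor(minor)
    return total
```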

Matrix Determinant Properties Example #1

4/7/2015

Running Time: 5:53

Consider the square matrices A and B, where B can be obtained from A by interchanging rows of A. If det(A) is known, det(B) can be easily computed by counting the number of row interchanges. Each time a pair of rows is interchanged, the sign of det(A) changes. This video shows two computational examples that demonstrate this matrix determinant property. This video does not PROVE this general result, it just demonstrates the property.
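The sign flip is easy to check numerically. The matrix below is illustrative, not the one worked in the video:

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 4.],
              [0., 5., 6.]])       # illustrative matrix

B = A[[1, 0, 2], :]                # B: rows 0 and 1 of A interchanged
# One interchange flips the sign: det(B) = -det(A)
```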

Matrix Determinant Properties Example #2

6/10/2015

Running Time: 5:18

Consider the square matrices A and B, where B can be obtained from A by replacing one row of A with k times that row. If det(A) is known, det(B) can be easily computed by just multiplying by k, i.e. det(B) = k*det(A). Each time a row is multiplied by a constant k, the determinant changes by a factor of k. This video shows a computational example that demonstrates this matrix determinant property. This video does not PROVE this general result, it just demonstrates the property.
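The scaling property can be checked with a quick numerical demo (illustrative matrix, not the video's):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])           # illustrative values
k = 5.0
B = A.copy()
B[0] *= k                          # replace row 0 with k times row 0
# Scaling one row by k scales the determinant by k: det(B) = k * det(A)
```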

Matrix Determinant Properties Example #3

8/1/2015

Running Time: 3:34

Consider the square matrices A and B, where B is the same as A except one row has been replaced with the sum of that row and a multiple of another row. In this case, we have that det(A) = det(B). In general, adding a multiple of one row to another row does not change the value of the matrix determinant. This video does not prove this result, but demonstrates the property with a simple example.
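The row-replacement property can also be verified numerically (illustrative matrix, not the video's):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [3., 1., 4.],
              [0., 2., 5.]])       # illustrative values
B = A.copy()
B[2] = B[2] + 3.0 * B[0]           # replace row 2 with row 2 + 3*(row 0)
# This row-replacement operation preserves the determinant: det(B) = det(A)
```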

Matrix Determinant Computation #2 (4x4)

8/1/2015

Running Time: 6:24

We compute the determinant of a 4x4 matrix in this video. Based on the value of the determinant, we also determine whether the matrix is invertible. In general, a square matrix is invertible if its determinant is not equal to zero.
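The invertibility test described above can be sketched as a one-line check; the tolerance parameter is an implementation detail added here for floating-point safety, not part of the video:

```python
import numpy as np

def is_invertible(A, tol=1e-12):
    """A square matrix is invertible iff its determinant is nonzero.
    The tolerance guards against floating-point round-off."""
    return abs(np.linalg.det(A)) > tol
```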

Cramer's Rule Example 3x3

8/1/2015

Running Time: 5:36

We work with a system of 3 equations and 3 unknowns in this example and use Cramer's Rule to solve the system. Cramer's Rule can be used to individually compute the unknown variables one at a time by computing determinants and taking the appropriate determinant ratios.
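Cramer's Rule as described above can be sketched in NumPy: each unknown x_i is the ratio det(A_i)/det(A), where A_i is A with column i replaced by the right-hand side b:

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    det_A = np.linalg.det(A)
    if np.isclose(det_A, 0.0):
        raise ValueError("Cramer's Rule requires det(A) != 0")
    n = A.shape[0]
    x = np.empty(n)
    for i in range(n):
        Ai = A.copy()
        Ai[:, i] = b                   # replace column i with b
        x[i] = np.linalg.det(Ai) / det_A
    return x
```

The result matches a direct solve, though for large systems Cramer's Rule is far more expensive than Gaussian elimination.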

Adjugate Matrix Computation 3x3

8/1/2015

Running Time: 6:20

Given the square matrix A, the adjugate matrix (sometimes called the classical adjoint matrix) of A can be computed. We denote this matrix as adj(A). The entries of the adjugate matrix consist of the cofactors of the original matrix. Be careful with ordering the entries: the (i,j) entry of adj(A) is actually the (j,i) cofactor. In particular, this video example computes the adjugate matrix of a 3x3 matrix.
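The adjugate computation can be sketched by building the cofactor matrix and transposing it, which handles the (i,j)/(j,i) index swap noted above:

```python
import numpy as np

def adjugate(A):
    """Adjugate (classical adjoint): adj(A)[i, j] is the (j, i) cofactor."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # (i, j) cofactor: signed determinant of the minor.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T                         # transpose: (i, j) entry = (j, i) cofactor
```

A useful sanity check is the identity A @ adj(A) = det(A) * I.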

Area Of A Parallelogram Using Determinants

8/1/2015

Running Time: 5:09

We use determinants to compute the area of a parallelogram in this example video. We also verify that the determinant approach to computing area yields the same answer obtained using "conventional" area computations. While this particular example is for a 2-D parallelogram, the same concept applies to higher dimensions (e.g. 3-D parallelepipeds, etc.).
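The determinant-based area computation can be sketched as follows: place the two edge vectors as columns of a 2x2 matrix and take the absolute value of its determinant:

```python
import numpy as np

def parallelogram_area(u, v):
    """Area of the parallelogram spanned by 2-D vectors u and v:
    the absolute value of det([u v])."""
    return abs(np.linalg.det(np.column_stack([u, v])))
```

For example, the parallelogram spanned by (3, 0) and (1, 2) has base 3 and height 2, and the determinant gives the same area of 6.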
