Orthogonality and Least Squares Part 1:
We provide the definition of an inner product and examine its properties. Several examples of the dot product, the norm, and how to compute orthogonal vectors are presented, along with sets of orthogonal vectors and orthogonal projections.

Inner Product Introduction

4/15/2020

Running Time: 6:06

The inner product, defined on some vector space V, is a function that maps two vectors to a scalar. The inner product between vectors x and y is denoted ⟨x, y⟩. The inner product has four properties that we discuss. For the special case where the vector space is Rn, the inner product is the dot product. For this special case, the dot product between (column) vectors x and y is simply x^T y (i.e., the product of the transpose of x with y). We work an example of computing the dot product between two vectors.
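As a quick illustration (not from the video; the vectors are arbitrarily chosen), a minimal NumPy sketch of this computation:

```python
import numpy as np

# Example vectors in R^3 (illustrative values, not from the video).
x = np.array([1, 2, 3])
y = np.array([4, -1, 2])

# Dot product as x^T y: the sum of elementwise products.
print(np.dot(x, y))  # 1*4 + 2*(-1) + 3*2 = 8
```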

Dot Product Properties (with proofs)

4/16/2020

Running Time: 7:24

The dot product is a special case of an inner product, for the vector space Rn. As such, the dot product has all the properties of an inner product. We provide proofs of all four of these properties using direct proofs/algebraic manipulation.
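The proofs are algebraic, but a numeric spot check can build intuition. A hedged sketch (random vectors of my own choosing; the properties follow the usual symmetry/additivity/homogeneity/positivity form):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))  # three random vectors in R^4
c = 2.5

# Numeric spot checks of the four inner product properties
# (no substitute for the algebraic proofs in the video).
assert np.isclose(u @ v, v @ u)                # symmetry
assert np.isclose((u + v) @ w, u @ w + v @ w)  # additivity
assert np.isclose((c * u) @ v, c * (u @ v))    # homogeneity
assert u @ u >= 0                              # positivity
print("all four properties hold for this sample")
```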

The Norm of a Vector

4/15/2020

Running Time: 5:20

The norm of a vector provides a measure of vector size. For vectors in Rn, the norm is the length of the vector and is defined as the square root of the dot product of the vector with itself. We show how this norm definition works for vectors in Rn, and provide an example of computing the norm of a vector. An example of finding a unit vector in a desired direction is also provided. This establishes the useful fact that a vector divided by its norm is always a unit vector.
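A minimal sketch of both computations (the example vector is chosen for a clean answer):

```python
import numpy as np

v = np.array([3.0, 4.0])   # example vector (illustrative)

norm_v = np.sqrt(v @ v)    # square root of the dot product of v with itself
print(norm_v)              # 5.0, same as np.linalg.norm(v)

u = v / norm_v             # a vector divided by its norm is a unit vector
print(np.linalg.norm(u))   # 1.0
```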

The Distance Between Two Vectors

4/15/2020

Running Time: 4:42

The distance between vectors x and y is denoted d(x,y). This distance is just the norm of the vector x-y, i.e. ||x-y||. Distance is symmetric, d(x,y) = d(y,x), since ||x-y|| = ||y-x||. We provide these basic definitions and then work an example where the distance between two vectors is computed for vectors in R3.
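A short sketch of that computation in NumPy (the R3 vectors here are my own example):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])   # example vectors in R^3 (illustrative)
y = np.array([4.0, 6.0, 3.0])

d = np.linalg.norm(x - y)       # d(x, y) = ||x - y||
print(d)                        # 5.0
print(np.isclose(d, np.linalg.norm(y - x)))  # True: d(x, y) = d(y, x)
```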

Definition of Orthogonal Vectors

4/27/2020

Running Time: 7:22

This video provides a definition of orthogonal vectors. Two vectors in Rn are orthogonal if their dot product is zero. We provide a graphical depiction of orthogonality and why the dot product must be zero using our previous definitions of distance. We also work a few examples where we determine if two provided vectors are orthogonal (or not) by computing their dot product.
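A hedged sketch of this orthogonality test (the vectors are invented for illustration):

```python
import numpy as np

def is_orthogonal(u, v, tol=1e-12):
    """Two vectors in R^n are orthogonal iff their dot product is zero."""
    return abs(np.dot(u, v)) < tol

print(is_orthogonal(np.array([1, 2]), np.array([-2, 1])))  # True:  1*(-2) + 2*1 = 0
print(is_orthogonal(np.array([1, 2]), np.array([3, 1])))   # False: 1*3 + 2*1 = 5
```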

Dot Product and Norm Examples

4/28/2020

Running Time: 4:42

This video works several examples of computing norms and dot products. In the previous video, we showed that the norm squared of a sum of vectors can be written as ||u+v||^2 = ||u||^2 + ||v||^2 + 2(u·v). In this video, we work with specific values for u and v and compute the various norms and dot products in this expression to get practice doing these core computations. As a final computation, we check that the equation from the previous video holds true as expected.
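A minimal numeric check of that identity (u and v are placeholder values of my choosing):

```python
import numpy as np

u = np.array([2.0, -1.0, 3.0])  # example vectors (illustrative)
v = np.array([1.0, 4.0, 0.0])

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2 + 2 * np.dot(u, v)
print(np.isclose(lhs, rhs))     # True: the identity holds
```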

The Parallelogram Law

4/29/2020

Running Time: 4:54

The Parallelogram Law is a simple equation that relates the lengths of the sides of a parallelogram to the lengths of its diagonals: ||u+v||^2 + ||u-v||^2 = 2||u||^2 + 2||v||^2. We derive this simple result by performing two different computations: one involves the norm of a sum of vectors, and the other involves the norm of a difference of vectors.
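A numeric sanity check of the law (the sides u and v are chosen arbitrarily):

```python
import numpy as np

u = np.array([1.0, 2.0])   # sides of the parallelogram (example values)
v = np.array([3.0, -1.0])

diagonals = np.linalg.norm(u + v) ** 2 + np.linalg.norm(u - v) ** 2
sides = 2 * np.linalg.norm(u) ** 2 + 2 * np.linalg.norm(v) ** 2
print(np.isclose(diagonals, sides))  # True: ||u+v||^2 + ||u-v||^2 = 2||u||^2 + 2||v||^2
```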

Finding Orthogonal Vectors in R2

4/30/2020

Running Time: 4:32

We derive a simple equation and provide a few examples of how orthogonal vectors can be easily computed in R2. Given a vector v = [v1; v2] in R2, an orthogonal vector x can be constructed by simply interchanging the coordinates of v and then negating one of the values. So, x is orthogonal to v for x = [v2; -v1] or x = [-v2; v1]. These two choices point in opposite directions along the line of vectors orthogonal to v; any scalar multiple of either is also orthogonal to v.
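A sketch of the construction (the example v is mine):

```python
import numpy as np

def perp(v):
    """Interchange the coordinates of v in R^2 and negate one of them."""
    return np.array([v[1], -v[0]])

v = np.array([3.0, 4.0])   # example vector (illustrative)
x = perp(v)
print(x, np.dot(v, x))     # [ 4. -3.]  0.0 -> x is orthogonal to v
```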

Finding Orthogonal Vectors in R3

5/1/2020

Running Time: 6:27

We derive a simple equation and provide a few examples of how orthogonal vectors can be easily constructed in R3. Given a vector v = [v1; v2; v3] in R3, infinitely many orthogonal vectors can be constructed. We provide a general expression for these orthogonal vectors, involving two free variables, and also show a specific example.
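One way to realize that two-free-variable family in code (my own parameterization, assuming v1 is nonzero so we can solve for the first coordinate):

```python
import numpy as np

def orthogonal_in_r3(v, s, t):
    """Solve v . x = 0 with x2 = s and x3 = t as free variables.
    Assumes v[0] != 0 so that x1 can be solved for."""
    x1 = -(v[1] * s + v[2] * t) / v[0]
    return np.array([x1, s, t])

v = np.array([1.0, 2.0, 3.0])          # example vector (illustrative)
x = orthogonal_in_r3(v, s=1.0, t=1.0)
print(x, np.dot(v, x))                 # [-5.  1.  1.]  0.0
```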

Orthogonal Complements

5/4/2020

Running Time: 6:00

Consider a subspace W, and let z be a vector that is orthogonal to every element of W. In this case, we say that z is orthogonal to W. The set of all vectors that are orthogonal to W is called the orthogonal complement of W. If z is an element of the orthogonal complement of W, it is orthogonal to every vector in W. If W = span(v1, v2, ..., vr), then z is in the orthogonal complement of W if and only if it is orthogonal to each of v1, v2, ..., vr.
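One computational angle (not from the video; the spanning vectors are made-up values): if the rows of a matrix A are v1, ..., vr, the orthogonal complement of W is the null space of A, which can be found numerically via the SVD.

```python
import numpy as np

# Rows of A span W (example values).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Right singular vectors beyond the rank form a basis for the null space,
# i.e., for the orthogonal complement of W.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
Z = Vt[rank:].T                # columns form a basis for W-perp
print(np.allclose(A @ Z, 0))   # True: each basis vector is orthogonal to every row of A
```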

Orthogonal Sets of Vectors

5/5/2020

Running Time: 5:39

We've previously defined what it means for two vectors to be orthogonal. In this video we define an orthogonal set of vectors. This is simply a collection of vectors where each vector is orthogonal to every other vector in the set. We provide several examples of sets of vectors and determine whether or not they are orthogonal sets. Since these are vectors in R3, we simply compute the dot product of each pair of vectors and check whether every pairwise dot product is zero.
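That pairwise check, sketched in NumPy (the set S is an illustrative example of mine):

```python
import numpy as np
from itertools import combinations

def is_orthogonal_set(vectors, tol=1e-12):
    """An orthogonal set: every distinct pair has zero dot product."""
    return all(abs(np.dot(u, v)) < tol for u, v in combinations(vectors, 2))

S = [np.array([1.0, 1.0, 0.0]),   # example set in R^3 (illustrative)
     np.array([1.0, -1.0, 2.0]),
     np.array([1.0, -1.0, -1.0])]
print(is_orthogonal_set(S))       # True: all three pairwise dot products are zero
```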

Representing Vectors with an Orthogonal Basis

5/6/2020

Running Time: 6:00

We show an example of how to write a given vector as a linear combination of orthogonal basis vectors. The amount of each basis vector to include can be computed using a simple ratio of dot products. This computation tells us how much the given vector "projects" onto the basis vector. We'll formally define this projection operation in the next video. For now, we just show an example of how to perform the computation/projections.
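A sketch of that ratio-of-dot-products computation (the orthogonal basis and the vector y are made-up values):

```python
import numpy as np

# An orthogonal basis for R^3 (example values; pairwise dot products are zero).
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 2.0])
u3 = np.array([1.0, -1.0, -1.0])

y = np.array([2.0, 1.0, 3.0])   # vector to expand (illustrative)

# Weight on each basis vector: (y . u) / (u . u).
c = [np.dot(y, u) / np.dot(u, u) for u in (u1, u2, u3)]
print(np.allclose(c[0] * u1 + c[1] * u2 + c[2] * u3, y))  # True: the expansion recovers y
```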

Orthogonal Projection of Vectors

5/7/2020

Running Time: 8:38

This video defines what we mean by the orthogonal projection of a vector u onto some other vector y. This computation can be performed using dot products when working with vectors in Rn. The projection tells us how much of the vector u "lies along" the vector y. The projection operation is defined so that the resulting error vector is orthogonal to y (and hence to the projection). We draw several figures to get an intuitive feel for how this projection operation behaves, derive the equation for the projection operation that was used in the previous video, and also provide a full example.
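A minimal sketch of the projection formula (u and y are example values of mine):

```python
import numpy as np

def proj(u, y):
    """Orthogonal projection of u onto y: ((u . y) / (y . y)) * y."""
    return (np.dot(u, y) / np.dot(y, y)) * y

u = np.array([7.0, 6.0])   # example vectors (illustrative)
y = np.array([4.0, 2.0])

p = proj(u, y)
e = u - p                  # error vector
print(p)                   # [8. 4.]
print(np.dot(e, y))        # 0.0: the error is orthogonal to y
```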
