
Boot Camp in Linear Algebra

Joel Barajas, Karla L. Caballero
University of California

Silicon Valley Center

October 8th, 2008.

Matrices
A matrix is a rectangular array of numbers (also called scalars), written between square brackets, as in

$A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$ (a 2 × 3 matrix)

Vectors
A vector is defined as a matrix with only one column or row:
a row vector $[x_1 \; x_2 \; \cdots \; x_n]$, or a column vector $[x_1 \; x_2 \; \cdots \; x_n]^T$ (often simply called a vector).

Zero and identity matrices
The zero matrix (of size m × n) is the matrix with all entries equal to zero.
An identity matrix is always square; its diagonal entries are all equal to one and every other entry is zero. Identity matrices are denoted by the letter I.
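The slides name no software, but these objects are easy to build concretely; a minimal sketch assuming NumPy (a library choice not made in the slides):

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])       # a 2 x 3 matrix of scalars
    row = np.array([[1.0, 2.0, 3.0]])     # 1 x 3 row vector
    col = np.array([[1.0], [2.0], [3.0]]) # 3 x 1 column vector

    Z = np.zeros((2, 3))                  # 2 x 3 zero matrix
    I = np.eye(3)                         # 3 x 3 identity matrix I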

Vector Operations
The inner product (a.k.a. dot product or scalar product) of two vectors is defined by
$x^T y = \sum_{i=1}^{n} x_i y_i$
The magnitude of a vector is
$|x| = \sqrt{x^T x} = \sqrt{\sum_{i=1}^{n} x_i^2}$
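Both definitions in code, again as a NumPy sketch with illustrative vectors:

    import numpy as np

    x = np.array([1.0, 2.0, 2.0])
    y = np.array([3.0, 0.0, 4.0])

    inner = x @ y              # x^T y = 1*3 + 2*0 + 2*4 = 11
    mag = np.sqrt(x @ x)       # |x| = sqrt(1 + 4 + 4) = 3
    assert np.isclose(mag, np.linalg.norm(x))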

Vector Operations
The projection of vector y onto vector x is
$P = (u_x^T y)\, u_x$
where the vector $u_x = x / |x|$ has unit magnitude and the same direction as x.

Vector Operations
The angle between vectors x and y is given by
$\cos \theta = \frac{x^T y}{|x|\,|y|}$
Two vectors x and y are said to be
orthogonal if x^T y = 0
orthonormal if x^T y = 0 and |x| = |y| = 1
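A short NumPy sketch of the projection, angle, and orthogonality checks (the vectors are illustrative):

    import numpy as np

    x = np.array([2.0, 0.0])
    y = np.array([3.0, 4.0])

    u_x = x / np.linalg.norm(x)          # unit vector along x
    proj = (u_x @ y) * u_x               # projection of y onto x -> [3., 0.]

    cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    theta = np.arccos(cos_theta)         # angle between x and y (~0.927 rad)

    e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    assert e1 @ e2 == 0.0                # orthogonal; unit length -> orthonormal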

Vector Operations
A set of vectors x1, x2, …, xn is said to be linearly dependent if there exists a set of coefficients a1, a2, …, an (at least one different from zero) such that
$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$
A set of vectors x1, x2, …, xn is said to be linearly independent if
$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$ only when $a_1 = a_2 = \cdots = a_n = 0$
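One practical way to test this (an assumption; the slides prescribe no method) is to stack the vectors as matrix columns and compare the rank with n:

    import numpy as np

    x1 = np.array([1.0, 0.0, 1.0])
    x2 = np.array([0.0, 1.0, 1.0])
    x3 = np.array([1.0, 1.0, 2.0])       # x3 = x1 + x2 -> dependent set

    X = np.column_stack([x1, x2, x3])
    print(np.linalg.matrix_rank(X))      # 2 < 3, so a nontrivial combination
                                         # a1*x1 + a2*x2 + a3*x3 = 0 exists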

Matrix Operations: Matrix transpose
If A is an m × n matrix, its transpose, denoted A^T, is the n × m matrix given by $(A^T)_{ij} = A_{ji}$. For example,
$\begin{bmatrix} 0 & 4 \\ 7 & 0 \\ 3 & 1 \end{bmatrix}^T = \begin{bmatrix} 0 & 7 & 3 \\ 4 & 0 & 1 \end{bmatrix}$

Matrix Operations: Matrix addition
Two matrices of the same size can be added together, to form another matrix (of the same size), by adding the corresponding entries.

Matrix Operations: Scalar multiplication
The multiplication of a matrix by a scalar (i.e., a number) is done by multiplying every entry of the matrix by the scalar.

Matrix Operations: Matrix multiplication
You can multiply two matrices A and B provided their dimensions are compatible, which means the number of columns of A equals the number of rows of B. Suppose that A has size m × p and B has size p × n. The product matrix C = AB, which has size m × n, is defined by
$C_{ij} = \sum_{k=1}^{p} A_{ik} B_{kj}$
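The four operations in one NumPy sketch (shapes chosen for illustration):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])           # m x p = 3 x 2
    B = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])      # p x n = 2 x 3

    At = A.T          # transpose: (A^T)_ij = A_ji, now 2 x 3
    S = At + B        # addition: entrywise, shapes must match
    kA = 2.5 * A      # scalar multiplication: every entry scaled
    C = A @ B         # product C_ij = sum_k A_ik * B_kj, size 3 x 3
    assert C.shape == (3, 3)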

Matrix Operations
The trace of a square matrix A_{d×d} is the sum of its diagonal elements:
$\mathrm{tr}(A) = \sum_{i=1}^{d} A_{ii}$
The rank of a matrix is the number of linearly independent rows (or columns).
A square matrix is said to be non-singular if and only if its rank equals the number of rows (or columns). A non-singular matrix has a non-zero determinant.
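A quick NumPy illustration of trace, rank, and the rank/determinant test for singularity:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    print(np.trace(A))                 # 4.0: sum of diagonal elements
    print(np.linalg.matrix_rank(A))    # 2: full rank -> non-singular
    print(np.linalg.det(A))            # 3.0: non-zero, as expected

    S = np.array([[1.0, 2.0],
                  [2.0, 4.0]])         # second row = 2 * first row
    print(np.linalg.matrix_rank(S))    # 1: rank-deficient -> singular
    print(np.linalg.det(S))            # 0.0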

Matrix Operations
A square matrix A is said to be orthonormal if AA^T = A^T A = I.
For a square matrix A:
if x^T A x > 0 for all x ≠ 0, then A is said to be positive-definite (e.g., a non-singular covariance matrix)
if x^T A x ≥ 0 for all x, then A is said to be positive-semidefinite
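For symmetric matrices, positive-definiteness is equivalent to all eigenvalues being strictly positive; a NumPy sketch of that check (the equivalence holds for symmetric A):

    import numpy as np

    A = np.array([[ 2.0, -1.0],
                  [-1.0,  2.0]])
    print(np.all(np.linalg.eigvalsh(A) > 0))   # True (eigenvalues 1 and 3)

    x = np.array([1.0, -3.0])                  # spot check one x != 0
    print(x @ A @ x)                           # 26.0 > 0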

Matrix inverse
If A is square, and there is a matrix F such that FA = I, then we say that A is invertible or non-singular. We call F the inverse of A, and denote it A^{-1}. We can then also define A^{-k} = (A^{-1})^k. If a matrix is not invertible, we say it is singular or non-invertible.

$A_{n \times n} A^{-1}_{n \times n} = A^{-1}_{n \times n} A_{n \times n} = I$

$\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}^{-1} = \frac{1}{a_{11} a_{22} - a_{21} a_{12}} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$
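The 2 × 2 formula can be checked against a general-purpose inverse; a NumPy sketch with an arbitrary invertible matrix:

    import numpy as np

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])

    # The 2 x 2 closed form above, spelled out:
    det = A[0, 0] * A[1, 1] - A[1, 0] * A[0, 1]          # 24 - 14 = 10
    A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                    [-A[1, 0],  A[0, 0]]])

    assert np.allclose(A_inv, np.linalg.inv(A))
    assert np.allclose(A @ A_inv, np.eye(2))             # A A^{-1} = I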

Matrix Operations
The pseudo-inverse matrix A† is typically used whenever A^{-1} does not exist (because A is not square or A is singular). When A has full column rank it is given by
$A^\dagger = (A^T A)^{-1} A^T$, which satisfies $A^\dagger A = I$.
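A sketch of the full-column-rank case, checked against NumPy's SVD-based np.linalg.pinv:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])           # 3 x 2: not square, so no A^{-1}

    # Left pseudo-inverse for a full-column-rank A
    A_dag = np.linalg.inv(A.T @ A) @ A.T
    assert np.allclose(A_dag, np.linalg.pinv(A))   # matches the SVD-based pinv
    assert np.allclose(A_dag @ A, np.eye(2))       # A† A = I (but A A† is not I)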

Matrix Operations
The n-dimensional space in which all n-dimensional vectors reside is called a vector space.
A set of vectors {u1, u2, …, un} is said to form a basis for a vector space if any arbitrary vector x can be represented by a linear combination of the {ui}:
$x = a_1 u_1 + a_2 u_2 + \cdots + a_n u_n$

Matrix Operations
The coefficients {a1, a2, …, an} are called the components of vector x with respect to the basis {ui}.
In order to form a basis, it is necessary and sufficient that the {ui} be linearly independent.

Matrix Operations
A basis {ui} is said to be orthogonal if $u_i^T u_j = 0$ for all $i \neq j$.
A basis {ui} is said to be orthonormal if, in addition, $u_i^T u_i = 1$ for all i (that is, $u_i^T u_j = \delta_{ij}$).
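Finding the components of x with respect to a basis amounts to solving a linear system; a NumPy sketch with a small orthogonal basis:

    import numpy as np

    u1 = np.array([1.0, 1.0])
    u2 = np.array([1.0, -1.0])        # u1^T u2 = 0: an orthogonal basis
    U = np.column_stack([u1, u2])

    x = np.array([3.0, 1.0])
    a = np.linalg.solve(U, x)         # components of x w.r.t. {u_i} -> [2., 1.]
    assert np.allclose(a[0] * u1 + a[1] * u2, x)

    # For an orthonormal basis the components reduce to inner
    # products: a_i = u_i^T x.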

Linear Transformations
A linear transformation is a mapping from a vector space X^N onto a vector space Y^M, and is represented by a matrix. Given a vector x ∈ X^N, the corresponding vector y in Y^M is computed as
y = Ax, where A is an M × N matrix.
A linear transformation represented by a square matrix A is said to be orthonormal when AA^T = A^T A = I.
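A minimal sketch of both statements (NumPy assumed; the rotation matrix is an illustrative orthonormal A):

    import numpy as np

    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])    # maps X^3 to Y^2
    x = np.array([1.0, 1.0, 1.0])
    y = A @ x                          # y = Ax -> [3., 4.]

    # An orthonormal square matrix satisfies A A^T = A^T A = I
    # and therefore preserves vector lengths.
    t = np.pi / 4
    Q = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])   # 2-D rotation
    assert np.allclose(Q @ Q.T, np.eye(2))
    assert np.isclose(np.linalg.norm(Q @ y), np.linalg.norm(y))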

Eigenvectors and Eigenvalues
Let A be any square matrix. A scalar λ is called an eigenvalue of A if there exists a non-zero vector v such that
Av = λv
Any vector v satisfying this relation is called an eigenvector of A belonging to the eigenvalue λ.

How to compute the Eigenvalues and the Eigenvectors
• Find the characteristic polynomial Δ(t) of A.
• Find the roots of Δ(t) to obtain the eigenvalues of A.
• Repeat (a) and (b) for each eigenvalue λ of A:
  a. Form the matrix M = A - λI by subtracting λ down the diagonal of A.
  b. Find a basis for the solution space of the homogeneous system MX = 0. (These basis vectors are linearly independent eigenvectors of A belonging to λ.)
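In practice the whole procedure is a single library call; a NumPy sketch on a small illustrative matrix:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    eigvals, eigvecs = np.linalg.eig(A)      # columns of eigvecs are eigenvectors
    print(eigvals)                           # 3 and 1 (in some order)
    for lam, v in zip(eigvals, eigvecs.T):
        assert np.allclose(A @ v, lam * v)   # each pair satisfies Av = lambda*v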

Example
We have the matrix
$A = \begin{bmatrix} 4 & 2 \\ 3 & -1 \end{bmatrix}$
The characteristic polynomial Δ(t) of A is computed. We have
$\mathrm{tr}(A) = 4 + (-1) = 3$ and $\det(A) = (4)(-1) - (2)(3) = -10$, so
$\Delta(t) = t^2 - \mathrm{tr}(A)\, t + \det(A) = t^2 - 3t - 10 = (t - 5)(t + 2)$

Example
Set Δ(t) = (t - 5)(t + 2) = 0. The roots λ1 = 5 and λ2 = -2 are the eigenvalues of A.
We find an eigenvector v1 of A belonging to the eigenvalue λ1 = 5:

$M = A - \lambda_1 I = \begin{bmatrix} 4 & 2 \\ 3 & -1 \end{bmatrix} - \begin{bmatrix} 5 & 0 \\ 0 & 5 \end{bmatrix} = \begin{bmatrix} -1 & 2 \\ 3 & -6 \end{bmatrix}$

$MX = \begin{bmatrix} -1 & 2 \\ 3 & -6 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$

which reduces to the single equation -x + 2y = 0, so $v_1 = (2, 1)$.

Example
We find an eigenvector v2 of A belonging to the eigenvalue λ2 = -2:

$M = A - \lambda_2 I = \begin{bmatrix} 4 & 2 \\ 3 & -1 \end{bmatrix} + \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 6 & 2 \\ 3 & 1 \end{bmatrix}$

$MX = 0$ reduces to 3x + y = 0. The system has only one independent solution, so $v_2 = (-1, 3)$.
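The worked example is easy to verify numerically; a final NumPy sketch:

    import numpy as np

    A = np.array([[4.0, 2.0],
                  [3.0, -1.0]])
    v1 = np.array([2.0, 1.0])     # eigenvector for lambda_1 = 5
    v2 = np.array([-1.0, 3.0])    # eigenvector for lambda_2 = -2

    assert np.allclose(A @ v1, 5 * v1)    # A v1 = [10, 5] = 5 * v1
    assert np.allclose(A @ v2, -2 * v2)   # A v2 = [2, -6] = -2 * v2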