Linear Algebra Review (with a Small Dose of Optimization)

Hristo Paskov, CS246
Outline
• Basic definitions
• Subspaces and Dimensionality
• Matrix functions: inverses and eigenvalue decompositions
• Convex optimization
Vectors and Matrices
• Vector x ∈ ℝⁿ: an ordered list of n real numbers x₁, …, xₙ
• May also write x as a column, x = [x₁ … xₙ]ᵀ
Vectors and Matrices
• Matrix A ∈ ℝᵐˣⁿ: a rectangular array of real numbers with entries Aᵢⱼ
• Written in terms of rows or columns: A stacks its m rows on top of one another, or lines up its n columns side by side
Multiplication
• Vector-vector: the inner product xᵀy = Σᵢ xᵢyᵢ is a scalar
• Matrix-vector: Ax is a linear combination of the columns of A, with weights given by the entries of x
Multiplication
• Matrix-matrix: (AB)ᵢⱼ = Σₖ AᵢₖBₖⱼ, defined when the inner dimensions agree
(Diagram: a 5×3 matrix times a 3×4 matrix gives a 5×4 result.)
Multiplication
• Matrix-matrix: entry (i, j) of AB is the inner product of row i of A with column j of B
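The entrywise rule above can be sketched in plain Python (no libraries; a minimal illustration, not production code):

```python
def matmul(A, B):
    """Multiply matrices stored as lists of rows: (AB)[i][j] = sum_k A[i][k] * B[k][j]."""
    m, n = len(A), len(A[0])
    n2, p = len(B), len(B[0])
    assert n == n2, "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 0], [2, 1]]
B = [[3, 1], [1, 2]]
print(matmul(A, B))  # [[3, 1], [7, 4]]
```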
Multiplication Properties
• Associative: (AB)C = A(BC)
• Distributive: A(B + C) = AB + AC
• NOT commutative: in general AB ≠ BA
– The dimensions may not even be conformable (AB may exist while BA does not)
Useful Matrices
• Identity matrix I: ones on the diagonal, zeros elsewhere; AI = IA = A
• Diagonal matrix D = diag(d₁, …, dₙ): nonzero entries only on the diagonal
Useful Matrices
• Symmetric: A = Aᵀ
• Orthogonal: QᵀQ = QQᵀ = I
– Columns/rows are orthonormal
• Positive semidefinite A: xᵀAx ≥ 0 for all x
– Equivalently, there exists a matrix B such that A = BᵀB
Outline
• Basic definitions
• Subspaces and Dimensionality
• Matrix functions: inverses and eigenvalue decompositions
• Convex optimization
Norms
• Quantify the “size” of a vector
• Given x ∈ ℝⁿ, a norm ‖·‖ satisfies
1. ‖x‖ ≥ 0, with ‖x‖ = 0 iff x = 0
2. ‖αx‖ = |α| ‖x‖
3. ‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
• Common norms:
1. Euclidean 2-norm: ‖x‖₂ = sqrt(Σᵢ xᵢ²)
2. 1-norm: ‖x‖₁ = Σᵢ |xᵢ|
3. ∞-norm: ‖x‖∞ = maxᵢ |xᵢ|
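The three common norms above translate directly into plain Python (a minimal sketch using only the standard library):

```python
import math

def norm_1(x):
    """1-norm: sum of absolute values."""
    return sum(abs(v) for v in x)

def norm_2(x):
    """Euclidean 2-norm: square root of the sum of squares."""
    return math.sqrt(sum(v * v for v in x))

def norm_inf(x):
    """Infinity-norm: largest absolute entry."""
    return max(abs(v) for v in x)

x = [3.0, -4.0]
print(norm_1(x), norm_2(x), norm_inf(x))  # 7.0 5.0 4.0
```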
Linear Subspaces
• A subspace S ⊆ ℝⁿ satisfies
1. 0 ∈ S
2. If x, y ∈ S and α, β ∈ ℝ, then αx + βy ∈ S
• Vectors v₁, …, vₖ span S if every element of S is a linear combination of them
Linear Independence and Dimension
• Vectors v₁, …, vₖ are linearly independent if Σᵢ αᵢvᵢ = 0 implies α₁ = ⋯ = αₖ = 0
– Every linear combination of the vᵢ is unique
• dim(S) = k if v₁, …, vₖ span S and are linearly independent
– If w₁, …, w_m span S, then m ≥ k
• If m > dim(S), then w₁, …, w_m are NOT linearly independent
Matrix Subspaces
• A matrix A ∈ ℝᵐˣⁿ defines two subspaces
– Column space: the span of its columns, a subspace of ℝᵐ
– Row space: the span of its rows, a subspace of ℝⁿ
• Nullspace of A: {x : Ax = 0}, the vectors orthogonal to the row space
– The left nullspace {y : Aᵀy = 0} is the analog for the column space
Matrix Rank
• rank(A) gives the dimensionality of the row and column spaces (they are equal)
• If A ∈ ℝᵐˣⁿ has rank k, it can be decomposed into the product of an m×k and a k×n matrix
(Diagram: the m×n matrix equals an m×k matrix times a k×n matrix.)
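A minimal plain-Python illustration of this factorization for k = 1: the outer product of two vectors always has rank 1, since every row is a multiple of the same row vector.

```python
def outer(u, v):
    """Rank-1 matrix u v^T: the (i, j) entry is u[i] * v[j]."""
    return [[ui * vj for vj in v] for ui in u]

# A 3x3 matrix of rank 1: every row is a multiple of [4, 5, 6].
A = outer([1, 2, 3], [4, 5, 6])
print(A)  # [[4, 5, 6], [8, 10, 12], [12, 15, 18]]
```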
Properties of Rank
• For A ∈ ℝᵐˣⁿ: rank(A) ≤ min(m, n)
• A has full rank if rank(A) = min(m, n)
• If m > n, the rows are not linearly independent
– Same for the columns if n > m
Outline
• Basic definitions
• Subspaces and Dimensionality
• Matrix functions: inverses and eigenvalue decompositions
• Convex optimization
Matrix Inverse
• A ∈ ℝⁿˣⁿ is invertible iff rank(A) = n
• The inverse A⁻¹ is unique and satisfies
1. A⁻¹A = AA⁻¹ = I
2. (A⁻¹)⁻¹ = A
3. (Aᵀ)⁻¹ = (A⁻¹)ᵀ
4. If B is invertible, then AB is invertible and (AB)⁻¹ = B⁻¹A⁻¹
Systems of Equations
• Given A ∈ ℝᵐˣⁿ and b ∈ ℝᵐ, wish to solve Ax = b
– A solution exists only if b lies in the column space of A
– Possibly an infinite number of solutions
• If A is invertible, then x = A⁻¹b
– This is a notational device; do not actually invert matrices
– Computationally, use solving routines like Gaussian elimination
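As a sketch of such a solving routine, here is Gaussian elimination in plain Python, specialized to a 2×2 system and assuming the matrix is invertible (an illustration only, not a general solver):

```python
def solve2(A, b):
    """Solve a 2x2 system A x = b by Gaussian elimination with partial pivoting."""
    # Build the augmented matrix [A | b], copying so the inputs stay intact.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    # Pivot: put the larger first-column entry on top for numerical stability.
    if abs(M[1][0]) > abs(M[0][0]):
        M[0], M[1] = M[1], M[0]
    # Eliminate the entry below the pivot.
    f = M[1][0] / M[0][0]
    M[1] = [m1 - f * m0 for m0, m1 in zip(M[0], M[1])]
    # Back-substitute.
    x1 = M[1][2] / M[1][1]
    x0 = (M[0][2] - M[0][1] * x1) / M[0][0]
    return [x0, x1]

print(solve2([[1.0, 0.0], [2.0, 1.0]], [0.5, -1.5]))  # [0.5, -2.5]
```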
Systems of Equations
• What if b is not in the column space of A?
• Find the x that makes Ax closest to b
– Ax is then the projection of b onto the column space of A
– Also known as (least-squares) regression
• Assume A has full column rank, so AᵀA is invertible:
x = (AᵀA)⁻¹Aᵀb, and the projection is Ax = A(AᵀA)⁻¹Aᵀb
– (AᵀA)⁻¹ is the invertible factor; A(AᵀA)⁻¹Aᵀ is the projection matrix
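A plain-Python sketch of the least-squares solution via the normal equations, specialized to a two-column A (an illustration only: for real problems prefer a QR-based solver, since forming AᵀA squares the condition number):

```python
def lstsq2(A, b):
    """Least-squares x for a tall matrix A with 2 columns, via (A^T A) x = A^T b."""
    c0 = [row[0] for row in A]
    c1 = [row[1] for row in A]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    # Normal equations: [[c0.c0, c0.c1], [c0.c1, c1.c1]] x = [c0.b, c1.b]
    g00, g01, g11 = dot(c0, c0), dot(c0, c1), dot(c1, c1)
    r0, r1 = dot(c0, b), dot(c1, b)
    det = g00 * g11 - g01 * g01  # nonzero when the columns are independent
    return [(r0 * g11 - r1 * g01) / det, (g00 * r1 - g01 * r0) / det]

# Fit a line y = x0 + x1*t through three points; the columns of A are [1, t].
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [0.0, 1.0, 2.0]
print(lstsq2(A, b))  # [0.0, 1.0]: the exact line y = t
```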
Systems of Equations
[1 0; 2 1] · [0.5; −2.5] = [0.5; −1.5]
[−1 1; 1 1] · [−1; −0.5] = [0.5; −1.5]
Eigenvalue Decomposition
• The eigenvalue decomposition of a symmetric matrix A is A = QΛQᵀ
– The diagonal matrix Λ contains the eigenvalues of A
– Q is orthogonal and its columns are the eigenvectors of A
• If A is not symmetric but diagonalizable, A = PΛP⁻¹
– Λ is diagonal but possibly complex
– P is not necessarily orthogonal
Characterizations of Eigenvalues
• Traditional formulation: Av = λv with v ≠ 0
– Leads to the characteristic polynomial det(A − λI) = 0
• Rayleigh quotient (symmetric A): the largest eigenvalue is max over x ≠ 0 of xᵀAx / xᵀx
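The Rayleigh-quotient view suggests power iteration: repeatedly apply A and renormalize, then read off the eigenvalue as xᵀAx with ‖x‖ = 1. A plain-Python sketch for a small symmetric matrix (an illustration, not a robust eigensolver):

```python
import math

def power_iteration(A, steps=100):
    """Estimate the dominant eigenvalue/eigenvector of a symmetric matrix."""
    x = [1.0] * len(A)
    for _ in range(steps):
        x = [sum(a * v for a, v in zip(row, x)) for row in A]  # x <- A x
        norm = math.sqrt(sum(v * v for v in x))
        x = [v / norm for v in x]                              # renormalize
    Ax = [sum(a * v for a, v in zip(row, x)) for row in A]
    lam = sum(v * w for v, w in zip(x, Ax))  # Rayleigh quotient with ||x|| = 1
    return lam, x

# Symmetric matrix with eigenvalues 3 and 1 (eigenvectors along [1,1] and [1,-1]).
lam, x = power_iteration([[2.0, 1.0], [1.0, 2.0]])
print(round(lam, 6))  # 3.0
```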
Eigenvalue Properties
• For A ∈ ℝⁿˣⁿ with eigenvalues λ₁, …, λₙ
1. tr(A) = Σᵢ λᵢ
2. det(A) = Πᵢ λᵢ
3. rank(A) = number of nonzero eigenvalues
• When A is symmetric
– The eigenvalue decomposition coincides with the singular value decomposition (up to the signs of the eigenvalues)
– Eigenvectors for nonzero eigenvalues give an orthogonal basis for the row/column space
Simple Eigenvalue Proof
• Why does det(A) = 0 when A is not full rank?
• Assume A is symmetric
1. Decompose A = QΛQᵀ with QQᵀ = I
2. rank(A) equals the number of nonzero eigenvalues
3. If rank(A) < n, some eigenvalue of A is 0
4. Since det(A) is the product of the eigenvalues, one of the terms is 0, so the product is 0
Outline
• Basic definitions
• Subspaces and Dimensionality
• Matrix functions: inverses and eigenvalue decompositions
• Convex optimization
Convex Optimization

• Find the minimum of a function subject to constraints on the solution
• Business/economics/game theory
– Resource allocation
– Optimal planning and strategies
• Statistics and machine learning
– All forms of regression and classification
– Unsupervised learning
• Control theory
– Keeping planes in the air!
Convex Sets
• A set C is convex if for all x, y ∈ C and 0 ≤ θ ≤ 1, θx + (1 − θ)y ∈ C
– The line segment between any two points in C also lies in C
• Examples
– Intersection of halfspaces
– Norm balls
– Intersection of convex sets
Convex Functions
• A real-valued function f is convex if its domain is convex and, for all x, y and 0 ≤ θ ≤ 1,
f(θx + (1 − θ)y) ≤ θ f(x) + (1 − θ) f(y)
– The graph of f is upper-bounded by the line segment between the points (x, f(x)) and (y, f(y)) on the graph
Gradients
• Differentiable convex f with x in its domain
• The gradient ∇f(x) at x gives a linear approximation that lower-bounds f:
f(x + w) ≥ f(x) + wᵀ∇f(x)
Gradient Descent
• To minimize f, move down the gradient
– But not too far!
– At the optimum, ∇f(x) = 0
• Given f, learning rate η, starting point x⁰:
Do until ∇f(x) ≈ 0:
x ← x − η ∇f(x)
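The loop above can be sketched in plain Python for a one-dimensional example (an illustration only; the function and step size here are chosen for simplicity):

```python
def gradient_descent(grad, x0, eta=0.1, tol=1e-8, max_iter=10000):
    """Minimize a differentiable function by stepping against its gradient
    until the gradient becomes negligible."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x = x - eta * g
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is x = 3.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_star, 6))  # 3.0
```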
Stochastic Gradient Descent
• Many learning problems have extra structure: the objective is a sum over training examples, f(x) = Σᵢ fᵢ(x)
• Computing the gradient requires iterating over all points, which can be too costly
• Instead, compute the gradient at a single training example at a time
Stochastic Gradient Descent
• Given f = Σᵢ fᵢ, learning rate η, starting point x⁰:
Do until nearly optimal:
For i in random order:
x ← x − η ∇fᵢ(x)
• Finds a nearly optimal x
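A plain-Python sketch of this loop on a toy objective, f(x) = Σᵢ (x − yᵢ)²/2, whose minimizer is the data mean; the decaying step size is an assumption added here so the iterates settle down (the slides use a fixed learning rate):

```python
import random

def sgd(grad_i, data, x0, eta0=0.5, epochs=200, seed=0):
    """Stochastic gradient descent: one gradient step per training example,
    visiting the examples in random order each epoch."""
    rng = random.Random(seed)
    x = x0
    for t in range(epochs):
        eta = eta0 / (1 + t)  # shrink the step so the iterates converge
        order = list(range(len(data)))
        rng.shuffle(order)
        for i in order:
            x = x - eta * grad_i(x, data[i])
    return x

# Each per-example gradient of (x - y_i)^2 / 2 is simply (x - y_i).
ys = [1.0, 2.0, 3.0, 4.0]
x_hat = sgd(lambda x, y: x - y, ys, x0=0.0)
print(x_hat)  # close to 2.5, the mean of ys
```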
Minimize
Learning Parameter