
Transcript of "Machine Learning and Pattern Recognition: A High Level Overview" (68 slides)

Page 1:

Machine Learning and Pattern Recognition

A High Level Overview

Prof. Anderson Rocha (main bulk of slides kindly provided by Prof. Sandra Avila)

Institute of Computing (IC/Unicamp)

MC886/MO444

RECOD: REasoning for COmplex Data

Page 2:

Why is Dimensionality Reduction useful?

● Data Compression

○ Reduce time complexity: less computation required

○ Reduce space complexity: fewer features to store

○ More interpretable: it removes noise

● Data Visualization

● To mitigate “the curse of dimensionality”

Page 3:

The Curse of Dimensionality

Page 4:

The Curse of Dimensionality

Even a basic 4D hypercube is incredibly hard to picture in our mind.

Page 5:

The Curse of Dimensionality

Page 6:

The Curse of Dimensionality

As the dimensionality of data grows, the density of observations becomes lower and lower and lower.

10 images, 1 dimension: 5 regions

Page 7:

The Curse of Dimensionality

As the dimensionality of data grows, the density of observations becomes lower and lower and lower.

10 images, 2 dimensions: 25 regions

Page 8:

The Curse of Dimensionality

As the dimensionality of data grows, the density of observations becomes lower and lower and lower.

10 images, 3 dimensions: 125 regions

Page 9:

The Curse of Dimensionality

● 1 dimension: the sample density is 10/5 = 2 samples/region

● 2 dimensions: the sample density is 10/25 = 0.4 samples/region

● 3 dimensions: the sample density is 10/125 = 0.08 samples/region
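To make the density arithmetic concrete, a minimal Python sketch (the 10 samples and 5 regions per axis come from the slides above):

    # Sample density as dimensionality grows: 10 samples,
    # 5 regions per axis, 5^d regions in d dimensions.
    m, regions_per_axis = 10, 5
    for d in (1, 2, 3):
        regions = regions_per_axis ** d                     # 5, 25, 125
        print(d, "dims:", m / regions, "samples/region")    # 2.0, 0.4, 0.08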

Page 10:

The Curse of Dimensionality: Solution?

● Increase the size of the training set to reach a sufficient density of training instances.

● Unfortunately, the number of training instances required to reach a given density grows exponentially with the number of dimensions.

Page 11:

How to reduce dimensionality?

● Feature Extraction: create a smaller set of new features by combining the existing ones.

○ z = f(x1, x2, x3, x4, x5)

● Feature Selection: choose a subset of the existing features (the most informative ones), as sketched below.

○ x1, x2, x3, x4, x5
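A hedged scikit-learn sketch of the two routes (the Iris data and k = 2 are arbitrary choices for illustration, not from the slides):

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest, f_classif

    X, y = load_iris(return_X_y=True)          # 4 original features

    # Feature extraction: each new feature z is a combination of x1..x4
    Z = PCA(n_components=2).fit_transform(X)

    # Feature selection: keep the 2 most informative original features
    X_sel = SelectKBest(f_classif, k=2).fit_transform(X, y)

    print(Z.shape, X_sel.shape)                # (150, 2) (150, 2)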

Page 12:

PCA: Principal Component Analysis

Page 13:

Principal Component Analysis (PCA)

● The most popular dimensionality reduction algorithm.

● PCA has two steps:

○ It identifies the hyperplane that lies closest to the data.

○ It projects the data onto that hyperplane.
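In scikit-learn (the library of the book cited in these slides), both steps are wrapped in one estimator; a minimal sketch on synthetic data:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))

    pca = PCA(n_components=2)
    Z = pca.fit_transform(X)    # finds the closest plane and projects onto it
    print(Z.shape)              # (100, 2)
    print(pca.explained_variance_ratio_)   # variance kept by each component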

Page 14:

Problem Formulation (PCA)

[Figure: data points (×) scattered in the (x1, x2) plane]

Page 15:

Problem Formulation (PCA)

[Figure: the same scatter with a candidate projection line; the distances from the points (×) to the line are the projection error]

Pages 16-19:

Problem Formulation (PCA)

[Figure builds: the data points (×) are projected onto a candidate line; the projected points appear as × marks along the line]

Page 20:

Preserving the Variance

[Figure: two candidate projection directions, u(1) and u(2)]

Page 21:

PCA Algorithm by Eigen Decomposition

Page 22:

Data Preprocessing

Training set: x(1), x(2), ..., x(m)

Preprocessing (feature scaling/mean normalization): compute the mean of each feature, μj = (1/m) Σi xj(i), and replace each xj(i) with xj(i) − μj.

This centers the data.

Page 23:

Data Preprocessing

Training set: x(1), x(2), ..., x(m)

Preprocessing (feature scaling/mean normalization): replace each xj(i) with xj(i) − μj.

If different features are on different scales, also scale the features to have comparable ranges of values.

This centers the data.
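A minimal NumPy sketch of this preprocessing (the toy feature matrix is invented for illustration):

    import numpy as np

    # X: m examples as rows, n features as columns (e.g. height cm, weight kg)
    X = np.array([[180.0, 70.0],
                  [160.0, 60.0],
                  [170.0, 65.0]])

    mu = X.mean(axis=0)             # mu_j = (1/m) * sum_i x_j(i)
    X_centered = X - mu             # replace x_j(i) with x_j(i) - mu_j

    # Features on different scales: also divide by the standard deviation
    X_scaled = X_centered / X.std(axis=0)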

Page 24:

Data Preprocessing

Credit: http://cs231n.github.io/neural-networks-2/

Page 25:

PCA Algorithm

Reduce data from n-dimensions to k-dimensions

Compute the “covariance matrix” (an n ⨉ n matrix):

𝚺 = (1/m) Σi=1..m x(i) (x(i))T

Page 26:

PCA Algorithm

Reduce data from n-dimensions to k-dimensions

Compute the “covariance matrix” (an n ⨉ n matrix):

𝚺 = (1/m) Σi=1..m x(i) (x(i))T

Covariance of dimensions x1 and x2:

● Do x1 and x2 tend to increase together?

● Or does x2 decrease as x1 increases?

[Figure: 2 ⨉ 2 covariance matrix with rows and columns labeled x1, x2]
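A minimal NumPy sketch of the covariance computation (synthetic data, centered first as the preprocessing slides require):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))       # m = 100 examples, n = 2 features
    X = X - X.mean(axis=0)              # center the data

    Sigma = (X.T @ X) / len(X)          # (1/m) sum_i x(i) x(i)^T, n x n
    print(Sigma.shape)                  # (2, 2)
    print(Sigma[0, 1])                  # off-diagonal: do x1, x2 move together?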

Page 27:

PCA Algorithm

[Figure: scatter of points (×) in the (x1, x2) plane with directions u1 and u2]

Multiply a vector by 𝚺: [the resulting vector is drawn in the figure]

Pages 28-31: (verbatim builds of the previous slide's figure)

Page 32:

PCA Algorithm

[Figure: the same scatter; the vector multiplied by 𝚺 has turned]

Multiplying a vector by 𝚺 turns it towards the direction of variation.

Page 33:

PCA Algorithm

[Figure: scatter with eigen-directions u1 and u2]

Want vectors u which aren't turned: 𝚺u = 𝜆u

u = eigenvectors of 𝚺, 𝜆 = eigenvalues

Page 34:

PCA Algorithm

[Figure: scatter with eigen-directions u1 and u2]

Want vectors u which aren't turned: 𝚺u = 𝜆u

u = eigenvectors of 𝚺, 𝜆 = eigenvalues

Principal components = eigenvectors with the largest eigenvalues
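A minimal NumPy sketch of this eigen step (np.linalg.eigh suits the symmetric 𝚺; the data is synthetic):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                              [1.0, 0.5]])   # stretched cloud
    X = X - X.mean(axis=0)
    Sigma = (X.T @ X) / len(X)

    # Solutions of Sigma u = lambda u with ||u|| = 1
    eigvals, eigvecs = np.linalg.eigh(Sigma)

    # eigh returns eigenvalues in ascending order, so the last column
    # (largest eigenvalue) is the first principal component
    u1 = eigvecs[:, -1]
    print(eigvals, u1)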

Page 35:

Finding Principal Components

1. Find eigenvalues by solving: det(𝚺 - 𝜆I) = 0

Pages 36-37: (builds of the same slide)

Page 38:

Finding Principal Components

2. Find the ith eigenvector by solving: 𝚺ui = 𝜆iui

Want ||u1|| = 1 (each eigenvector is normalized to unit length)

Pages 39-44: (builds of the same slide)

Page 45:

Finding Principal Components

2. Find the ith eigenvector by solving: 𝚺ui = 𝜆iui, with ||u1|| = 1

3. The 1st PC is the eigenvector with the largest eigenvalue; the 2nd PC is the eigenvector with the second largest.

Page 46:

How many PCs?

● Have eigenvectors u1, u2, …, un, want k < n

● eigenvalue 𝜆i = variance along ui

Page 47:

How many PCs?

● Have eigenvectors u1, u2, …, un, want k < n

● eigenvalue 𝜆i = variance along ui

● Pick ui that explain the most variance:

○ Sort eigenvectors s.t. 𝜆1 > 𝜆2 > 𝜆3 > … > 𝜆n

○ Pick first k eigenvectors which explain 95% of total variance

Pages 48-49:

How many PCs?

● Have eigenvectors u1, u2, …, un, want k < n

● Eigenvalue 𝜆i = variance along ui

● Pick the ui that explain the most variance:

○ Sort eigenvectors s.t. 𝜆1 > 𝜆2 > 𝜆3 > … > 𝜆n

○ Pick the first k eigenvectors that explain 95% of the total variance:

(𝜆1 + 𝜆2 + … + 𝜆k) / (𝜆1 + 𝜆2 + … + 𝜆n) ≥ 0.95

■ Typical thresholds: 90%, 95%, 99%
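A minimal sketch of this rule (the eigenvalues are invented for illustration):

    import numpy as np

    lam = np.array([6.0, 2.5, 0.8, 0.5, 0.2])   # sorted: lambda_1 > ... > lambda_n
    explained = np.cumsum(lam) / lam.sum()      # [0.6, 0.85, 0.93, 0.98, 1.0]
    k = int(np.argmax(explained >= 0.95) + 1)   # smallest k reaching 95%
    print(k)                                    # 4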

Page 50:

PCA in a Nutshell (Eigen Decomposition)

1. Center the data (and normalize)

2. Compute covariance matrix 𝚺

3. Find eigenvectors u and eigenvalues 𝜆

4. Sort the eigenvectors by eigenvalue and keep the first k

5. Project the data onto those k eigenvectors (a sketch follows)
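Putting the five steps together, a minimal end-to-end NumPy sketch (synthetic data; k = 1 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(300, 3))
    X = X - X.mean(axis=0)                      # 1. center the data

    Sigma = (X.T @ X) / len(X)                  # 2. covariance matrix
    eigvals, eigvecs = np.linalg.eigh(Sigma)    # 3. eigenvectors/eigenvalues

    order = np.argsort(eigvals)[::-1]           # 4. sort by descending eigenvalue
    k = 1
    U_k = eigvecs[:, order[:k]]                 #    and keep the first k

    Z = X @ U_k                                 # 5. project onto the k eigenvectors
    print(Z.shape)                              # (300, 1)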

Page 51:

PCA Algorithm by Singular Value Decomposition

Page 52:

Data Preprocessing

Training set: x(1), x(2), ..., x(m)

Preprocessing (feature scaling/mean normalization): replace each xj(i) with xj(i) − μj.

If different features are on different scales, also scale the features to have comparable ranges of values.

This centers the data.

Page 53:

PCA Algorithm

Reduce data from n-dimensions to k-dimensions

Compute the “covariance matrix” (an n ⨉ n matrix):

𝚺 = (1/m) Σi=1..m x(i) (x(i))T

Compute the “eigenvectors” of matrix 𝚺 via Singular Value Decomposition:

[U, S, V] = svd(sigma)

Page 54: (repeat of the previous slide)

Page 55:

PCA Algorithm

From [U, S, V] = svd(sigma), we get:

U = [u(1) u(2) ⋯ u(n)] ∈ ℝn⨉n, whose columns are the eigenvectors of 𝚺

Page 56:

PCA Algorithm

From [U, S, V] = svd(sigma), we get U ∈ ℝn⨉n.

Keep the first k columns of U to map x ∈ ℝn → z ∈ ℝk.

Page 57:

PCA Algorithm

From [U, S, V] = svd(sigma), we get U ∈ ℝn⨉n. With Ureduce = the first k columns of U:

z = (Ureduce)T x

where (Ureduce)T is k ⨉ n and x is n ⨉ 1, so x ∈ ℝn → z ∈ ℝk.

Page 58:

PCA Algorithm

After mean normalization and optionally feature scaling:

[U, S, V] = svd(sigma)

Ureduce = first k columns of U

z = (Ureduce)T x
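A minimal NumPy sketch of this pipeline (np.linalg.svd plays the role of svd(sigma); the data and k = 2 are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(100, 4))
    X = X - X.mean(axis=0)                  # mean normalization
    Sigma = (X.T @ X) / len(X)

    U, S, Vt = np.linalg.svd(Sigma)         # [U, S, V] = svd(sigma)
    k = 2
    U_reduce = U[:, :k]                     # first k columns of U

    x = X[0]                                # one example, shape (n,)
    z = U_reduce.T @ x                      # z = (Ureduce)^T x, shape (k,)
    print(z.shape)                          # (2,)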

Page 59:

Choosing the Number of Principal Components

Page 60:

Choosing k (# Principal Components)

[U, S, V] = svd(sigma)

(The diagonal entries Sii of S are the eigenvalues 𝜆i of 𝚺, so the variance-retained criterion from Pages 48-49 can be evaluated directly from S.)

Page 61:

Choosing k (# Principal Components)

[U, S, V] = svd(sigma)

xapprox = (Ureduce) z(i)

where Ureduce is n ⨉ k and z(i) is k ⨉ 1, so xapprox ∈ ℝn.
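A self-contained sketch of the reconstruction (same synthetic setup as the sketch on Page 58):

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(100, 4))
    X = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd((X.T @ X) / len(X))

    k = 2
    U_reduce = U[:, :k]                     # n x k
    z = U_reduce.T @ X[0]                   # compressed representation, k x 1
    x_approx = U_reduce @ z                 # back to n x 1 (lossy when k < n)

    print(np.linalg.norm(X[0] - x_approx))  # reconstruction error for this x
    print(S[:k].sum() / S.sum())            # fraction of variance retained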

Page 62:

Practical Issues

Page 63:

PCA: Practical Issues

● PCA assumes the underlying subspace is linear

○ PCA cannot find a curve

Page 64:

PCA: Practical Issues

● PCA assumes the underlying subspace is linear

○ PCA cannot find a curve

● PCA and Classification

○ PCA is unsupervised

○ PCA can pick a direction that makes it hard to separate the classes

Page 65:

Linear Discriminant Analysis (LDA)

● LDA picks a new direction that gives:

○ Maximum separation between the means of the projected classes

○ Minimum variance within each projected class

● Solution: eigenvectors based on the between-class and within-class covariance matrices (a sketch follows)
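A hedged scikit-learn sketch contrasting the two (the Wine dataset is an arbitrary choice; scikit-learn's LinearDiscriminantAnalysis implements the criterion above):

    from sklearn.datasets import load_wine
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_wine(return_X_y=True)       # 13 features, 3 classes

    # PCA ignores y: picks directions of maximum variance
    Xp = PCA(n_components=2).fit_transform(X)

    # LDA uses y: picks directions that separate the projected class means
    # while keeping the variance within each projected class small
    Xl = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

    print(Xp.shape, Xl.shape)               # (178, 2) (178, 2)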

Page 66:

Linear Discriminant Analysis (LDA)

[Figure: two classes projected onto a line; the separation between the projected class means is µ1 − µ2]

Page 67:

https://www.youtube.com/playlist?list=PLBv09BD7ez_5_yapAg86Od6JeeypkS4YM

Page 68:

References

Machine Learning Books

● Hands-On Machine Learning with Scikit-Learn and TensorFlow, Chap. 8, “Dimensionality Reduction”

● Pattern Recognition and Machine Learning, Chap. 12 “Continuous Latent Variables”

● Pattern Classification, Chap. 10 “Unsupervised Learning and Clustering”

Machine Learning Courses

● https://www.coursera.org/learn/machine-learning, Week 8