© 2018 The MathWorks, Inc.
Signal Processing for Deep Learning and Machine
Learning
Kirthi Devleker,
Sr. Product Manager, Signal Processing and Wavelets
Key Topics
▪ Signal analysis and visualization
▪ Time-frequency analysis techniques
▪ Signal pre-processing and feature extraction
▪ Automating signal classification
Signals are Everywhere
▪ Structural health monitoring (SHM)
▪ Engine event detection
▪ Speech Signal Classification
▪ Advanced surveillance
▪ Healthcare Applications
▪ ...
You don’t have to be a signal processing engineer to
work with signals
You don’t have to be a data scientist to do machine
learning and deep learning
Overview
▪ Access Data: sensors, files, databases
▪ Analyze Data / Develop: data exploration, preprocessing, domain-specific algorithms, algorithm development, modeling & simulation
▪ Deploy: desktop apps, enterprise systems, embedded devices
Deep Learning Overview
What is deep learning? Deep neural networks learn feature detection and extraction together with prediction, directly from data.
Inside a Deep Neural Network
Two Demos…
▪ EKG Classification – Transfer Learning using AlexNet CNN
▪ Music Genre Recognition – Automatic Feature Extraction using the Wavelet Scattering Framework
Approaches for Signal Classification
▪ Transfer Learning for Signal Classification
▪ Automate Feature Extraction using Wavelet Scattering
▪ Using LSTM networks
Example 1: Signal Classification using Transfer Learning
▪ Goal: Given a set of labeled signals, quickly build a
classifier
▪ Dataset: 160 records with ~65K samples each
– Normal (Class I)
– Atrial Fibrillation (Class II)
– Congestive Heart Failure (Class III)
▪ Approach: Pre-trained Models
– AlexNet
▪ Out of Scope: CNN architecture parameter tuning
Overall Workflow – Transfer Learning on Signals
Training: Signals → Time-Frequency Representations → Train Transfer Learning Model → Trained Model
Inference: New Signal → Time-Frequency Representation → Trained Model → Predict
Benefits of Transfer Learning
▪ Reference models are great feature extractors
– Initial layers learn low-level features such as edges
▪ Replace the final layers
– New layers learn features specific to your data
▪ A good starting point
▪ Pretrained models include AlexNet, GoogLeNet, and VGG-16
Steps in Transfer Learning Workflow
1. Preprocess the data
2. Reconfigure the layers
3. Set training options
4. Train the network
5. Test/deploy the trained network
Repeat these steps until the network reaches the desired level of accuracy.
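The deck implements this workflow in MATLAB; as a language-agnostic sketch of the core idea (keep the pretrained layers frozen as a feature extractor, train only a new final layer), here is a minimal NumPy example — the weights, data, and layer shapes are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained network: the early layers are kept frozen
# and act purely as a feature extractor (weights are hypothetical).
W_frozen = rng.normal(size=(2, 8))

def extract_features(x):
    # Frozen "convolution + ReLU" stand-in; never updated during training.
    return np.maximum(0, x @ W_frozen)

# Toy labeled signals from two well-separated classes.
x0 = rng.normal(loc=-2.0, size=(50, 2))
x1 = rng.normal(loc=+2.0, size=(50, 2))
X = extract_features(np.vstack([x0, x1]))
y = np.array([0] * 50 + [1] * 50)

# Transfer learning step: train only the new final (logistic) layer.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid scores
    grad = p - y
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = (pred == y).mean()
```

Retraining only the final layer is what makes transfer learning fast: the bulk of the weights stay fixed.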
Thinking about Layers
▪ Layers are like Lego Blocks
– Stack them on top of each other
– Easily replace one block with a different one
▪ Each hidden layer has a special function
that processes the information from the
previous layer
Convolutional Neural Networks (CNNs)
▪ Special layer combinations that make them great for classification
– Convolution Layer
– Max Pooling Layer
– ReLU Layer
Convolution Layers Search for Patterns
These patterns would be common in the number 0
All patterns are compared against patches of a new image:
• The pattern starts at the top-left corner
• Perform the comparison
• Slide over one pixel and compare again
• Continue until the end of the image is reached
• Repeat for the next pattern
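As an illustrative sketch (not the deck's code), the slide-and-compare procedure above is just a cross-correlation loop; the tiny image and pattern below are made up:

```python
import numpy as np

def slide_pattern(image, pattern):
    # The pattern slides across the image one pixel at a time; at each
    # position the comparison is an elementwise product-and-sum.
    ph, pw = pattern.shape
    ih, iw = image.shape
    out = np.zeros((ih - ph + 1, iw - pw + 1))
    for r in range(out.shape[0]):          # start at the top-left corner
        for c in range(out.shape[1]):      # slide over one pixel at a time
            patch = image[r:r + ph, c:c + pw]
            out[r, c] = np.sum(patch * pattern)   # perform the comparison
    return out

image = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 1],
                  [1, 0, 0, 1],
                  [0, 1, 1, 0]])           # a tiny "0"-like ring
pattern = np.array([[0, 1],
                    [1, 0]])               # a diagonal-edge pattern
scores = slide_pattern(image, pattern)     # high values = good matches
```

Positions where the pattern lines up with the image get the highest scores, which is exactly what a convolution layer's feature map records.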
Good pattern matching in convolution improves the chances that an object will be classified properly
▪ This image would not match well against the patterns for the number zero
▪ It would only do very well against this pattern
Max Pooling is a down-sampling operation
Reduces dimensionality while preserving important information. Example with 2x2 filters and stride length 2:

Input:           Output:
1 0 5 4
3 4 8 3    →    4 8
1 4 6 5         5 6
2 5 4 1
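The 2x2, stride-2 example above can be reproduced with a short NumPy sketch:

```python
import numpy as np

def max_pool(x, size=2, stride=2):
    # Take the maximum over each size-by-size block, stepping by `stride`.
    h, w = x.shape
    out = np.zeros((h // stride, w // stride))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            block = x[r * stride:r * stride + size, c * stride:c * stride + size]
            out[r, c] = block.max()
    return out

x = np.array([[1, 0, 5, 4],
              [3, 4, 8, 3],
              [1, 4, 6, 5],
              [2, 5, 4, 1]])
pooled = max_pool(x)   # the 2x2 output from the slide
```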
Rectified Linear Units Layer (ReLU)
Converts negative numbers to zero, leaving positive values unchanged:

Input:               Output:
-1  0  5  4          0 0 5 4
 3 -4 -8  3    →     3 0 0 3
 1  4  6 -5          1 4 6 0
-2 -5  4  1          0 0 4 1
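In code, ReLU is a one-liner; this NumPy snippet reproduces the matrix above:

```python
import numpy as np

x = np.array([[-1,  0,  5,  4],
              [ 3, -4, -8,  3],
              [ 1,  4,  6, -5],
              [-2, -5,  4,  1]])

relu = np.maximum(0, x)   # negatives become zero, positives pass through
```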
CNNs typically end with 3 Layers
▪ Fully Connected Layer
– Looks at which high-level features correspond to a specific category
– Calculates scores for each category (highest score wins)
▪ Softmax Layer
– Turns scores into probabilities.
▪ Classification Layer
– Categorizes image into one of the classes that the network is trained on
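As a sketch of the softmax step, this NumPy snippet turns a hypothetical score vector from the fully connected layer into probabilities and picks the winning class:

```python
import numpy as np

def softmax(scores):
    # Subtracting the max before exponentiating is a standard
    # numerical-stability trick; it does not change the result.
    e = np.exp(scores - scores.max())
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])       # hypothetical fully connected output
probs = softmax(scores)                  # softmax layer: scores -> probabilities
predicted_class = int(np.argmax(probs))  # classification layer: highest score wins
```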
Recap – Transfer Learning on Signals
Training: Signals → Time-Frequency Representation → Train Transfer Learning Model → Trained Model
Inference: New Signal → Time-Frequency Representation → Trained Model → Predict
Converting signals to time-frequency representations
▪ A time-frequency representation captures how the spectral content of a signal evolves over time
– This pattern can be saved as an image
▪ Example techniques include:
– Spectrogram, mel-frequency spectrogram
– Constant-Q transform
– Scalogram (continuous wavelet transform), which gives sharp time-frequency patterns
▪ Recall: convolution layers search for patterns
– Sharp time-frequency representations help models train more quickly
– They can also enhance subtle differences between signals that appear very similar but belong to different classes
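The deck computes these representations with MATLAB functions; as a minimal, hand-rolled illustration of the idea, here is a NumPy spectrogram (fixed-window STFT) applied to a toy signal whose frequency jumps from 50 Hz to 200 Hz halfway through:

```python
import numpy as np

def spectrogram(x, win_len=64, hop=32):
    # Slide a fixed-size window along the signal and take the magnitude
    # of an FFT of each frame; the result is a frequency-by-time image.
    window = np.hanning(win_len)
    frames = [x[i:i + win_len] * window
              for i in range(0, len(x) - win_len + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T

fs = 1000
t = np.arange(0, 1, 1 / fs)
# 50 Hz in the first half, 200 Hz in the second half.
x = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 200 * t))
S = spectrogram(x)   # bright ridge moves from a low bin to a high bin
```

The frequency jump shows up as the bright ridge moving to a higher bin in later frames, which is exactly the kind of pattern a CNN can learn from.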
What is a wavelet?
▪ A wavelet is a rapidly decaying, wave-like oscillation with zero mean
▪ Wavelets are well suited to localizing frequency content in real-world signals
▪ MATLAB makes this easy by providing sensible default wavelets
(Figure: a sine wave oscillates indefinitely, while a wavelet decays rapidly.)
Time-Frequency Analysis – Comparison
▪ Short-Time Fourier Transform
– Fixed window size limits the resolution
▪ Continuous Wavelet Transform
– Wavelets are well localized in time and frequency
– Variable-sized windows capture features at different scales simultaneously
– No need to specify window size, type, etc.
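To illustrate the variable-sized windows of the CWT, here is a self-contained NumPy sketch using a simple Morlet-like wavelet; the wavelet definition and scale range are illustrative stand-ins, not the MATLAB `cwt` implementation:

```python
import numpy as np

def morlet(scale):
    # Real Morlet-like wavelet: a cosine tapered by a Gaussian.  Larger
    # scales give wider wavelets, i.e. variable-sized analysis windows.
    n = int(10 * scale)
    t = np.arange(n) - n / 2
    return np.cos(5 * t / scale) * np.exp(-(t / scale) ** 2 / 2)

def cwt(x, scales):
    # Correlate the signal with the wavelet at every scale.
    return np.array([np.convolve(x, morlet(s), mode="same") for s in scales])

fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)        # a pure 50 Hz tone

scales = np.arange(2, 32)
coeffs = np.abs(cwt(x, scales))       # scalogram: scales x time
```

The scalogram lights up at the scale whose wavelet oscillates at roughly 50 Hz, showing how each scale acts as its own matched window.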
Demo 1: EKG Classification
▪ Goal: Given a set of labeled signals, quickly build a
classifier
▪ Dataset: 160 records with ~65K samples each
– Normal (Class I)
– Atrial Fibrillation (Class II)
– Congestive Heart Failure (Class III)
▪ Approach: Pre-trained Models
– AlexNet
▪ Out of Scope: CNN architecture parameter tuning
Overall Workflow – Transfer Learning on Signals
Training: Signals → Wavelet-based Time-Frequency Representation → Train Transfer Learning Model → Trained Model
Inference: New Signal → Time-Frequency Representation → Trained Model → Predict (GPU code can be generated for deployment)
Let’s try it out! Exercise: DeepLearningForSignals.mlx
Approaches for Signal Classification
▪ Transfer Learning for Signal Classification
▪ Automate Feature Extraction using Wavelet Scattering
▪ Using LSTM networks
Example 2: Music Genre Recognition using Wavelet Scattering
▪ Dataset: GTZAN Genre Classification [1]
▪ Approach: automatic feature extraction using wavelet scattering
▪ Key benefits:
– No guesswork involved (hyperparameter tuning, etc.)
– Relevant features are extracted automatically ➔ 2 lines of code
[1] Tzanetakis, G. and Cook, P. 2002. Music genre classification of audio signals. IEEE Transactions on Speech and Audio
Processing, Vol. 10, No. 5, pp. 293-302.
http://marsyasweb.appspot.com/download/data_sets/
Background
▪ The initial activations of some well-trained CNNs resemble wavelet-like filters
▪ Introducing the Wavelet Scattering Framework [1]
– Automatic feature extraction
– A great starting point if you don’t have a lot of data
– Reduces data dimensionality and provides compact features
[1] Bruna, J. and Mallat, S. 2013. Invariant scattering convolution networks. IEEE Transactions on Pattern
Analysis and Machine Intelligence, Vol. 35, No. 8, pp. 1872-1886.
https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=34
Working – Wavelet Scattering
Signal → Wavelet Scattering Framework → Features → Machine Learning Classifier
(The framework requires a minimum signal length.)
More Info on the Scattering Framework
▪ Q: What makes this a deep network?
A: It performs the same cascade of operations as a deep network:
– Convolution ➔ filter the signal with wavelets
– Non-linearity ➔ take the modulus
– Averaging ➔ filter with a scaling function
▪ A comparison:

Wavelet Scattering Framework | Convolutional Neural Network
Outputs at every layer       | Output at the end
Fixed filter weights         | Filter weights are learnt
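The three operations above (convolution → modulus → averaging, with fixed filter weights) can be sketched in a few lines of NumPy; the wavelets, scales, and lowpass filter here are illustrative stand-ins, not the actual framework:

```python
import numpy as np

def wavelet(scale):
    # Illustrative Morlet-like wavelet; the weights are fixed, never learnt.
    t = np.arange(8 * scale) - 4 * scale
    return np.cos(5 * t / scale) * np.exp(-(t / scale) ** 2 / 2)

def lowpass(x, width=16):
    # Averaging: a boxcar stand-in for filtering with the scaling function.
    return np.convolve(x, np.ones(width) / width, mode="same")

def scatter(x, scales):
    # One scattering layer: convolution -> modulus (non-linearity) -> averaging.
    S, U = [], []
    for s in scales:
        u = np.abs(np.convolve(x, wavelet(s), mode="same"))  # |x * psi|
        U.append(u)              # scalogram coefficients (U)
        S.append(lowpass(u))     # scattering coefficients (S): output per layer
    return np.array(S), np.array(U)

fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)

S1, U1 = scatter(x, scales=[4, 8, 16])       # layer 1
S2, U2 = scatter(U1[2], scales=[4, 8, 16])   # layer 2 cascades on a layer-1 U
```

Unlike a CNN, every layer emits usable output (the S coefficients), and nothing is trained.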
Inner Workings: Wavelet Scattering Framework
Starting from a signal F, each layer filters with wavelets ψ (producing scalogram coefficients U), takes the modulus, and averages with the scaling function φ (producing scattering coefficients S):
– Layer 1: U1 = |F * ψλ1|, …, |F * ψλn|; S1 = U1 * φ
– Layer 2: U2 = ||F * ψλ1| * ψλ2|, …; S2 = U2 * φ
– Layer 3: the same cascade continues
Wavelet Scattering Workflow
1. Create the scattering filterbank
2. Automatically extract features
(steps 1-2 take only 2 lines of code)
3. Train any classifier with the features
4. Test/deploy
Let’s try it out!
Approaches for Signal Classification
▪ Transfer Learning for Signal Classification
▪ Automate Feature Extraction using Wavelet Scattering
▪ Using LSTM networks
Deep Learning with LSTMs - Examples
▪ Sequence Classification Using Deep Learning
– Shows how to classify sequence data using a long short-term memory (LSTM) network
▪ Sequence-to-Sequence Classification Using Deep Learning
– Shows how to classify each time step of sequence data using an LSTM network
▪ Sequence-to-Sequence Regression Using Deep Learning
– Shows how to predict the remaining useful life (RUL) of engines by using deep learning
▪ …and many more
https://www.mathworks.com/help/deeplearning/examples/classify-sequence-data-using-lstm-networks.html
https://www.mathworks.com/help/deeplearning/examples/sequence-to-sequence-classification-using-deep-learning.html
https://www.mathworks.com/help/deeplearning/examples/sequence-to-sequence-regression-using-deep-learning.html
Signal Pre-processing / Feature Extraction
▪ Signal pre-processing
– Wavelet Signal Denoiser
▪ Changepoint detection
▪ Compare signals using dynamic time warping
▪ Reconstruct missing samples
▪ …
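As an illustration of dynamic time warping (a textbook version, not the toolbox implementation), this NumPy sketch shows DTW tolerating a time-stretched copy of a signal while keeping an unrelated signal far away:

```python
import numpy as np

def dtw_distance(a, b):
    # Classic DTW: fill a cumulative-cost table where each cell extends
    # the cheapest of the three allowed alignment moves (match, insert,
    # delete), so the two signals can stretch in time to line up.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

ref = np.sin(np.linspace(0, 2 * np.pi, 50))
slow = np.sin(np.linspace(0, 2 * np.pi, 80))   # same shape, stretched in time
noise = np.random.default_rng(0).normal(size=50)

d_same = dtw_distance(ref, slow)    # small: DTW absorbs the time stretch
d_diff = dtw_distance(ref, noise)   # large: shapes genuinely differ
```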
Leverage built-in algorithms
How much do you not need to re-invent?
▪ Toolboxes: Signal Processing Toolbox, Wavelet Toolbox, Deep Learning Toolbox, Statistics and Machine Learning Toolbox, Parallel Computing Toolbox
▪ Example functions: cwt, filter, dwt/modwt, pwelch, periodogram, xcov, findpeaks, …
Thank You