
Introduction to Deep Learning (I2DL)

Exercise 5: Neural Networks and CIFAR10 Classification

I2DL: Prof. Niessner


Today’s Outline

• Neural Networks

– Mathematical Motivation

– Modularization

• Exercise 5

– Implementation Loop

– CIFAR10 Classification


Our Goal

[Figure: the function 𝑓 that the network should learn to approximate]


Universal Approximation Theorem

Theorem (1989, colloquial). For any continuous function 𝑓 on a compact set 𝐾, there exists a neural network with a single hidden layer and sigmoid activations that uniformly approximates 𝑓 to within an arbitrary 𝜀 > 0 on 𝐾.
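One way to write this formally (a sketch following the standard Cybenko-style formulation with sigmoid 𝜎; the symbols N, v_i, w_i, b_i are introduced here for illustration, not taken from the slide):

```latex
% K \subset \mathbb{R}^d compact, f \in C(K), \sigma the sigmoid.
\[
  \forall \varepsilon > 0 \;\; \exists N \in \mathbb{N},\;
  v_i, b_i \in \mathbb{R},\; w_i \in \mathbb{R}^d :
  \quad
  \sup_{x \in K} \Bigl|\, f(x) - \sum_{i=1}^{N} v_i \,\sigma\!\bigl(w_i^{\top} x + b_i\bigr) \Bigr| < \varepsilon .
\]
```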


Universal Approximation Theorem (Optional)

Readable proof: https://mcneela.github.io/machine_learning/2017/03/21/Universal-Approximation-Theorem.html (Background: Functional Analysis, Math Major 3rd semester)

Visual proof: http://neuralnetworksanddeeplearning.com/chap4.html


A word of warning…

[Figure: gradient descent. Source: http://blog.datumbox.com/wp-content/uploads/2013/10/gradient-descent.png]


Shallow vs. Deep

• Shallow (1 hidden layer)

• Deep (>1 hidden layer) – see the sketch below
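A minimal numpy sketch of the distinction, with illustrative sizes (3072 inputs as in a flattened 32x32x3 CIFAR10 image, 10 output classes) and ReLU chosen for brevity; this is not the exercise's actual code:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

x = np.random.randn(4, 3072)   # toy batch of 4 flattened 32x32x3 images

# Shallow: a single hidden layer between input and class scores.
W1, b1 = 0.01 * np.random.randn(3072, 100), np.zeros(100)
W2, b2 = 0.01 * np.random.randn(100, 10), np.zeros(10)
shallow_scores = relu(x @ W1 + b1) @ W2 + b2                 # shape (4, 10)

# Deep: the same building block stacked, here two hidden layers (>1).
V1, c1 = 0.01 * np.random.randn(3072, 100), np.zeros(100)
V2, c2 = 0.01 * np.random.randn(100, 100), np.zeros(100)
V3, c3 = 0.01 * np.random.randn(100, 10), np.zeros(10)
deep_scores = relu(relu(x @ V1 + c1) @ V2 + c2) @ V3 + c3    # shape (4, 10)
```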


Obvious Questions

• Q: Do we even need deep networks?
  A: Yes. Multiple layers allow for more representation power given a fixed computational budget, in comparison to a single layer.

• Q: So we just build 100-layer deep networks?
  A: Not trivially ;-)
  Constraints: memory, vanishing gradients, … (see the sketch below)
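A toy numpy illustration of the vanishing-gradient constraint: a deliberately simplified chain of sigmoids with the weights ignored, purely to show how the small local derivatives multiply.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The sigmoid's derivative is at most 0.25, so a chain of sigmoid layers
# multiplies many factors <= 0.25 and the gradient reaching the earliest
# layers shrinks towards zero (weights are ignored here on purpose).
activation = 0.5
grad = 1.0
for layer in range(20):
    out = sigmoid(activation)
    grad *= out * (1.0 - out)   # local derivative of this sigmoid
    activation = out            # feed the output into the next layer

print(f"gradient factor after 20 sigmoid layers: {grad:.2e}")  # on the order of 1e-13
```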


Exercise 4: Simple Classification Net


Modularization


Exercise 3: Dataset

• Data – Dataset, Dataloader
• Model – Network, Loss/Objective
• Solver – Optimizer, Training Loop, Validation
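As a rough sketch of what the Data column above might look like in code (class and method names here are illustrative, not the exercise's actual interface):

```python
import numpy as np

class Dataset:
    """Holds raw samples and labels; returns one sample at a time."""
    def __init__(self, images, labels):
        self.images = images
        self.labels = labels

    def __len__(self):
        return len(self.images)

    def __getitem__(self, index):
        return {"image": self.images[index], "label": self.labels[index]}

class DataLoader:
    """Groups dataset samples into (optionally shuffled) mini-batches."""
    def __init__(self, dataset, batch_size=8, shuffle=True):
        self.dataset = dataset
        self.batch_size = batch_size
        self.shuffle = shuffle

    def __iter__(self):
        indices = np.arange(len(self.dataset))
        if self.shuffle:
            np.random.shuffle(indices)
        for start in range(0, len(indices), self.batch_size):
            batch = [self.dataset[i] for i in indices[start:start + self.batch_size]]
            yield {
                "image": np.stack([b["image"] for b in batch]),
                "label": np.array([b["label"] for b in batch]),
            }
```

Iterating over DataLoader(Dataset(images, labels)) then yields dictionaries of stacked image and label arrays, one mini-batch at a time.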


Exercise 4: Binary Classification

• Data – Dataset, Dataloader
• Model – Network, Loss/Objective
• Solver – Optimizer, Training Loop, Validation


Exercise 5: Neural Networks and CIFAR10 Classification

• Data – Dataset, Dataloader
• Model – Network, Loss/Objective
• Solver – Optimizer, Training Loop, Validation
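To make the Model and Solver columns concrete, here is a compact, self-contained sketch in plain numpy, trained on random stand-in data; the class names, shapes and hyperparameters are illustrative and not the exercise's actual interface:

```python
import numpy as np

def softmax_cross_entropy(scores, labels):
    """Returns (loss, gradient of loss w.r.t. scores) for integer labels."""
    shifted = scores - scores.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    n = scores.shape[0]
    loss = -np.log(probs[np.arange(n), labels]).mean()
    grad = probs.copy()
    grad[np.arange(n), labels] -= 1.0
    return loss, grad / n

class TwoLayerNet:
    """Affine -> ReLU -> Affine classifier on flattened images."""
    def __init__(self, input_dim, hidden_dim, num_classes):
        self.params = {
            "W1": 0.01 * np.random.randn(input_dim, hidden_dim),
            "b1": np.zeros(hidden_dim),
            "W2": 0.01 * np.random.randn(hidden_dim, num_classes),
            "b2": np.zeros(num_classes),
        }

    def forward(self, x):
        self.x = x
        self.h = np.maximum(0, x @ self.params["W1"] + self.params["b1"])
        return self.h @ self.params["W2"] + self.params["b2"]

    def backward(self, dscores):
        grads = {}
        grads["W2"] = self.h.T @ dscores
        grads["b2"] = dscores.sum(axis=0)
        dh = dscores @ self.params["W2"].T
        dh[self.h <= 0] = 0.0            # ReLU gradient
        grads["W1"] = self.x.T @ dh
        grads["b1"] = dh.sum(axis=0)
        return grads

# Solver part: optimizer + training loop, here on random stand-in data
# (replace with the CIFAR10 dataloader from the exercise).
np.random.seed(0)
x_train = np.random.randn(256, 3072)      # 256 fake 32x32x3 images, flattened
y_train = np.random.randint(0, 10, 256)   # 10 classes, as in CIFAR10

model, lr, batch_size = TwoLayerNet(3072, 100, 10), 1e-2, 32
for epoch in range(5):
    perm = np.random.permutation(len(x_train))
    for start in range(0, len(x_train), batch_size):
        idx = perm[start:start + batch_size]
        scores = model.forward(x_train[idx])
        loss, dscores = softmax_cross_entropy(scores, y_train[idx])
        grads = model.backward(dscores)
        for name in model.params:         # vanilla SGD update
            model.params[name] -= lr * grads[name]
    # validation would go here: a forward pass on a held-out split
    print(f"epoch {epoch}: last minibatch loss {loss:.3f}")
```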


Exercise 6: Neural Networks and Hyperparameter Tuning

• Data – Dataset, Dataloader
• Model – Network, Loss/Objective
• Solver – Optimizer, Training Loop, Validation


Summary

• Monday 17.05: Watch Lecture 6 – Training NNs

• Wednesday 19.05 15:59: Submit exercise 5

• Thursday 20.05: Tutorial 6 – Hyper-parameter Tuning


See you next week
