
Introduction to Deep Learning (I2DL)

Exercise 5: Neural Networks and CIFAR10 Classification

I2DL: Prof. Niessner

Today’s Outline

• Neural Networks

– Mathematical Motivation

– Modularization

• Exercise 5

– Implementation Loop

– CIFAR10 Classification


Our Goal

[Figure: an unknown function 𝑓 to be approximated]

Universal Approximation Theorem

Theorem (1989, colloquial). For any continuous function 𝑓 on a compact set 𝐾, there exists a neural network with a single hidden layer and sigmoid activations that uniformly approximates 𝑓 to within an arbitrary 𝜀 > 0 on 𝐾.
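To make the claim concrete, here is a minimal NumPy sketch (an illustration, not a proof): steep sigmoids are stitched into a staircase that tracks f(x) = sin(x) on K = [0, 1], and the worst-case error shrinks as the number of hidden units N grows. The step construction and all names are illustrative choices, not part of the theorem.

```python
import numpy as np

def sigmoid(z):
    z = np.clip(z, -30.0, 30.0)   # avoid overflow in exp; sigmoid saturates anyway
    return 1.0 / (1.0 + np.exp(-z))

f = np.sin                        # continuous target function on K = [0, 1]
K = np.linspace(0.0, 1.0, 1000)   # dense sample of the compact set

N = 200                           # hidden units; more units -> smaller error
xs = np.linspace(0.0, 1.0, N + 1)
w = 1000.0                        # steep slope: sigmoid(w*(x - x_i)) ~ a step at x_i
W1 = np.full(N, w)                # hidden-layer weights
b1 = -w * xs[:-1]                 # biases place one "step" at each x_i
W2 = np.diff(f(xs))               # output weights: increments f(x_{i+1}) - f(x_i)
b2 = f(xs[0])                     # output bias starts the staircase at f(0)

g = sigmoid(np.outer(K, W1) + b1) @ W2 + b2  # one-hidden-layer forward pass
print("max |f - g| on K:", np.max(np.abs(f(K) - g)))
```

Increasing N roughly proportionally shrinks the maximum error, matching the "for any 𝜀 > 0" quantifier.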


Universal Approximation Theorem (Optional)

Readable proof: https://mcneela.github.io/machine_learning/2017/03/21/Universal-Approximation-Theorem.html (background: functional analysis, roughly 3rd-semester math-major material)

Visual proof: http://neuralnetworksanddeeplearning.com/chap4.html


A word of warning…

[Figure: gradient descent illustration. Source: http://blog.datumbox.com/wp-content/uploads/2013/10/gradient-descent.png]


Shallow vs. Deep

• Shallow (1 hidden layer)

• Deep (>1 hidden layer)
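For concreteness, a minimal sketch of the two cases; the layer widths and the ReLU nonlinearity are illustrative assumptions, not part of the slide:

```python
import numpy as np

def forward(x, weights):
    # Apply each hidden layer as ReLU(x @ W); the last layer stays linear.
    for W in weights[:-1]:
        x = np.maximum(0.0, x @ W)
    return x @ weights[-1]

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 32))

# Shallow: one hidden layer. Deep: three hidden layers, same input/output sizes.
shallow = [rng.normal(size=s) for s in [(32, 128), (128, 10)]]
deep = [rng.normal(size=s) for s in [(32, 64), (64, 64), (64, 64), (64, 10)]]
print(forward(x, shallow).shape, forward(x, deep).shape)  # both (1, 10)
```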


Obvious Questions

• Q: Do we even need deep networks?
A: Yes. Given a fixed computational budget, multiple layers provide more representational power than a single layer.

• Q: So we just build 100-layer-deep networks?
A: Not trivially ;-)
Constraints: memory, vanishing gradients (illustrated below), …
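The vanishing-gradient constraint can be seen already in one dimension: each sigmoid layer multiplies the backward signal by w · σ′(z), and σ′(z) ≤ 0.25, so with moderate weights the gradient typically decays exponentially with depth. A hypothetical scalar-chain sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x, grad = 0.5, 1.0
for depth in range(1, 11):
    w = rng.normal()              # one random scalar weight per layer
    x = sigmoid(w * x)            # forward through this layer
    grad *= w * x * (1.0 - x)     # chain rule: sigmoid'(z) = s(z) * (1 - s(z))
    print(f"depth {depth:2d}: gradient magnitude = {abs(grad):.2e}")
```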


Exercise 4: Simple Classification Net


Modularization


Exercise 3: Dataset


• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
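A minimal sketch of the Data column; the class names mirror the diagram, but the signatures and internals are assumptions, not the exercise's actual interface:

```python
import numpy as np

class Dataset:
    """Holds samples and labels; indexable, like the exercise's dataset."""
    def __init__(self, X, y):
        self.X, self.y = X, y
    def __len__(self):
        return len(self.X)
    def __getitem__(self, i):
        return self.X[i], self.y[i]

class Dataloader:
    """Yields (optionally shuffled) mini-batches from a Dataset."""
    def __init__(self, dataset, batch_size=8, shuffle=True):
        self.dataset, self.batch_size, self.shuffle = dataset, batch_size, shuffle
    def __iter__(self):
        idx = np.arange(len(self.dataset))
        if self.shuffle:
            np.random.shuffle(idx)
        for start in range(0, len(idx), self.batch_size):
            batch = [self.dataset[i] for i in idx[start:start + self.batch_size]]
            xs, ys = zip(*batch)
            yield np.stack(xs), np.array(ys)

# Usage: iterate over mini-batches of a toy dataset.
data = Dataset(np.random.randn(100, 4), np.random.randint(0, 2, size=100))
for xb, yb in Dataloader(data, batch_size=32):
    print(xb.shape, yb.shape)     # three (32, 4) batches, then a final (4, 4)
```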


Exercise 4: Binary Classification


• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
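For binary classification, the Model column boils down to a linear score squashed by a sigmoid and scored with binary cross-entropy. A hedged one-unit sketch; data and weights are random placeholders, only the formulas are the point:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 4))          # mini-batch of 16 samples, 4 features each
y = rng.integers(0, 2, size=16)       # binary labels in {0, 1}
W, b = rng.normal(size=4), 0.0        # a single linear unit

p = sigmoid(X @ W + b)                # predicted P(y = 1 | x)
eps = 1e-12                           # guard against log(0)
bce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
print("binary cross-entropy:", bce)
```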


Exercise 5: Neural Networks and CIFAR10 Classification

• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
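At CIFAR10 shapes, the Network and Loss/Objective pieces might look as follows: a hypothetical two-layer forward pass with softmax cross-entropy, where 32×32×3 images are flattened to 3072 features and mapped to 10 classes. Layer sizes, ReLU, and the initialization scale are illustrative choices, not a copy of the provided exercise code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3072))            # batch of 8 flattened 32x32x3 images
y = rng.integers(0, 10, size=8)           # class labels 0..9

W1 = rng.normal(size=(3072, 100)) * 0.01  # hidden layer (100 units)
b1 = np.zeros(100)
W2 = rng.normal(size=(100, 10)) * 0.01    # output layer (10 classes)
b2 = np.zeros(10)

h = np.maximum(0.0, X @ W1 + b1)          # ReLU hidden activations
scores = h @ W2 + b2                      # class scores, shape (8, 10)

# Softmax cross-entropy, shifted for numerical stability.
shifted = scores - scores.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
loss = -log_probs[np.arange(len(y)), y].mean()
print("cross-entropy loss:", loss)        # ~ln(10) = 2.30 at small random init
```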


Exercise 6: Neural Networks and Hyperparameter Tuning

• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
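As a preview of the hyperparameter-tuning idea: train with several candidate values and keep the one with the best validation score. A hedged sketch of a grid search over the learning rate; train_and_validate is a hypothetical stand-in for a real Solver run:

```python
import math

def train_and_validate(lr):
    # Placeholder for a real train/validate run; pretends accuracy peaks near lr = 1e-2.
    return 1.0 - 0.1 * abs(math.log10(lr) + 2.0)

best_lr, best_acc = None, -float("inf")
for lr in [1e-4, 1e-3, 1e-2, 1e-1]:
    acc = train_and_validate(lr)
    print(f"lr={lr:.0e}  val acc={acc:.3f}")
    if acc > best_acc:
        best_lr, best_acc = lr, acc
print("best lr:", best_lr)
```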


Summary

• Monday 17.05: Watch Lecture 6 – Training NNs

• Wednesday 19.05, 15:59: Submit Exercise 5

• Thursday 20.05: Tutorial 6 – Hyperparameter Tuning


See you next week
