Lecture 9: ANN Architectures - Sharif


Transcript of Lecture 9: ANN Architectures - Sharif

Page 1: Lecture 9: ANN Architectures - Sharif

In the Name of God

Lecture 9: ANN Architectures

Page 2: Lecture 9: ANN Architectures - Sharif

Biological Neuron

Page 3: Lecture 9: ANN Architectures - Sharif

Organization of Levels in Brains

Levels of organization in the brain (figure), from largest to smallest:

• Central nervous system
• Interregional circuits: pathways, columns, topographic maps; map into the cerebral cortex and involve multiple regions
• Local circuits: neurons of similar and different properties, about 1 mm in size, in a localized region of the brain
• Neurons: about 100 μm in size, contain several dendrite trees
• Dendrite trees
• Neural microcircuits
• Synapses
• Molecules

Page 4: Lecture 9: ANN Architectures - Sharif

Biological Analogy

• Brain neuron vs. artificial neuron (processing element): inputs x1, …, xn are weighted by w1, …, wn and combined; the result is passed through f(net).
• A neural network is a set of processing elements (PEs) and connections (weights) with adjustable strengths.

Figure: a network with an input layer (X1–X5), a hidden layer, and an output layer.

Page 5: Lecture 9: ANN Architectures - Sharif

ANN: History

• Pavlov's conditioning experiments: a conditioned response, salivation, in response to an auditory stimulus
• Lots of activity concerning automata, communication, computation, and the understanding of the nervous system during the 1930s and 1940s
• McCulloch and Pitts, 1943
• von Neumann: EDVAC (Electronic Discrete Variable Automatic Computer)
• Hebb: The Organization of Behavior, 1949
• Minsky: Theory of Neural-Analog Reinforcement Systems and Its Application to the Brain-Model Problem (reinforcement learning), 1954
• The problem of designing an optimum linear filter: Kolmogorov 1942, Wiener 1949, Zadeh 1953, Gabor 1954
• Uttley: leaky integrate-and-fire neuron, 1956
• Rosenblatt: the perceptron, 1958

Page 6: Lecture 9: ANN Architectures - Sharif

ANN: History

• Long-term potentiation, LTP (Bliss and Lømo, 1973), AMPA receptor; long-term depression, LTD, NMDA receptor
• The nearest-neighbor rule by Fix and Hodges, 1951
• Least-mean-square algorithm by Widrow and Hoff in 1960
• The use of stochastic gradient in adaptive pattern classification by Amari in 1967
• The idea of competitive learning: von der Malsburg 1973, the self-organization of orientation-sensitive nerve cells in the striate cortex
• Self-organizing maps by Grossberg in the 1970s
• Saving units (associative networks) by Anderson and Kohonen in 1982
• Recurrent neural networks by Hopfield in 1982
• Error backpropagation learning algorithm by Rumelhart, Hinton, and Williams in 1986
• Spike-timing-dependent plasticity by Markram in 1997
• Echo state networks by Jaeger in 2002
• Lots of applications
• ...

Page 7: Lecture 9: ANN Architectures - Sharif

(Artificial) Neural Network?

• Computational model inspired by the neurological model of the brain
• The human brain computes in a different way from a digital computer
▫ highly complex, nonlinear, and parallel computing
▫ many times faster than a computer in pattern recognition, perception, and motor control
▫ has great structure and the ability to build up its own rules by experience
 dramatic development within 2 years after birth
 continues to develop afterward
 language learning device before 13 years old
▫ Plasticity: ability to adapt to its environment

Page 8: Lecture 9: ANN Architectures - Sharif

Neural Network Definitions

• Machine designed to model the way in which the brain performs tasks
▫ implemented by electronic devices and/or software (simulation)
▫ learning is the major emphasis of NN
• Massively parallel distributed processor
▫ massive interconnection of simple processing units
▫ simple processing units store experience and make it available for use
▫ knowledge is acquired from the environment through a learning process
• Learning machine
▫ modifies synaptic weights to obtain the design objective
▫ modifies its own topology: neurons die and new ones can grow
• Connectionist network - connectionism

Page 9: Lecture 9: ANN Architectures - Sharif

Benefits of Neural Networks (I)

• Power comes from the massively parallel distributed structure and the ability to learn and generalize
▫ generalization: ability to produce reasonable outputs for inputs not encountered during training
• A NN cannot provide the solution by working individually
▫ a complex problem is decomposed into simple tasks, and each task is assigned to a NN
▫ there is a long way to go to build a computer that mimics the human brain
• Non-linearity
▫ an interconnection of non-linear neurons is itself non-linear
▫ a desirable property if the underlying physical mechanism is non-linear

Page 10: Lecture 9: ANN Architectures - Sharif

Benefits of Neural Networks (II)

• Input-Output Mapping
▫ the input-output mapping is built by learning from examples
 reduce the difference between desired response and actual response
▫ non-parametric statistical inference
 estimate arbitrary decision boundaries in the input signal space
• Adaptivity
▫ adapt synaptic weights to changes in the environment
▫ a NN is retrained to deal with minor changes in the operating environment
 change synaptic weights in real time
▫ more robust, reliable behavior in a non-stationary environment
▫ adaptive pattern recognition, adaptive signal processing, adaptive control
▫ stability-plasticity dilemma

Page 11: Lecture 9: ANN Architectures - Sharif

Benefits of Neural Networks (III)

• Evidential Response
▫ not only the selected class label but also the confidence
▫ confidences can be used to reject a recognition: accuracy vs. reliability (do only what you can do)
• Contextual Information Processing
▫ (contextual) knowledge is represented in the structure
▫ every neuron is affected by others
• Fault Tolerance
▫ performance degrades gracefully under adverse conditions, in contrast to the catastrophic failure of a digital computer
• VLSI Implementability
▫ the massively parallel nature makes it well suited for VLSI implementation

Page 12: Lecture 9: ANN Architectures - Sharif

Benefits of Neural Networks (IV)

• Uniformity of Analysis and Design
▫ the neuron is common to all NNs
▫ share theories and learning algorithms
▫ modular networks can be built through seamless integration
• Neurobiological Analogy
▫ living proof of fault-tolerant, fast, powerful processing
▫ neuroscientists see it as a research tool for neurobiological phenomena
▫ engineers look to neuroscience for new ideas

Page 13: Lecture 9: ANN Architectures - Sharif

ANN: Architectures

Four basic architectures (figure):

• Perceptron: inputs, weights, a single layer of PEs, outputs
• Multiple-layer feedforward: inputs, weights, PEs arranged in hidden layer(s) and an output layer
• Recurrent/feedback: outputs are fed back toward the inputs
• Time-lag feedforward: inputs pass through memory (Mem) structures before the PEs

The figure also illustrates an exemplar (a single input-output pair, e.g. inputs 5, 3, 2, 5, 3 with desired outputs 1, 0, 0, 1, 0) and an epoch (one presentation of all exemplars).

Page 14: Lecture 9: ANN Architectures - Sharif

ANN: What Makes them “Unique”

• Neural networks are nonlinear models
▫ many other nonlinear models exist, but the mathematics required is usually involved or nonexistent
▫ a simplified nonlinear system: combinations of simple nonlinear functions
• Neural networks are trained from the data
▫ no expert knowledge is required beforehand
▫ they can learn and adapt to changing conditions online
• They are universal approximators
▫ they can learn any model given enough data and processing elements
• They have very few formal assumptions about the data (e.g. no Gaussian requirements, etc.)

Page 15: Lecture 9: ANN Architectures - Sharif

ANN: How do neural nets work?

TRAIN THE NETWORK:
1. Introduce data
2. The network computes an output
3. The output is compared to the desired output
4. The weights are modified to reduce the error

USE THE NETWORK:
1. Introduce new data to the network
2. The network computes an output based on its training

input → output

Page 16: Lecture 9: ANN Architectures - Sharif

ANN: Generalization

• Neural networks are very powerful, often too powerful
• You can overtrain a neural network
▫ it will perform very well on the data it was trained with
▫ but poorly on test data

• Never judge a network based upon training data results ONLY!

Page 17: Lecture 9: ANN Architectures - Sharif

ANN: Multiple Datasets

• The most common solution to the "generalization" problem is to divide your data into 3 sets:
▫ Training data: used to train the network
▫ Cross-validation data: used to actively test the network during training; used to stop training
▫ Testing data: used to test the network after training
▫ Production data: desired output is not known (implementation)
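A minimal sketch of such a split, assuming a hypothetical dataset and an arbitrary 60/20/20 proportion (production data has no known desired output, so it does not come from this split):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))      # hypothetical inputs
y = rng.integers(0, 2, size=1000)   # hypothetical desired outputs

idx = rng.permutation(len(X))       # shuffle before splitting
n_train, n_cv = int(0.6 * len(X)), int(0.2 * len(X))

train_idx = idx[:n_train]                    # used to adjust the weights
cv_idx    = idx[n_train:n_train + n_cv]      # monitored during training (early stopping)
test_idx  = idx[n_train + n_cv:]             # touched only once, after training

X_train, y_train = X[train_idx], y[train_idx]
X_cv,    y_cv    = X[cv_idx],    y[cv_idx]
X_test,  y_test  = X[test_idx],  y[test_idx]
print(len(X_train), len(X_cv), len(X_test))  # 600 200 200
```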

Page 18: Lecture 9: ANN Architectures - Sharif

Models of Neuron

• A neuron is an information processing unit
• A set of synapses or connecting links
▫ each characterized by a weight or strength
• An adder
▫ sums the input signals weighted by the synapses
▫ a linear combiner
• An activation function
▫ also called a squashing function
 squashes (limits) the output to some finite values

Page 19: Lecture 9: ANN Architectures - Sharif

Nonlinear model of a neuron (I)

Figure: input signals x_1, …, x_m enter through synaptic weights w_k1, …, w_km into a summing junction, together with a bias b_k; the result v_k passes through an activation function \varphi(\cdot) to produce the output y_k.

v_k = \sum_{j=1}^{m} w_{kj} x_j + b_k

y_k = \varphi(v_k)
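The two formulas translate directly into code. A minimal sketch, with made-up inputs and weights and tanh chosen arbitrarily as the activation φ:

```python
import numpy as np

def neuron(x, w, b, phi=np.tanh):
    """One neuron k: v_k = sum_j w_kj * x_j + b_k, then y_k = phi(v_k).
    phi defaults to tanh purely as an example choice."""
    v = w @ x + b            # summing junction (linear combiner) plus bias
    return phi(v)            # activation (squashing) function

x = np.array([0.5, -1.0, 2.0])   # input signals x_1..x_m (made-up values)
w = np.array([0.1,  0.4, -0.3])  # synaptic weights w_k1..w_km
b = 0.2                          # bias b_k
print(neuron(x, w, b))

# Slide (II): the bias can be absorbed as weight w_k0 on a fixed input x_0 = +1.
x_aug = np.concatenate(([1.0], x))
w_aug = np.concatenate(([b], w))
print(neuron(x_aug, w_aug, 0.0))  # identical output
```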

Page 20: Lecture 9: ANN Architectures - Sharif

Nonlinear model of a neuron (II)

Figure: as before, but the bias is absorbed as a synaptic weight: a fixed input x_0 = +1 with weight w_k0 = b_k.

v_k = \sum_{j=0}^{m} w_{kj} x_j

y_k = \varphi(v_k)

Page 21: Lecture 9: ANN Architectures - Sharif

Types of Activation Function

Figure: three activation functions \varphi(v) plotted against v, each saturating at +1: the threshold function, the piecewise-linear function, and the sigmoid function (differentiable).

Sigmoid: \varphi(v) = \frac{1}{1 + \exp(-a v)}, where a is the slope parameter.

Page 22: Lecture 9: ANN Architectures - Sharif

Activation Function Value Range

Figure: activation functions with output range [-1, +1]: the signum function and the hyperbolic tangent function.

\varphi(v) = \tanh(v)
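The five activation functions from these two slides are easy to state in code. A minimal sketch (the piecewise-linear breakpoints at ±0.5 follow the usual textbook definition, an assumption since the slide shows only the plot):

```python
import numpy as np

def threshold(v):                 # Heaviside step: outputs in {0, 1}
    return np.where(v >= 0, 1.0, 0.0)

def piecewise_linear(v):          # linear on [-0.5, 0.5], saturating at 0 and 1
    return np.clip(v + 0.5, 0.0, 1.0)

def sigmoid(v, a=1.0):            # phi(v) = 1 / (1 + exp(-a*v)); a = slope parameter
    return 1.0 / (1.0 + np.exp(-a * v))

def signum(v):                    # outputs in {-1, 0, +1}
    return np.sign(v)

def tanh_act(v):                  # phi(v) = tanh(v), range (-1, +1)
    return np.tanh(v)

v = np.linspace(-3, 3, 7)
for f in (threshold, piecewise_linear, sigmoid, signum, tanh_act):
    print(f.__name__, f(v).round(2))
```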

Page 23: Lecture 9: ANN Architectures - Sharif

The McCulloch-Pitts Model

• McCulloch and Pitts (1943) produced the first neural network, which was based on their artificial neuron.
• The activation of a neuron is binary.
• The neuron either fires (activation of one) or does not fire (activation of zero).
• Neurons in a McCulloch-Pitts network are connected by directed and weighted paths.

Page 24: Lecture 9: ANN Architectures - Sharif

The McCulloch-Pitts Model

• For the network shown below, the activation function for unit Y is:
f(y_in) = 1 if y_in >= T, else 0
where y_in is the total input signal received and T is the threshold for Y.

Figure: inputs x_1, …, x_n with weights w_1, …, w_n (plus a bias b on a fixed input x_0 = 1) feed unit Y, which produces the output.

Page 25: Lecture 9: ANN Architectures - Sharif

Example: Logical Functions

Figure: three McCulloch-Pitts units over inputs a_1, a_2 (with bias node a_0), where W_0 acts as the firing threshold:

• AND: W_0 = 1.5, W_1 = 1, W_2 = 1
• OR: W_0 = 0.5, W_1 = 1, W_2 = 1
• NOT: W_0 = -0.5, W_1 = -1

• McCulloch and Pitts: some Boolean functions can be implemented with a single artificial neuron (but not XOR); see the sketch below.
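The slide's weights can be checked directly. This sketch reads W_0 as the unit's firing threshold, i.e. the unit outputs 1 iff W_1·a_1 + W_2·a_2 ≥ W_0; that reading of the figure is an assumption, but it reproduces all three truth tables.

```python
# McCulloch-Pitts unit: fires iff the weighted input sum reaches the threshold.
def mcculloch_pitts(inputs, weights, threshold):
    net = sum(w * a for w, a in zip(weights, inputs))
    return 1 if net >= threshold else 0

AND = lambda a1, a2: mcculloch_pitts((a1, a2), (1, 1), 1.5)   # W0 = 1.5
OR  = lambda a1, a2: mcculloch_pitts((a1, a2), (1, 1), 0.5)   # W0 = 0.5
NOT = lambda a1:     mcculloch_pitts((a1,),    (-1,), -0.5)   # W0 = -0.5, W1 = -1

for a1 in (0, 1):
    print("NOT", a1, "->", NOT(a1))
    for a2 in (0, 1):
        print(a1, a2, "AND:", AND(a1, a2), "OR:", OR(a1, a2))
```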

Page 26: Lecture 9: ANN Architectures - Sharif

A two-layer network is capable of calculating XOR.
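The slide shows this network only as a figure, so the weights below are an assumption: one standard realization uses a hidden OR unit and a hidden AND unit, with the output computing "OR and not AND".

```python
def step(net, T):
    """Threshold unit: fires iff net input reaches threshold T."""
    return 1 if net >= T else 0

def xor(a1, a2):
    h1 = step(a1 + a2, 0.5)        # hidden unit 1: OR
    h2 = step(a1 + a2, 1.5)        # hidden unit 2: AND
    return step(h1 - h2, 0.5)      # output layer: OR AND NOT(AND)

for a1 in (0, 1):
    for a2 in (0, 1):
        print(a1, a2, "->", xor(a1, a2))   # prints 0, 1, 1, 0
```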

Page 27: Lecture 9: ANN Architectures - Sharif

Stochastic Model of a Neuron

• Deterministic vs. stochastic
• Stochastic: the neuron takes a state with probability P(v):

x = +1 with probability P(v)
x = -1 with probability 1 - P(v)

where x is the state of the neuron, v is the induced local field (input sum), and P(v) is the probability of firing:

P(v) = \frac{1}{1 + \exp(-v/T)}

where T is a pseudotemperature. As T → 0, the model reduces to the deterministic form.
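A minimal sketch of this stochastic neuron, estimating its mean state at a few temperatures; the field v and the T values are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_fire(v, T):
    return 1.0 / (1.0 + np.exp(-v / T))   # P(v) = 1 / (1 + exp(-v/T))

def mean_state(v, T, n=10000):
    # Sample n states: +1 with probability P(v), otherwise -1.
    states = np.where(rng.random(n) < p_fire(v, T), 1, -1)
    return states.mean()

v = 0.5
for T in (2.0, 0.5, 0.01):    # as T -> 0 the behavior becomes deterministic
    print(f"T={T}: mean state = {mean_state(v, T):+.3f}")
# At T = 0.01, P(0.5) is essentially 1, so the neuron fires (+1) every time.
```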

Page 28: Lecture 9: ANN Architectures - Sharif

NNs as directed Graphs

• The block diagram can be simplified by the idea of a signal flow graph
• A node is associated with a signal
• A directed link is associated with a transfer function
▫ synaptic links
 governed by a linear input-output relation
 signal x_j is multiplied by synaptic weight w_kj
▫ activation links
 governed by a nonlinear input-output relation
 nonlinear activation function

Page 29: Lecture 9: ANN Architectures - Sharif

Signal Flow Graph of a Neuron

Figure: source nodes x_0 = +1, x_1, …, x_m; link weights w_k0 = b_k, w_k1, …, w_km converging on node v_k; a link with transfer function \varphi(\cdot) leading to the output node y_k.

Page 30: Lecture 9: ANN Architectures - Sharif

Architectural graph of a Neuron

• A partially complete directed graph describing the layout
• Three graphical representations:
▫ block diagram - provides a functional description of a NN
▫ signal flow graph - complete description of signal flow
▫ architectural graph - network layout

Page 31: Lecture 9: ANN Architectures - Sharif

Network Architecture

• Single-layer Feedforward Networks
▫ input layer and output layer: a single (computation) layer
▫ feedforward, acyclic
• Multilayer Feedforward Networks
▫ hidden layers - hidden neurons and hidden units
▫ enable extraction of high-order statistics
▫ e.g. a 10-4-2 network, a 100-30-10-3 network
▫ fully connected layered network
• Recurrent Networks
▫ at least one feedback loop
▫ with or without hidden neurons

Page 32: Lecture 9: ANN Architectures - Sharif

Network Architecture

Figure: a single-layer fully connected network; a multiple-layer fully connected network; a recurrent network without hidden units (with unit-delay operators in the feedback paths); and a recurrent network with hidden units, each shown with its inputs and outputs.

Page 33: Lecture 9: ANN Architectures - Sharif

Feedback

• The output is fed back into the NN and is used in determining the output itself

Figure: input x_j(n) combines with the fed-back signal through weight w and the unit-delay operator z^{-1} to give x_j'(n) and the output y_k(n). Unrolling the loop:

y_k(n) = \sum_{l=0}^{\infty} w^{l+1} x_j(n - l)

• Depending on w: stable, linear divergence, or exponential divergence (see the sketch below)
• We are interested in the case |w| < 1: infinite memory
 the output depends on inputs of the infinite past
• A NN with a feedback loop is a recurrent network
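The three behaviors can be seen by driving the loop with a constant input. A minimal sketch, assuming a unit step input and a few illustrative values of w:

```python
import numpy as np

def feedback_response(w, x, steps):
    """Single-loop feedback system: state passes through weight w and a
    unit delay, so y_k(n) = sum_{l>=0} w^(l+1) * x_j(n-l)."""
    y = np.zeros(steps)
    state = 0.0
    for n in range(steps):
        state = w * (x[n] + state)   # weight w applied around the z^-1 loop
        y[n] = state
    return y

x = np.ones(10)                      # unit step input (made up for illustration)
for w in (0.5, 1.0, 2.0):            # |w|<1 stable; w=1 linear; w>1 exponential
    print(f"w={w}:", feedback_response(w, x, 10).round(3))
# w=0.5 settles toward 1 (stable, fading memory); w=1 grows linearly;
# w=2 diverges exponentially.
```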

Page 34: Lecture 9: ANN Architectures - Sharif

Knowledge Representation

• Knowledge refers to stored information or models used by a person or machine to interpret, predict, and appropriately respond to the outside world
▫ what information is actually made explicit
▫ how the information is physically encoded for subsequent use
• A good solution depends on a good representation of knowledge
• In a NN, knowledge is represented by internal network parameters
▫ the real challenge
• Knowledge of the world
▫ world state represented by known facts - prior knowledge
▫ observations - obtained by (noisy) sensors; training examples

Page 35: Lecture 9: ANN Architectures - Sharif

Knowledge Acquisition by NN Training

• Training examples: either labeled or unlabeled
 labeled: input signal and desired response
 unlabeled: different realizations of the input signal
▫ examples represent the knowledge of the environment
• Character recognition
1. An appropriate architecture is selected for the NN
 source nodes = number of pixels of the input image
 e.g. 26 output nodes, one for each letter
 a subset of the examples is used to train the NN with a suitable learning algorithm
2. Recognition performance is tested with the rest of the examples
• Positive and negative examples

Page 36: Lecture 9: ANN Architectures - Sharif

Classification: Optical Character Recognition

• Determine whether the input image is A, B, C, …
• 2 classes: create one output for each class (e.g. class 0: true or false, etc.)
• 26 outputs (A…Z); each image is labeled with a class
• image A will be (1,0,0,0,0,0,0,0,0,0)
• image B will be (0,1,0,0,0,0,0,0,0,0), etc.
• Must train the network to recognize the alphabet (a labeling sketch follows below)

Figure: sample letter images A, B, C, D, E feeding a network with an input layer, a hidden layer, and an output layer.
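A minimal sketch of this one-output-per-class labeling, using the full 26-letter alphabet (the slide's printed vectors show only the first 10 positions):

```python
import numpy as np

letters = [chr(ord('A') + i) for i in range(26)]   # classes A..Z

def one_hot(letter):
    """Desired-response vector: 1 at the class position, 0 elsewhere."""
    vec = np.zeros(len(letters))
    vec[letters.index(letter)] = 1.0
    return vec

print("A ->", one_hot('A')[:10])   # [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.] ...
print("B ->", one_hot('B')[:10])   # [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.] ...
```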

Page 37: Lecture 9: ANN Architectures - Sharif

Rules of Knowledge Representation in NN

• Similar inputs from similar classes produce similar representations
▫ similarity measures (see the sketch below)
 Euclidean distance, dot (inner) product, cosine
 random variables: Mahalanobis distance
 ...
• Separate classes produce widely different representations
• More neurons should be involved in the representation of more important features
▫ probability of detection / false alarm
• Prior information and invariances should be built into the design of the network
▫ general purpose vs. specialized
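The listed similarity measures in code, as a minimal sketch with made-up vectors; for the Mahalanobis distance a covariance matrix must be known or estimated, so the one below is an assumption:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])   # two made-up input vectors
y = np.array([1.5, 1.0, 3.5])

euclidean = np.linalg.norm(x - y)                         # Euclidean distance
dot       = x @ y                                         # dot (inner) product
cosine    = dot / (np.linalg.norm(x) * np.linalg.norm(y)) # cosine similarity

cov = np.array([[1.0, 0.2, 0.0],   # assumed covariance of the input distribution
                [0.2, 1.0, 0.1],
                [0.0, 0.1, 1.0]])
diff = x - y
mahalanobis = np.sqrt(diff @ np.linalg.inv(cov) @ diff)   # for random variables

print(euclidean, dot, cosine, mahalanobis)
```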

Page 38: Lecture 9: ANN Architectures - Sharif

Building Prior Information into NN Design

• Specialized structure
▫ learns fast because of few free parameters
▫ runs fast because of its simple structure
• No well-defined rules for building a specialized NN
▫ ad hoc approach
▫ restricting the network architecture through local connections
 receptive fields
▫ constraining the choice of synaptic weights
 weight sharing, parameter tying

Page 39: Lecture 9: ANN Architectures - Sharif

Building Invariance into NN Design

• Want the network to be capable of coping with transformations
▫ Invariance by structure
 synaptic connections are arranged so as not to be affected by the transformation
 rotation invariance: forcing w_ji = w_jk for all k at the same distance from the center of the image
▫ Invariance by training
 train on data with many different transformations
 computationally infeasible
▫ Invariant feature space
 use features invariant to the transformations
• No well-developed theory for optimizing the architecture of a NN
• NN lacks explanation capability

Page 40: Lecture 9: ANN Architectures - Sharif

AI and NN

• Definition of AI; goal of AI
▫ the art of creating machines that perform tasks that require intelligence when performed by people
▫ the study of mental faculties through the use of computational models
▫ to make computers perceive, reason, and act
▫ to develop machines that perform cognitive tasks
• Functions of an AI system
 store knowledge
 apply the knowledge to solve problems
 acquire new knowledge through experience
• Key components of AI
▫ representation
▫ reasoning
▫ learning

Page 41: Lecture 9: ANN Architectures - Sharif

AI

• AI is a goal, an objective, a dream
• NN is a model of an intelligent system
▫ it is not the only such system
▫ an intelligent system is not necessarily the same as a human
 example: a chess machine
• Symbolic AI is a tool, a paradigm toward AI
• NN can be a good tool toward AI

Page 42: Lecture 9: ANN Architectures - Sharif

Reasoning

• Reasoning is the ability to solve problems
▫ must be able to express and solve a broad range of problems
▫ must be able to make explicit and implicit information known to it
▫ must have a control mechanism to select the operators for a situation
• Problem solving is a searching problem
• Must deal with incompleteness, inexactness, and uncertainty
▫ probabilistic reasoning, plausible reasoning, fuzzy reasoning

Page 43: Lecture 9: ANN Architectures - Sharif

Learning

• Model of machine learning
▫ environment,
▫ learning element,
▫ knowledge base, and
▫ performance cycle
• Inductive learning
▫ generate rules from raw data
▫ similarity-based learning, case-based reasoning
• Deductive learning
▫ general rules are used to determine specific facts
▫ theorem proving
• Augmenting the knowledge base is not a simple task

Page 44: Lecture 9: ANN Architectures - Sharif

Reading

• S. Haykin, Neural Networks: A Comprehensive Foundation, 2007 (Chapter 1).