
CS621: Artificial Intelligence
Lecture 27: Backpropagation applied to recognition problems; start of logic

Pushpak Bhattacharyya
Computer Science and Engineering Department
IIT Bombay

Backpropagation algorithm

• Fully connected feed-forward network
• Pure FF network (no jumping of connections over layers)

[Figure: a layered feed-forward network with an input layer of n i/p neurons, one or more hidden layers, and an output layer of m o/p neurons; w_ji denotes the weight on the connection from neuron i to neuron j.]

General Backpropagation Rule

• General weight updating rule:

      Δw_ji = η δ_j o_i

  where, for the outermost layer,

      δ_j = (t_j − o_j) o_j (1 − o_j)

  and for hidden layers

      δ_j = o_j (1 − o_j) Σ_{k ∈ next layer} w_kj δ_k
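A minimal NumPy sketch of these updates for one hidden layer, assuming sigmoid units and per-example (online) updates; the array names, shapes and learning rate are illustrative assumptions, not from the lecture:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_step(x, t, W1, W2, eta=0.1):
    # Forward pass through one hidden layer and the output layer
    o_hidden = sigmoid(W1 @ x)                     # hidden outputs o_j
    o_out = sigmoid(W2 @ o_hidden)                 # output-layer outputs o_j

    # Outermost layer: delta_j = (t_j - o_j) o_j (1 - o_j)
    delta_out = (t - o_out) * o_out * (1 - o_out)

    # Hidden layer: delta_j = o_j (1 - o_j) * sum over next layer of w_kj delta_k
    delta_hidden = o_hidden * (1 - o_hidden) * (W2.T @ delta_out)

    # General rule: delta_w_ji = eta * delta_j * o_i
    W2 += eta * np.outer(delta_out, o_hidden)
    W1 += eta * np.outer(delta_hidden, x)
    return W1, W2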

Local Minima

Due to the greedy nature of BP, it can get stuck in a local minimum and never reach the global minimum, since weights are changed only in directions that decrease the error.

Momentum factor

1. Introduce a momentum factor:
   • Accelerates the movement out of the trough.
   • Dampens oscillation inside the trough.
   • Choosing β: if β is large, we may jump over the minimum.

      (Δw_ji)_nth iteration = η δ_j o_i + β (Δw_ji)_(n−1)th iteration
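A small sketch of the momentum update, continuing the NumPy setting above; the values of eta and beta are illustrative:

import numpy as np

def update_with_momentum(W, delta, o, prev_dW, eta=0.1, beta=0.9):
    # (delta_w)_n = eta * delta_j * o_i + beta * (delta_w)_(n-1)
    dW = eta * np.outer(delta, o) + beta * prev_dW
    W += dW
    return W, dW          # dW is remembered as prev_dW for the next iteration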

Symmetry breaking
• If the mapping demands different weights, but we start with the same weights everywhere, then BP will never converge.

[Figure: a hand-designed XOR network with inputs x1 and x2, hidden units computing x1x̄2 and x̄1x2, and an output unit with weights w1 = 1, w2 = 1 and threshold θ = 0.5.]

XOR n/w: if we started with identical weights everywhere, BP will not converge.
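A tiny NumPy experiment (an illustrative sketch, not from the lecture; bias inputs are omitted) showing the symmetry problem: starting a 2-2-1 XOR network with identical weights everywhere, both hidden units receive identical deltas at every step, so their weight vectors remain clones and the mapping is never learned:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])        # XOR targets

W1 = np.full((2, 2), 0.5)                 # identical hidden weights
W2 = np.full(2, 0.5)                      # identical output weights

for epoch in range(1000):
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x)
        o = sigmoid(W2 @ h)
        delta_o = (t - o) * o * (1 - o)
        delta_h = h * (1 - h) * (W2 * delta_o)
        W2 += 0.5 * delta_o * h
        W1 += 0.5 * np.outer(delta_h, x)

print(W1)    # both rows stay identical: the symmetry is never broken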

Backpropagation Applications

Feed Forward Network Architecture

• I/P layer: size defined by the problem
• Hidden layer: size decided by trial and error
• O/P layer: size defined by the problem

Digit Recognition Problem

• Digit recognition:
  – 7-segment display
  – Segments being on/off define a digit

[Figure: a 7-segment display with segments numbered 1 to 7; the network has 7 input neurons (Seg-1 … Seg-7) fully connected to a hidden layer, which is fully connected to the output layer with one output neuron per digit.]
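A possible input/output encoding for this task (an illustrative sketch; the segment-to-index mapping is an assumption and may not match the numbering in the figure):

import numpy as np

# Each digit as a 7-bit vector over segments (a, b, c, d, e, f, g): 1 = on, 0 = off.
SEGMENTS = {
    0: [1, 1, 1, 1, 1, 1, 0],
    1: [0, 1, 1, 0, 0, 0, 0],
    2: [1, 1, 0, 1, 1, 0, 1],
    3: [1, 1, 1, 1, 0, 0, 1],
    4: [0, 1, 1, 0, 0, 1, 1],
    5: [1, 0, 1, 1, 0, 1, 1],
    6: [1, 0, 1, 1, 1, 1, 1],
    7: [1, 1, 1, 0, 0, 0, 0],
    8: [1, 1, 1, 1, 1, 1, 1],
    9: [1, 1, 1, 1, 0, 1, 1],
}

X = np.array([SEGMENTS[d] for d in range(10)], dtype=float)   # 10 x 7 inputs
T = np.eye(10)                                                # one target neuron per digit
# X and T can be fed to the backpropagation routine sketched earlier:
# 7 input neurons, a hidden layer sized by trial and error, 10 output neurons.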

Example - Character Recognition

• Output layer - 26 neurons (one per capital letter)
• The first output neuron has the responsibility of detecting all forms of 'A'
• Centralized representation of outputs
• In distributed representations, all output neurons participate in the output
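A small sketch (illustrative) contrasting the two output representations mentioned above; the 5-bit distributed code is a made-up example:

import numpy as np

def decode_centralized(outputs):
    # Centralized representation: 26 neurons, one per capital letter;
    # the prediction is the letter of the most active neuron.
    return chr(ord('A') + int(np.argmax(outputs)))

def decode_distributed(outputs):
    # Distributed representation: all neurons participate; here a
    # hypothetical 5-bit binary code indexes the 26 letters.
    bits = (np.asarray(outputs) > 0.5).astype(int)
    index = int("".join(map(str, bits)), 2)
    return chr(ord('A') + index) if index < 26 else None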

An application in the Medical Domain

Expert System for Skin Disease Diagnosis

• Bumpiness and scaliness of skin
• Mostly for symptom gathering and for developing diagnosis skills
• Not replacing the doctor's diagnosis

Architecture of the FF NN
• 96-20-10
• 96 input neurons, 20 hidden layer neurons, 10 output neurons
• Inputs: skin disease symptoms and their parameters
  – Location, distribution, shape, arrangement, pattern, number of lesions, presence of an active border, amount of scale, elevation of papules, color, altered pigmentation, itching, pustules, lymphadenopathy, palmar thickening, results of microscopic examination, presence of herald patch, result of the dermatology test called KOH
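A minimal sketch of the 96-20-10 network in NumPy (the layer sizes are from the slide; the activation function, weight range and random seed are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.uniform(-0.1, 0.1, size=(20, 96))    # 96 symptom inputs -> 20 hidden neurons
W2 = rng.uniform(-0.1, 0.1, size=(10, 20))    # 20 hidden neurons -> 10 disease outputs

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def diagnose(symptoms):                       # symptoms: vector of 96 values
    hidden = sigmoid(W1 @ symptoms)
    return sigmoid(W2 @ hidden)               # one activation per disease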

Output

• 10 neurons, indicative of the diseases:
  – psoriasis, pityriasis rubra pilaris, lichen planus, pityriasis rosea, tinea versicolor, dermatophytosis, cutaneous T-cell lymphoma, secondary syphilis, chronic contact dermatitis, seborrheic dermatitis

Training data

• Input specs of 10 model diseases from 250 patients

• 0.5 if some specific symptom value is not known (see the sketch after this list)

• Trained using standard error backpropagation algorithm
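A small sketch of how unknown symptom values could be encoded as 0.5, as described above; the dictionary format is a hypothetical choice:

import numpy as np

def encode_symptoms(reported, n_inputs=96, unknown=0.5):
    # reported: {input index: measured value}; anything not reported gets 0.5
    x = np.full(n_inputs, unknown)
    for index, value in reported.items():
        x[index] = value
    return x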

Testing
• Previously unused symptom and disease data of 99 patients
• Results:
  – Correct diagnosis achieved for 70% of papulosquamous group skin diseases
  – Success rate above 80% for the remaining diseases, except for psoriasis
  – Psoriasis diagnosed correctly in only 30% of the cases
  – Psoriasis resembles other diseases within the papulosquamous group of diseases, and is somewhat difficult even for specialists to recognise.

Explanation capability

• Rule-based systems reveal the explicit path of reasoning through textual statements

• Connectionist expert systems reach conclusions through complex, non-linear and simultaneous interaction of many units

• Analysing the effect of a single input or a single group of inputs would be difficult and would yield incorrect results

Explanation contd.

• The hidden layer re-represents the data
• Outputs of hidden neurons are neither symptoms nor decisions

Figure: Explanation of dermatophytosis diagnosis using the DESKNET expert system.

[The figure shows the symptom and parameter inputs (duration of lesions in weeks, minimal itching, positive KOH test, lesions located on feet, minimal increase in pigmentation, positive test for pseudohyphae and spores, and a bias node), the internal representation formed by the hidden neurons, and the disease-diagnosis output nodes, including node 5 (dermatophytosis), node 0 (psoriasis) and node 9 (seborrheic dermatitis), together with the connection weights that contribute to the diagnosis.]

Discussion

• Symptoms and parameters contributing to the diagnosis are found from the n/w

• Standard deviation, mean and other tests of significance are used to arrive at the importance of the contributing parameters

• The n/w acts as an apprentice to the expert

Exercise

• Find the weakest condition for symmetry breaking. It is not the case that the network faces the symmetry problem only when ALL the weights are equal.

Logic

Logic and inferencing

[Figure: application areas of AI (Vision, NLP, Expert Systems, Planning, Robotics) built on the core capabilities of Search, Reasoning, Learning and Knowledge.]

Obtaining implications of given facts and rules -- the hallmark of intelligence

Inferencing through:

− Deduction (general to specific)
− Induction (specific to general)
− Abduction (conclusion to hypothesis, in the absence of any evidence to the contrary)

Deduction

Given:    All men are mortal (rule)
          Shakespeare is a man (fact)
To prove: Shakespeare is mortal (inference)

Induction

Given:    Shakespeare is mortal
          Newton is mortal          (observations)
          Dijkstra is mortal
To prove: All men are mortal (generalization)

If there is rain, then there will be no picnic

Fact 1:   There was rain
Conclude: There was no picnic     (Deduction)

Fact 2:   There was no picnic
Conclude: There was no rain (?)

Induction and abduction are fallible forms of reasoning; their conclusions are susceptible to retraction.

Two systems of logic

1) Propositional calculus
2) Predicate calculus

Propositions

− Stand for facts/assertions
− Declarative statements
  − As opposed to interrogative statements (questions) or imperative statements (requests, orders)

Operators

• AND (∧), OR (∨), NOT (¬), IMPLICATION (⇒)

• ⇒ and ¬ form a minimal set (can express the other operations) - Prove it. (A small sketch below checks this by truth table.)

• Tautologies are formulae whose truth value is always T, whatever the assignment is.
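A short sketch (illustrative) of the minimal-set claim, checking by truth table that AND and OR can be rewritten using only ⇒ and ¬:

from itertools import product

def implies(p, q):
    return (not p) or q          # p => q

for p, q in product([False, True], repeat=2):
    assert (p and q) == (not implies(p, not q))   # p AND q  ==  ¬(p => ¬q)
    assert (p or q) == implies(not p, q)          # p OR q   ==  (¬p => q)
print("AND and OR expressed using only => and ¬")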

Model

In propositional calculus, any formula with n propositions has 2^n models (assignments).

- Tautologies evaluate to T in all models.

Examples:

1) P ∨ ¬P

2) ¬(P ∧ Q) ≡ (¬P ∨ ¬Q)    (De Morgan with AND)
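A brute-force sketch (illustrative) that checks a formula in all 2^n models, applied to the two examples above:

from itertools import product

def is_tautology(formula, n_vars):
    # A tautology must evaluate to True in every one of the 2**n models.
    return all(formula(*values) for values in product([False, True], repeat=n_vars))

print(is_tautology(lambda p: p or not p, 1))                                     # Example 1
print(is_tautology(lambda p, q: (not (p and q)) == ((not p) or (not q)), 2))     # Example 2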

Semantic Tree/Tableau method of proving tautology

Start with the negation of the formula

Example 1: to show that (P ∨ Q) ⇒ (P ∨ Q) is a tautology, start with its negation and decompose:

¬[(P ∨ Q) ⇒ (P ∨ Q)]   - α-formula: gives (P ∨ Q) and ¬(P ∨ Q)
(P ∨ Q)                - β-formula: branches into p | q
¬(P ∨ Q)               - α-formula: gives ¬p, ¬q

The branch on p meets ¬p and the branch on q meets ¬q, so every path closes with a contradiction and the formula is a tautology.

Example 2: to show that [A ⇒ (B ∨ C)] ⇒ [(A ⇒ B) ∨ (A ⇒ C)] is a tautology, start with its negation and decompose:

¬{[A ⇒ (B ∨ C)] ⇒ [(A ⇒ B) ∨ (A ⇒ C)]}   - α-formula: gives the next two lines
A ⇒ (B ∨ C)                               - β-formula: branches into ¬A | (B ∨ C)
¬[(A ⇒ B) ∨ (A ⇒ C)]                      - α-formula: gives ¬(A ⇒ B) and ¬(A ⇒ C)
¬(A ⇒ B)                                  - α-formula: gives A, ¬B
¬(A ⇒ C)                                  - α-formula: gives A, ¬C

The ¬A branch contradicts A; the (B ∨ C) branch (a β-formula) splits into B, which contradicts ¬B, and C, which contradicts ¬C. Contradictions appear in all paths, so the formula is a tautology.

A puzzle (Zohar Manna, Mathematical Theory of Computation, 1974)

From Propositional Calculus

Tourist in a country of truth-sayers and liars

• Facts and Rules: In a certain country, people either always speak the truth or always lie. A tourist T comes to a junction in the country and finds an inhabitant S of the country standing there. One of the roads at the junction leads to the capital of the country and the other does not. S can be asked only yes/no questions.

• Question: What single yes/no question can T ask of S, so that the direction of the capital is revealed?

Diagrammatic representation

[Figure: the tourist T stands at a junction with the inhabitant S (who either always tells the truth or always lies); one of the roads leads to the capital.]

Deciding the Propositions: a very difficult step - needs human intelligence

• P: The left road leads to the capital
• Q: S always speaks the truth

Meta Question: What question should the tourist ask?

• The form of the question: very difficult, needs human intelligence
• The tourist should ask:
  – Is R true?
  – The answer is "yes" if and only if the left road leads to the capital
  – The structure of R is to be found as a function of P and Q

A more mechanical part: use of truth table

P   Q   S's Answer   R
T   T   Yes          T
T   F   Yes          F
F   T   No           F
F   F   No           T

Get form of R: quite mechanical

• From the truth table, R is of the form (P x-nor Q), i.e. (P ≡ Q)
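A small sketch that mechanically checks the table and the derived form of R (the boolean encoding is an illustrative assumption):

from itertools import product

for P, Q in product([True, False], repeat=2):
    R = (P == Q)                        # R is of the form P x-nor Q
    answer = R if Q else (not R)        # a liar negates the true value of R
    assert answer == P                  # "yes" exactly when the left road leads to the capital
    print(P, Q, "Yes" if answer else "No", R)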

Get R in English/Hindi/Hebrew…

• Natural Language Generation: non-trivial
• The question the tourist will ask is:
  – Is it true that the left road leads to the capital if and only if you speak the truth?

• Exercise: A more well-known form of this question asked by the tourist uses the X-OR operator instead of the X-NOR. What changes do you have to incorporate into the solution to get that answer?