
Probabilistic Reasoning

CIS 479/579

Bruce R. Maxim

UM-Dearborn


Uncertainty

• Dealing with incomplete and uncertain data is an important part of many AI systems

• Approaches
– Ad Hoc Uncertainty Factors
– Classical Probability Theory
– Fuzzy Set Theory
– Dempster-Shafer Theory of Evidence


Using Probabilistic Reasoning

• Relevant world or domain is random
– more knowledge does not allow us to describe the situation more precisely

• Relevant domain is not random, but we rarely have access to enough data
– experiments are too costly or too dangerous

• Domain is not random, just not described in sufficient detail
– need to get more knowledge into the system


Certainty Factor Questions

• How are certainties associated with rule inputs (e.g. antecedents)?

• How does the rule translate input certainty to output certainty (i.e., how deterministic is the rule)?

• How do you determine the certainty of facts supported by several rules?


Ad Hoc Approach

• The minimum of the values on the interval [0,1] associated with each rule antecedent is the rule’s input certainty

• An attenuation factor (reflecting how deterministic the rule is) is used as a multiplier to map input certainty to output certainty

• When several rules support the same fact, the maximum of the rule output certainties is the overall certainty of the fact, as sketched below
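A minimal Python sketch of these three combination steps (the function names and rule representation are my own, not from the lecture):

```python
def rule_output(antecedent_certainties, attenuation):
    """Input certainty is the minimum antecedent certainty on [0,1];
    the rule's attenuation factor scales it to an output certainty."""
    return min(antecedent_certainties) * attenuation

def fact_certainty(rule_outputs):
    """A fact supported by several rules takes the maximum of the
    individual rule output certainties."""
    return max(rule_outputs)
```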


Ad Hoc Example

[Figure: rule network showing rules A1, A2, and A7 and their certainty factors]

Ad Hoc Example

• Rule A1 translates input to output: (0.9) * (1.0) = 0.9

• Rule A2 translates input to output: (0.25) * (1.0) = 0.25

• Fact supported by A1 and A2: max(0.9, 0.25) = 0.9

• Input to Rule A7: min(0.9, 0.25) = 0.25

• Rule A7 translates input to output: (0.25) * (0.8) = 0.2
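These numbers follow directly from the sketch above:

```python
a1 = rule_output([0.9], 1.0)          # 0.9
a2 = rule_output([0.25], 1.0)         # 0.25
fact = fact_certainty([a1, a2])       # max(0.9, 0.25) = 0.9
a7 = rule_output([0.9, 0.25], 0.8)    # min(0.9, 0.25) * 0.8 = 0.2
```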


Probability Axioms

P(E) = (number of desired outcomes) / (total number of outcomes)
     = |event| / |sample space|

P(not E) = P(~E) = 1 – P(E)


Additive Laws

P(A or B) = P(A ∪ B)
          = P(A) + P(B) – P(A ∩ B)

If A and B are mutually exclusive:

A ∩ B = ∅
P(A ∩ B) = 0
P(A or B) = P(A) + P(B)


Multiplicative Laws

P(A and B) = P(A ∩ B)
           = P(A) * P(B|A)
           = P(B) * P(A|B)

For independent events:

P(B|A) = P(B)
P(A|B) = P(A)
P(A ∩ B) = P(A) * P(B)
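A quick numeric check of both laws, using a single fair die as the sample space (an assumed example, not from the slides):

```python
from fractions import Fraction

space = {1, 2, 3, 4, 5, 6}   # sample space: one fair die
A = {2, 4, 6}                # event: roll is even
B = {4, 5, 6}                # event: roll is greater than 3

def p(event):
    return Fraction(len(event), len(space))

# Additive law: P(A or B) = P(A) + P(B) - P(A and B)
assert p(A | B) == p(A) + p(B) - p(A & B)

# Multiplicative law: P(A and B) = P(A) * P(B|A)
p_b_given_a = Fraction(len(A & B), len(A))
assert p(A & B) == p(A) * p_b_given_a
```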


Bayes Theorem

P(Hi | E)  probability Hi is true given evidence E
P(E | Hi)  probability E is observed given Hi is true
P(Hi)      probability Hi is true regardless of evidence

P(Hi | E) = P(E | Hi) * P(Hi) / Σ (k = 1..n) P(E | Hk) * P(Hk)
          = P(E | Hi) * P(Hi) / P(E)

where the sum in the denominator runs over all hypotheses Hk.


Bayes Example

• Prior probability it will rain: P(H) = 0.8

• Conditional probabilities:

Geese on the lake, given rain tomorrow
P(E | H) = 0.02

Geese on the lake, given no rain tomorrow
P(E | ~H) = 0.025


Bayes Example

• Evidence:
P(E) = P(E | H) * P(H) + P(E | ~H) * P(~H)
     = (0.02)(0.8) + (0.025)(0.2)
     = 0.016 + 0.005 = 0.021

• Posterior probability of rain given geese on the lake:
P(H | E) = P(E | H) * P(H) / P(E) = 0.016 / 0.021 = 0.7619


Bayes Example

• Posterior probability of no rain given geese on the lake:
P(~H | E) = P(E | ~H) * P(~H) / P(E) = 0.005 / 0.021 = 0.2381
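A short Python sketch reproducing these numbers (the variable names are my own):

```python
p_rain = 0.8                 # prior P(H)
p_geese_given_rain = 0.02    # P(E | H)
p_geese_given_dry = 0.025    # P(E | ~H)

# Total probability of the evidence, P(E)
p_geese = (p_geese_given_rain * p_rain
           + p_geese_given_dry * (1 - p_rain))   # 0.021

# Posteriors via Bayes' theorem
p_rain_given_geese = p_geese_given_rain * p_rain / p_geese       # ~0.7619
p_dry_given_geese = p_geese_given_dry * (1 - p_rain) / p_geese   # ~0.2381
```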


Weaknesses of the Bayes Approach

• Difficult to get all the a priori, conditional, and joint probabilities required

• The database of priors is hard to modify because of the large number of interactions

• Lots of calculations required

• Outcomes must be disjoint

• Accuracy depends on a complete set of hypotheses


Problems Which Can Make Use of Probabilistic Inference

• Information available is of varying certainty or completeness

• Need nearly optimal solutions

• Need to justify decisions chosen over alternative decisions

• General rules of inference are known or can be found for the problem


Fuzzy Set Theory

• In ordinary set theory every element “x” from a given universe is either in or out of a set S

x ∈ S or x ∉ S

• In fuzzy set theory set membership is not so easily determined


When is a pile of chalk big?

• If we have three pieces of chalk in the room is that considered a big pile of chalk?

• Some people might say, yes that is a big pile and some would not.

• Somewhere between those three pieces of chalk and a whole room full of chalk, the pile turns from a small pile into a big pile.

• This could be a different spot for different people.


Membership Function

F : [0,1]ⁿ → [0,1]

x ∈ S with membership grade f(x), written f(x)/x in fuzzy set notation; full membership is written 1/x
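A minimal sketch of one possible membership function for "big pile of chalk" (the breakpoints of 3 and 100 pieces are assumed for illustration, not from the lecture):

```python
def big_pile(pieces, small=3, big=100):
    """Piecewise-linear membership grade in [0,1] for 'big pile of chalk':
    0 at or below `small`, 1 at or above `big`, linear in between."""
    if pieces <= small:
        return 0.0
    if pieces >= big:
        return 1.0
    return (pieces - small) / (big - small)

print(big_pile(3), big_pile(50), big_pile(200))   # 0.0 ~0.48 1.0
```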


Expression                    Possibilistic Logic                 Probabilistic Logic
                              (dependent events)                  (independent events)

A                             a                                   a
B                             b                                   b
not A                         1 – a                               1 – a
A and B                       min(a, b)                           a * b
A or B                        max(a, b)                           a + b – a*b
A → B                         max(1 – a, b)                       (1 – a) + a*b
  (i.e., not A or (A and B))
A xor B                       max(min(a, 1 – b), min(1 – a, b))   a + b – 2ab + a²b + ab² – a²b²
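The two columns of the table can be written directly as small Python functions; a sketch (the names are my own), used again in the worked examples below:

```python
# Possibilistic logic (dependent events): min/max combinators
def poss_not(a):        return 1 - a
def poss_and(a, b):     return min(a, b)
def poss_or(a, b):      return max(a, b)
def poss_implies(a, b): return max(1 - a, b)      # not A or (A and B)

# Probabilistic logic (independent events)
def prob_not(a):        return 1 - a
def prob_and(a, b):     return a * b
def prob_or(a, b):      return a + b - a * b
def prob_implies(a, b): return (1 - a) + a * b
```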


Possibilistic Example

Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2

Determine P(X → (Y or Z)):
P(Y or Z) = max(P(Y), P(Z)) = max(0.1, 0.2) = 0.2
P(X → (Y or Z)) = max(1 – P(X), P(Y or Z)) = max(0.5, 0.2) = 0.5


Probabilistic Example

Assume P(X) = 0.5, P(Y) = 0.1, P(Z) = 0.2

Determine P(X → (Y or Z)):
P(Y or Z) = P(Y) + P(Z) – P(Y) * P(Z) = 0.1 + 0.2 – 0.02 = 0.28
P(X → (Y or Z)) = (1 – P(X)) + P(X) * P(Y or Z) = (1 – 0.5) + 0.5 * 0.28 = 0.5 + 0.14 = 0.64
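Using the combinators sketched after the table above, both worked examples follow directly:

```python
x, y, z = 0.5, 0.1, 0.2

print(poss_implies(x, poss_or(y, z)))   # max(0.5, 0.2) = 0.5
print(prob_implies(x, prob_or(y, z)))   # 0.5 + 0.5 * 0.28 = 0.64
```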


Bayesian Inference

[Figure: inference network linking symptoms S1–S4 to hypotheses H1–H5 and conclusion C1]

Bayesian Inference

• Symptoms
S1: Clanking sound
S2: Low pickup
S3: Starting problem
S4: Parts are hard to find

• Conclusion
C1: Repair estimate > $250


Bayesian Inference

• Intermediate Hypotheses
H1: Thrown connecting rod
H2: Wrist pin loose
H3: Car out of tune

• Secondary Hypotheses
H4: Replace or rebuild engine
H5: Tune engine


Bayesian Inference

• These must be known in advance:
P(H1), P(H2), P(H3)
P(S | Hi) for i = 1, 2, 3

• Computed using Bayes formula:
P(S) = Σi P(Hi) * P(S | Hi)
P(Hi | S) for i = 1, 2, 3


Bayesian Inference

• H4: Replace or Rebuild Engine
P(H4) = P(H1 or H2) = max(P(H1 | S), P(H2 | S))

• H5: Tune Engine
P(H5) = P(not (H1 or H2) and H3) = min(1 – max(P(H1 | S), P(H2 | S)), P(H3))

• C1: Repair Estimate > $250
P(C1) = P(H4 or (H5 and S4)) = max(P(H4 | S), min(P(H5 | S), V))
Note: V = 1 if S4 is true and 0 otherwise
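A minimal sketch of this last combination step, assuming the posteriors P(Hi | S) have already been obtained from Bayes' theorem (the numeric inputs below are placeholders, not values from the lecture):

```python
def repair_estimate_over_250(p_h1, p_h2, p_h3, s4_observed):
    """Combine Bayesian posteriors P(Hi | S) with possibilistic min/max logic."""
    p_h4 = max(p_h1, p_h2)                    # H4: replace or rebuild engine
    p_h5 = min(1 - max(p_h1, p_h2), p_h3)     # H5: tune engine
    v = 1.0 if s4_observed else 0.0           # S4: parts are hard to find
    return max(p_h4, min(p_h5, v))            # C1: repair estimate > $250

# Placeholder posteriors, purely illustrative
print(repair_estimate_over_250(0.7, 0.1, 0.5, s4_observed=True))   # 0.7
```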