CSC 8520 Spring 2010. Paula Matuszek
CS 8520: Artificial Intelligence
Logical Agents and First Order Logic
Paula Matuszek, Spring 2010
Outline
• Knowledge-based agents
• Wumpus world
• Logic in general - models and entailment
• Propositional (Boolean) logic
• First Order Logic
• Inference rules and theorem proving
– forward chaining
– backward chaining
– resolution
Knowledge bases
• Knowledge base = set of sentences in a formal language
• Declarative approach to building an agent (or other system):
– Tell it what it needs to know
• Then it can Ask itself what to do - answers should follow from the KB
• Agents can be viewed at the knowledge level
– i.e., what they know, regardless of how implemented
• Or at the implementation level
– i.e., data structures in KB and algorithms that manipulate them
A simple knowledge-based agent
Simple Knowledge-Based Agent
• This agent tells the KB what it sees, asks the KB what to do, and tells the KB what it has done (or is about to do).
• The agent must be able to:
– Represent states, actions, etc.
– Incorporate new percepts
– Update internal representations of the world
– Deduce hidden properties of the world
– Deduce appropriate actions
Wumpus World PEAS Description
• Performance measure
– gold +1000, death -1000
– -1 per step, -10 for using the arrow
• Environment
– Squares adjacent to wumpus are smelly
– Squares adjacent to pit are breezy
– Glitter iff gold is in the same square
– Shooting kills wumpus if you are facing it
– Shooting uses up the only arrow
– Grabbing picks up gold if in same square
– Releasing drops the gold in same square
• Sensors: Stench, Breeze, Glitter, Bump, Scream
• Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Wumpus world characterization
• Fully Observable? No – only local perception
• Deterministic? Yes – outcomes exactly specified
• Static? Yes – Wumpus and Pits do not move
• Discrete? Yes
• Episodic? No – sequential at the level of actions
• Single-agent? Yes – The wumpus itself is essentially a natural feature, not another agent
Exploring a Wumpus world
Directly observed:
S: stench
B: breeze
G: glitter
A: agent
Inferred (mostly):
OK: safe square
P: pit
W: wumpus
Exploring a wumpus world
In 1,1 we don’t get B or S, so we know 1,2 and 2,1 are safe. Move to 1,2.
In 1,2 we feel a breeze.
Exploring a wumpus world
In 1,2 we feel a breeze. So we know there is a pit in 1,3 or 2,2.
Exploring a wumpus world
So go back to 1,1 then to 2,1 where we smell a stench.
Exploring a wumpus world
Stench in 2,1, so the wumpus is in 3,1 or 2,2.
We don't smell a stench in 1,2, so 2,2 can't be the wumpus, so 3,1 must be the wumpus.
Exploring a wumpus world
We don't feel a breeze in 2,1, so 2,2 can't be a pit, so 1,3 must be a pit.
2,2 has neither pit nor wumpus and is therefore okay.
Exploring a wumpus world
We move to 2,2. We don’t get any sensory input.
Exploring a wumpus world
So we know that 2,3 and 3,2 are ok.
Exploring a wumpus world
Move to 3,2, where we observe stench, breeze and glitter!
We have found the gold and won.
Do we want to kill the wumpus? (hint: look at the scoring function)
Representing Knowledge
• The agent that solves the wumpus world can most effectively be implemented by a knowledge-based approach
• Need to represent states and actions, update internal representations, deduce hidden properties and appropriate actions
• Need a formal representation for the KB
• And a way to reason about that representation
Logic in general
• Logics are formal languages for representing information such that conclusions can be drawn
• Syntax defines the sentences in the language
• Semantics define the "meaning" of sentences;
– i.e., define truth of a sentence in a world
• E.g., the language of arithmetic
• x+2 ≥ y is a sentence; x2+y > {} is not a sentence
– x+2 ≥ y is true iff the number x+2 is no less than the number y.
– x+2 ≥ y is true in a world where x = 7, y = 1
– x+2 ≥ y is false in a world where x = 0, y = 6
Entailment
• Entailment means that one thing follows from another.
• Knowledge base KB entails sentence S if and only if S is true in all worlds where KB is true
– E.g., the KB containing “the Giants won” and “the Reds won” entails “Either the Giants won or the Reds won”
– E.g., x+y = 4 entails 4 = x+y
– Entailment is a relationship between sentences (i.e., syntax) that is based on semantics
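This definition can be checked mechanically for propositional sentences: enumerate every assignment of truth values and verify that the query holds in each model where the KB holds. A minimal sketch in Python; the function and symbol names are illustrative, not from the slides:

```python
from itertools import product

def entails(kb, query, symbols):
    """KB entails query iff query is true in every model where KB is true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False   # found a model of KB in which the query fails
    return True

# KB: "the Giants won" AND "the Reds won"; query: "Giants won OR Reds won"
kb = lambda m: m["giants"] and m["reds"]
query = lambda m: m["giants"] or m["reds"]
print(entails(kb, query, ["giants", "reds"]))  # True
```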
Entailment in the wumpus world
• Situation after detecting nothing in [1,1], moving right, breeze in [2,1]
• Consider possible models for KB assuming only pits: there are 3 Boolean choices ⇒ 8 possible models (ignoring sensory data)
Wumpus models for pits
Wumpus models
• KB = wumpus-world rules + observations. Only the three models in red are consistent with KB.
Wumpus Models
KB plus hypothesis S1 that the pit is not in [1,2]. KB is the solid red boundary; S1 is the dotted yellow boundary. KB is contained within S1, so KB entails S1: in every model in which KB is true, so is S1. We can conclude that the pit is not in [1,2].
KB plus hypothesis S2 that the pit is not in [2,2]. KB is the solid red boundary; S2 is the dotted brown boundary. KB is not contained within S2, so KB does not entail S2; nor does S2 entail KB. So we can't conclude anything about the truth of S2 given KB.
Inference
• KB ⊢i S means sentence S can be derived from KB by procedure i
• Soundness: i is sound if it derives only sentences S that are entailed by KB
• Completeness: i is complete if it derives all sentences S that are entailed by KB.
• Logic:
– Has a sound and complete inference procedure
– Which will answer any question whose answer follows from what is known by the KB.
Propositional logic: Syntax
• Propositional logic is the simplest logic – illustrates basic ideas
• The proposition symbols P1, P2 etc. are sentences
– If S is a sentence, ¬S is a sentence (negation)
– If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
– If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
– If S1 and S2 are sentences, S1 ⇒ S2 is a sentence (implication)
– If S1 and S2 are sentences, S1 ⇔ S2 is a sentence (biconditional)
Grounding
• We must also be aware of the issue of grounding: the connection between our KB and the real world.
– Straightforward for Wumpus or our adventure games
– Much more difficult if we are reasoning about real situations
– Real problems are seldom perfectly grounded, because we must ignore details.
– Is the connection good enough to get useful answers?
Wumpus world sentences
• Let Pi,j be true if there is a pit in [i, j].
• Let Bi,j be true if there is a breeze in [i, j].
– ¬P1,1
– ¬B1,1
– B2,1
• "Pits cause breezes in adjacent squares"
– B1,1 ⇔ (P1,2 ∨ P2,1)
– B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
Inference
• Given some sentences, how do we get to new sentences?
• Inference! Preferably sound and complete inference.
• Example above is model-based inference
– enumerate all possible models
– check that a proposed sentence is true in every model in which the KB is true
– sound, complete, slow
Inference by enumeration
• Depth-first enumeration of all models is sound and complete
• With n symbols, time complexity is O(2^n), space complexity is O(n)
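The depth-first version keeps only one partial model on the recursion stack, which is where the O(n) space bound comes from. A hedged sketch (the names are mine, not the slides'):

```python
def tt_entails(kb, query, symbols, model=None):
    """Depth-first model enumeration: O(2^n) time, O(n) space."""
    model = model or {}
    if not symbols:
        # KB false in this model: vacuously satisfied; otherwise check query
        return query(model) if kb(model) else True
    first, rest = symbols[0], symbols[1:]
    return (tt_entails(kb, query, rest, {**model, first: True}) and
            tt_entails(kb, query, rest, {**model, first: False}))

# Modus ponens as a tiny check: (p => q) AND p entails q
kb = lambda m: ((not m["p"]) or m["q"]) and m["p"]
print(tt_entails(kb, lambda m: m["q"], ["p", "q"]))  # True
```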
Inference-based agents in the wumpus world
• A wumpus-world agent using propositional logic:
– ¬P1,1
– ¬W1,1
– Bx,y ⇔ (Px,y+1 ∨ Px,y-1 ∨ Px+1,y ∨ Px-1,y)
– Sx,y ⇔ (Wx,y+1 ∨ Wx,y-1 ∨ Wx+1,y ∨ Wx-1,y)
– W1,1 ∨ W1,2 ∨ … ∨ W4,4
– ¬W1,1 ∨ ¬W1,2
– ¬W1,1 ∨ ¬W1,3
– …
• ⇒ 64 distinct proposition symbols, 155 sentences
Inference by Theorem Proving
• Rather than enumerate all possible models, we can apply inference rules to the current sentences in the KB to derive new sentences
• A proof consists of a chain of rules beginning with known sentences (premises) and ending with the sentence we want to prove (conclusion)
Inference rules in propositional logic
• Here are just a few of the rules you can apply when reasoning in propositional logic:
From aima.eecs.berkeley.edu/slides-ppt, chs 7-9
Implication elimination
• A particularly important rule allows you to get rid of the implication operator, ⇒:
X ⇒ Y ≡ ¬X ∨ Y
Weds ⇒ AI class ≡ AI class ∨ ¬Weds
• The symbol ≡ means “is logically equivalent to”
• We will use this later on as a necessary tool for simplifying logical expressions
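The equivalence is easy to confirm by checking all four rows of the truth table; a small sketch (the function names are illustrative):

```python
from itertools import product

def implies(x, y):      # material implication X => Y
    return not (x and not y)

def eliminated(x, y):   # the rewritten form: (not X) or Y
    return (not x) or y

# same truth value in every one of the four rows of the truth table
assert all(implies(x, y) == eliminated(x, y)
           for x, y in product([True, False], repeat=2))
print("X => Y is equivalent to (not X) or Y in all models")
```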
Conjunction elimination
• Another important rule for simplifying logical expressions allows you to get rid of the conjunction (and) operator, ∧:
• This rule simply says that if you have an and operator at the top level of a fact (logical expression), you can break the expression up into two separate facts:
– Breeze2,1 ∧ Breeze1,2
• becomes:
– Breeze2,1
– Breeze1,2
Inference by Theorem Proving
• Given:
– A set of sentences in the KB: the premises
– A sentence to be proved: the conclusion
– A set of sound rules
• Apply a rule to an appropriate sentence
• Add the resulting sentence to the KB
• Stop if we have added the conclusion to the KB
• Better than model-based inference if there are many models and short proofs
Pros of propositional logic
• Declarative
• Allows partial/disjunctive/negated information
• Compositional: meaning of B1,1 ∧ P1,2 is derived from meaning of B1,1 and of P1,2
• Meaning in propositional logic is context-independent (unlike natural language, where meaning depends on context)
• Has a sound, complete inference procedure
Cons of Propositional Logic
• Very limited expressive power (unlike natural language)
• Rapid proliferation of clauses
– For instance, the Wumpus KB contains "physics" sentences for every single square; e.g., we cannot say "pits cause breezes in adjacent squares" except by writing one sentence for each square
• Inference is very bushy; not feasible for any but very small cases
Summary
• Logical agents apply inference to a knowledge base to derive new information and make decisions
• Basic concepts of logic:
– syntax: formal structure of sentences
– semantics: truth of sentences wrt models
– entailment: necessary truth of one sentence given another
– inference: deriving sentences from other sentences
– soundness: derivations produce only entailed sentences
– completeness: derivations can produce all entailed sentences
• Propositional logic lacks expressive power
First-order logic
• Whereas propositional logic assumes the world contains facts,
• first-order logic (like natural language) assumes the world contains
– Objects: people, houses, numbers, colors, baseball games, wars, …
– Relations: red, round, prime, brother of, bigger than, part of, comes between, …
– Functions: father of, best friend, one more than, plus, …
Syntax of FOL: Basic elements
• Constants: KingJohn, 2, Villanova, ...
• Predicates: Brother, >, ...
• Functions: Sqrt, LeftLegOf, ...
• Variables: x, y, a, b, ...
• Connectives: ¬, ∧, ∨, ⇒, ⇔
• Equality: =
• Quantifiers: ∀, ∃
Constants, functions, and predicates
• A constant represents a “thing”--it has no truth value, and it does not occur “bare” in a logical expression
– Examples: PaulaMatuszek, 5, Earth, goodIdea
• Given zero or more arguments, a function produces a constant as its value:
– Examples: studentof(PaulaMatuszek), add(2, 2), thisPlanet()
• A predicate is like a function, but produces a truth value
– Examples: aiInstructor(PaulaMatuszek), isPlanet(Earth), greater(3, add(2, 2))
Universal quantification
• The universal quantifier, ∀, is read as “for each” or “for every”
– Example: ∀x, x² ≥ 0 (for all x, x² is greater than or equal to zero)
• Typically, ⇒ is the main connective with ∀:
∀x, at(x,Villanova) ⇒ smart(x)
means “Everyone at Villanova is smart”
• Common mistake: using ∧ as the main connective with ∀:
∀x, at(x,Villanova) ∧ smart(x)
means “Everyone is at Villanova and everyone is smart”
• If there are no values satisfying the condition, the result is true
– Example: ∀x, isPersonFromMars(x) ⇒ smart(x) is true
Existential quantification
• The existential quantifier, ∃, is read “for some” or “there exists”
– E.g.: ∃x, x² < 0 (there exists an x such that x² is less than zero)
• Typically, ∧ is the main connective with ∃:
∃x, at(x,Villanova) ∧ smart(x)
means “There is someone who is at Villanova and is smart”
• Common mistake: using ⇒ as the main connective with ∃:
∃x, at(x,Villanova) ⇒ smart(x)
This is true if there is someone at Villanova who is smart...
...but it is also true if there is someone who is not at Villanova
By the rules of material implication, the result of F ⇒ T is T
Properties of quantifiers
• ∀x ∀y is the same as ∀y ∀x
• ∃x ∃y is the same as ∃y ∃x
• ∃x ∀y is not the same as ∀y ∃x
• ∃x ∀y Loves(x,y)
– “There is a person who loves everyone in the world”
– More exactly: ∃x (person(x) ∧ ∀y (person(y) ⇒ Loves(x,y)))
• ∀y ∃x Loves(x,y)
– “Everyone in the world is loved by at least one person”
From aima.eecs.berkeley.edu/slides-ppt, chs 7-9
More rules
• Quantifier duality: each can be expressed using the other
∀x Likes(x,IceCream) ≡ ¬∃x ¬Likes(x,IceCream)
∃x Likes(x,Broccoli) ≡ ¬∀x ¬Likes(x,Broccoli)
• Two additional important rules:
¬∀x, p(x) ⇒ ∃x, ¬p(x)
“If not every x satisfies p(x), then there exists an x that does not satisfy p(x)”
¬∃x, p(x) ⇒ ∀x, ¬p(x)
“If there does not exist an x that satisfies p(x), then all x do not satisfy p(x)”
Terms and Atomic sentences
• A term is a logical expression that refers to an object.
– Book(Andrew). Andrew’s book.
– Textbook(8520). Textbook for 8520.
• An atomic sentence states a fact.
– Student(Andrew).
– Student(Andrew, Paula).
– Student(Andrew, AI).
– Note that the interpretation of these is different; it depends on how we consider them to be grounded.
Complex sentences
• Complex sentences are made from atomic sentences using connectives
• ¬S, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2
• E.g. Sibling(KingJohn,Richard) ⇒ Sibling(Richard,KingJohn)
Truth in first-order logic
• Sentences are true with respect to a model and an interpretation
• Model contains objects (domain elements) and relations among them
• Interpretation specifies referents for
– constant symbols → objects
– predicate symbols → relations
– function symbols → functional relations
• An atomic sentence predicate(term1,...,termn) is true iff the objects referred to by term1,...,termn are in the relation referred to by predicate
Knowledge base for the wumpus world
• Perception
– ∀t,s,b Percept([s,b,Glitter],t) ⇒ Glitter(t)
• Reflex
– ∀t Glitter(t) ⇒ BestAction(Grab,t)
Deducing hidden properties
• ∀x,y,a,b Adjacent([x,y],[a,b]) ⇔ (x=a ∧ (y=b-1 ∨ y=b+1)) ∨ (y=b ∧ (x=a-1 ∨ x=a+1))
• Properties of squares:
– ∀s,t At(Agent,s,t) ∧ Breeze(t) ⇒ Breezy(s)
• Squares are breezy near a pit:
– Diagnostic rule --- infer cause from effect:
∀s Breezy(s) ⇔ ∃r Adjacent(r,s) ∧ Pit(r)
– Causal rule --- infer effect from cause:
∀r Pit(r) ⇒ [∀s Adjacent(r,s) ⇒ Breezy(s)]
Knowledge engineering in FOL
• Identify the task
• Assemble the relevant knowledge: knowledge acquisition
• Decide on a vocabulary of predicates, functions, and constants: an ontology
• Encode general knowledge about the domain
• Encode a description of the specific problem instance
• Pose queries to the inference procedure and get answers
• Debug the knowledge base
Summary
• First-order logic:
– objects and relations are semantic primitives
– syntax: constants, functions, predicates, equality, quantifiers
• Increased expressive power: sufficient to define the wumpus world compactly
Inference
• When we have all this knowledge we want to do something with it
• Typically, we want to infer new knowledge
– An appropriate action to take
– Additional information for the Knowledge Base
• Some typical forms of inference include
– Forward chaining
– Backward chaining
– Resolution
Example knowledge base
• The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
• Prove that Col. West is a criminal
Example knowledge base contd.
... it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
Nono … has some missiles:
Owns(Nono,M1) ∧ Missile(M1)
… all of its missiles were sold to it by Colonel West:
Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
Missiles are weapons:
Missile(x) ⇒ Weapon(x)
An enemy of America counts as "hostile":
Enemy(x,America) ⇒ Hostile(x)
West, who is American:
American(West)
The country Nono, an enemy of America:
Enemy(Nono,America)
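A forward-chaining pass over this KB can be sketched in a few lines of Python. To keep it short, the definite clauses are written already ground on the constants West, Nono, M1, and America (a full first-order forward chainer would instead find these bindings by unification); the tuple encoding is my own:

```python
# Ground facts as predicate tuples: (predicate, arg1, arg2, ...)
facts = {("Owns", "Nono", "M1"), ("Missile", "M1"),
         ("American", "West"), ("Enemy", "Nono", "America")}

rules = [  # (premises, conclusion), hand-grounded definite clauses
    ([("Missile", "M1"), ("Owns", "Nono", "M1")], ("Sells", "West", "M1", "Nono")),
    ([("Missile", "M1")], ("Weapon", "M1")),
    ([("Enemy", "Nono", "America")], ("Hostile", "Nono")),
    ([("American", "West"), ("Weapon", "M1"),
      ("Sells", "West", "M1", "Nono"), ("Hostile", "Nono")],
     ("Criminal", "West")),
]

changed = True
while changed:               # repeat until no rule adds a new fact
    changed = False
    for premises, conclusion in rules:
        if all(p in facts for p in premises) and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(("Criminal", "West") in facts)  # True
```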
Forward chaining proof
Properties of forward chaining
• Sound and complete for first-order definite clauses
• Datalog = first-order definite clauses + no functions
• FC terminates for Datalog in a finite number of iterations
• May not terminate in general if α is not entailed
• This is unavoidable: entailment with definite clauses is semidecidable
Efficiency of forward chaining
• Incremental forward chaining: no need to match a rule on iteration k if a premise wasn't added on iteration k-1
– ⇒ match each rule whose premise contains a newly added positive literal
• Matching itself can be expensive
• Database indexing allows O(1) retrieval of known facts
– e.g., query Missile(x) retrieves Missile(M1)
• Forward chaining is widely used in deductive databases
Backward chaining example
Properties of backward chaining
• Depth-first recursive proof search: space is linear in size of proof
• Incomplete due to infinite loops
– ⇒ fix by checking current goal against every goal on stack
• Inefficient due to repeated subgoals (both success and failure)
– ⇒ fix using caching of previous results (extra space)
• Widely used for logic programming
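Backward chaining over the same (hand-grounded) crime KB is a depth-first search from the goal back to known facts. A minimal sketch with my own tuple encoding; note it deliberately omits the loop check and result caching described above:

```python
facts = {("Owns", "Nono", "M1"), ("Missile", "M1"),
         ("American", "West"), ("Enemy", "Nono", "America")}

rules = [  # (premises, conclusion), hand-grounded definite clauses
    ([("Missile", "M1"), ("Owns", "Nono", "M1")], ("Sells", "West", "M1", "Nono")),
    ([("Missile", "M1")], ("Weapon", "M1")),
    ([("Enemy", "Nono", "America")], ("Hostile", "Nono")),
    ([("American", "West"), ("Weapon", "M1"),
      ("Sells", "West", "M1", "Nono"), ("Hostile", "Nono")],
     ("Criminal", "West")),
]

def bc(goal):
    """Prove goal as a known fact, or via a rule whose premises all prove."""
    if goal in facts:
        return True
    return any(conclusion == goal and all(bc(p) for p in premises)
               for premises, conclusion in rules)

print(bc(("Criminal", "West")))  # True
```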
Forward vs. backward chaining
• FC is data-driven, automatic, unconscious processing
– e.g., object recognition, routine decisions
• May do lots of work that is irrelevant to the goal
• BC is goal-driven, appropriate for problem-solving
– e.g., Where are my keys? How do I get into a PhD program?
• Complexity of BC can be much less than linear in size of KB
• Choice may depend on whether you are likely to have many goals or lots of data.
Resolution
Inference is Expensive!
• You start with a large collection of facts (predicates)
• and a large collection of possible transformations (rules)
– Some of these rules apply to a single fact to yield a new fact
– Some of these rules apply to a pair of facts to yield a new fact
• So at every step you must:
– Choose some rule to apply
– Choose one or two facts to which you might be able to apply the rule
– Add the new fact to your ever-expanding fact base
• If there are n facts
– There are n potential ways to apply a single-operand rule
– There are n * (n - 1) potential ways to apply a two-operand rule
• The search space is huge!
The magic of resolution
• Here’s how resolution works:
– You transform each of your facts into a particular form, called a clause
– You apply a single rule, the resolution principle, to a pair of clauses
• Clauses are closed with respect to resolution--that is, when you resolve two clauses, you get a new clause
• You add the new clause to your fact base
• So the number of facts you have grows linearly
– You still have to choose a pair of facts to resolve
– You never have to choose a rule, because there’s only one
The fact base
• A fact base is a collection of “facts,” expressed in predicate calculus, that are presumed to be true (valid)
• These facts are implicitly “anded” together
• Example fact base:
– seafood(X) ⇒ likes(John, X) (where X is a variable)
– seafood(shrimp)
– pasta(X) ⇒ likes(Mary, X) (where X is a different variable)
– pasta(spaghetti)
• That is,
– (seafood(X) ⇒ likes(John, X)) ∧ seafood(shrimp) ∧ (pasta(Y) ⇒ likes(Mary, Y)) ∧ pasta(spaghetti)
– Notice that we had to change some Xs to Ys
– The scope of a variable is the single fact in which it occurs
Clause form
• A clause is a disjunction ("or") of zero or more literals, some or all of which may be negated
• Example: sinks(X) ∨ dissolves(X, water) ∨ ¬denser(X, water)
• Notice that clauses use only “or” and “not”—they do not use “and,” “implies,” or either of the quantifiers “for all” or “there exists”
• The impressive part is that any predicate calculus expression can be put into clause form
– Existential quantifiers, ∃, are the trickiest ones
• And any FOL expression can be expressed as a conjunction of clauses: Conjunctive Normal Form
Unification
• From the pair of facts (not yet clauses, just facts):
– seafood(X) ⇒ likes(John, X) (where X is a variable)
– seafood(shrimp)
• We ought to be able to conclude
– likes(John, shrimp)
• We can do this by unifying the variable X with the constant shrimp
– This is the same “unification” as is done in Prolog
• This unification turns seafood(X) ⇒ likes(John, X) into seafood(shrimp) ⇒ likes(John, shrimp)
• Together with the given fact seafood(shrimp), the final deductive step is easy
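Unification itself can be sketched recursively: walk the two terms in parallel, binding variables as you go. In this illustrative encoding of mine, compound terms are tuples, variables are strings starting with '?', and the occurs check is omitted for brevity:

```python
def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def unify(x, y, s):
    """Return a substitution dict making x and y identical, or None on failure."""
    if s is None:
        return None                      # an earlier sub-unification failed
    if x == y:
        return s
    if is_var(x):
        return unify_var(x, y, s)
    if is_var(y):
        return unify_var(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):         # unify argument lists element-wise
            s = unify(xi, yi, s)
        return s
    return None                          # mismatched constants or arities

def unify_var(v, t, s):
    if v in s:
        return unify(s[v], t, s)         # follow an existing binding
    return {**s, v: t}                   # bind the variable (no occurs check)

# seafood(X) unified with seafood(shrimp) binds X to shrimp
print(unify(("seafood", "?X"), ("seafood", "shrimp"), {}))  # {'?X': 'shrimp'}
```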
The resolution principle
• Here it is:
– From X ∨ someLiterals
and ¬X ∨ someOtherLiterals
----------------------------------------------
conclude: someLiterals ∨ someOtherLiterals
• That’s all there is to it! X and ¬X are complementary literals; someLiterals ∨ someOtherLiterals is the resolvent clause
• Example:
– broke(Bob) ∨ well-fed(Bob)
¬broke(Bob) ∨ ¬hungry(Bob)
--------------------------------------
well-fed(Bob) ∨ ¬hungry(Bob)
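A single resolution step can be sketched over propositional clauses represented as sets of literal strings, with '~' marking negation (an encoding of my own, not from the slides):

```python
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Yield each resolvent obtainable from clauses c1 and c2."""
    for lit in c1:
        if negate(lit) in c2:                       # complementary pair found
            yield (c1 - {lit}) | (c2 - {negate(lit)})

c1 = frozenset({"broke(Bob)", "well-fed(Bob)"})
c2 = frozenset({"~broke(Bob)", "~hungry(Bob)"})
for r in resolve(c1, c2):
    print(sorted(r))   # the resolvent: well-fed(Bob) OR not-hungry(Bob)
```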
A common error
• You can only do one resolution at a time
• Example:
– broke(Bob) ∨ well-fed(Bob) ∨ happy(Bob)
¬broke(Bob) ∨ ¬hungry(Bob) ∨ ¬happy(Bob)
• You can resolve on broke to get:
– well-fed(Bob) ∨ happy(Bob) ∨ ¬hungry(Bob) ∨ ¬happy(Bob) ≡ T
• Or you can resolve on happy to get:
– broke(Bob) ∨ well-fed(Bob) ∨ ¬broke(Bob) ∨ ¬hungry(Bob) ≡ T
• Note that both legal resolutions yield a tautology (a trivially true statement, containing X ∨ ¬X), which is correct but useless
• But you cannot resolve on both at once to get:
– well-fed(Bob) ∨ ¬hungry(Bob)
Contradiction
• A special case occurs when the result of a resolution (the resolvent) is empty, or “NIL”
• Example:
– hungry(Bob)
¬hungry(Bob)
----------------
NIL
• In this case, the fact base is inconsistent
• This will turn out to be a very useful observation in doing resolution theorem proving
A first example
• “Everywhere that John goes, Rover goes. John is at school.”
– at(John, X) ⇒ at(Rover, X) (not yet in clause form)
– at(John, school) (already in clause form)
• We use implication elimination to change the first of these into clause form:
– ¬at(John, X) ∨ at(Rover, X)
– at(John, school)
• We can resolve these on at(-, -), but to do so we have to unify X with school; this gives:
– at(Rover, school)
Refutation resolution
• The previous example was easy because it had very few clauses
• When we have a lot of clauses, we want to focus our search on the thing we would like to prove
• We can do this as follows:
– Assume that our fact base is consistent (we can’t derive NIL)
– Add the negation of the thing we want to prove to the fact base
– Show that the fact base is now inconsistent
– Conclude the thing we want to prove
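That recipe turns directly into a saturation loop: add the negated goal, resolve pairs of clauses until the empty clause (NIL) appears or nothing new can be derived. A propositional sketch under my own '~' encoding for negated literals:

```python
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

def refutes(clauses, goal):
    """True iff clauses plus the negated goal literal derive the empty clause."""
    kb = set(clauses) | {frozenset({negate(goal)})}
    while True:
        new = {r for a in kb for b in kb if a != b for r in resolvents(a, b)}
        if frozenset() in new:
            return True          # NIL: inconsistent, so the goal is proved
        if new <= kb:
            return False         # saturated without contradiction
        kb |= new

# The "I stay dry" example from the next slides, in clause form
clauses = [frozenset({"rain", "sunny"}),
           frozenset({"~sunny", "dry"}),
           frozenset({"~rain", "umbrella"}),
           frozenset({"~umbrella", "dry"})]
print(refutes(clauses, "dry"))  # True
```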
Example of refutation resolution
“Everywhere that John goes, Rover goes. John is at school. Prove that Rover is at school.”
1. ¬at(John, X) ∨ at(Rover, X)
2. at(John, school)
3. ¬at(Rover, school) (this is the added clause)
Resolve #1 and #3, unifying X with school:
4. ¬at(John, school)
Resolve #2 and #4:
5. NIL
Conclude the negation of the added clause: at(Rover, school)
This seems a roundabout approach for such a simple example, but it works well for larger problems
A second example
Start with:
1. it_is_raining ∨ it_is_sunny
2. it_is_sunny ⇒ I_stay_dry
3. it_is_raining ⇒ I_take_umbrella
4. I_take_umbrella ⇒ I_stay_dry
Convert to clause form:
1. it_is_raining ∨ it_is_sunny
2. ¬it_is_sunny ∨ I_stay_dry
3. ¬it_is_raining ∨ I_take_umbrella
4. ¬I_take_umbrella ∨ I_stay_dry
Prove that I stay dry by adding the negated goal:
5. ¬I_stay_dry
Proof:
6. (5, 2) ¬it_is_sunny
7. (6, 1) it_is_raining
8. (5, 4) ¬I_take_umbrella
9. (8, 3) ¬it_is_raining
10. (9, 7) NIL
Therefore, ¬(¬I_stay_dry), i.e., I_stay_dry
Resolution Summary
• Resolution is a single inference rule which is both sound and complete
• Every sentence in the KB is represented in clause form; any sentence in FOL can be reduced to this clause form.
• Two sentences can be resolved if one contains a positive literal and the other contains a matching negative literal; the result is a new fact which is the disjunction of the remaining literals in both.
• Resolution can be used as a relatively efficient theorem prover by adding to the KB the negation of the sentence to be proved and attempting to derive a contradiction
Summary
• Inference is the process of adding information to the knowledge base
• We want inference to be sound: what we add is true if the KB is true
• And complete: if the KB entails a sentence, we can derive it.
• Forward chaining, backward chaining, and resolution are typical inference mechanisms for first-order logic.