

Levels of help, levels of delegation and agent modeling

Cristiano Castelfranchi, Rino Falcone

IP-CNR, Group of "Artificial Intelligence, Cognitive Modeling and Interaction", Viale Marx, 15 - 00137 ROMA - Italy

E-mail: {cris, falcone}@pscs2.irmkant.rm.cnr.it

Introduction

The huge majority of DAI and MA, CSCW and negotiation systems, communication protocols, cooperative software agents, etc. are based on the idea that cooperation works through the allocation of some task (or sub-task) of a given agent (individual or complex) to another agent, via some "request" (offer, proposal, announcement, etc.) meeting some "commitment" (bid, contract, adoption, etc.). This core constituent of any interactive, negotial, cooperative system is not so clear, well founded and systematically studied as it could seem. Our claim is that any support system for cooperation and any theory of cooperation require an analytic theory of delegation and adoption. We will contribute to an important aspect of this theory with a plan-based analysis of delegation.
In this paper we try to propose a foundation of the various levels of delegation and adoption (help), characterizing their basic principles and representations. We also try to identify different agent modeling requirements in relation to the different levels of delegation and/or adoption.
We characterize the various levels of the delegation-adoption relation (executive or open; implicit or explicit; on the domain or on the planning; etc.) on the basis of a theory of plans, actions and agents.
Our claim is that each level of task delegation requires specific beliefs (modeling) about both the delegating agent and the delegee.

Delegation, adoption and their meeting

In this section we supply a general definition of delegation, adoption, and contract, before entering into a more formal and detailed analysis of these concepts.
Let A and B be two agents. There are two main forms of delegation:
- A delegates to B a result g (goal state): i.e. A delegates B to "bring it about that g", where "to bring it about that g" means to find and execute an action that has g among its relevant results/effects. Sub-delegation is not excluded. Delegation from A does not require that A knows which is the action that B has to carry out: A has only to guess that there is such an action.
- A delegates to B an action a, i.e. A delegates B to perform (or sub-delegate) a.

We assume that to delegate an action necessarily implies to delegate some result of that action [postulate I]. Conversely, to delegate a result always implies the delegation of at least one action that produces such a result [postulate II]. Thus, in the following we will consider as the object of the delegation the couple action/goal x=(a,g) that we call task. With x, we will refer to the action, to its resulting world state, or to both: this is because a or g might be implicit or not specified in the request.
By definition, a task is a piece/part of a plan (possibly the entire plan); therefore the task has the same hierarchical structure of composition and of abstraction as plans.
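The task couple above can be sketched in code. This is a minimal illustration in our own notation, not the authors' formalism; the names (`Task`, `open_task`, `executive_task`) and the string encoding of actions and goal states are our assumptions.

```python
from dataclasses import dataclass
from typing import FrozenSet, Optional

# A task x = (a, g) couples an action with a goal state.  Either member may
# be left implicit in the request, but (by Postulates I and II) delegating
# one always implicitly delegates the other.
@dataclass(frozen=True)
class Task:
    action: Optional[str]            # a: the action, possibly unspecified
    goal: Optional[FrozenSet[str]]   # g: the goal state, possibly unspecified

    def __post_init__(self):
        # A task with neither member specified delegates nothing at all.
        if self.action is None and self.goal is None:
            raise ValueError("a task must specify an action, a goal, or both")

# A delegates only the result g: B must find an action achieving it.
open_task = Task(action=None, goal=frozenset({"door_open"}))
# A delegates a specific action; some result of it is delegated implicitly.
executive_task = Task(action="push_door", goal=None)
```

The two instances anticipate the executive/open distinction drawn later in the paper.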

Weak Delegation ("to rely on", "to exploit")

Given two agents A and B, and a task x, to assert that A weakly-delegates x to B means that:

1a) (A believes that) x is a goal or subgoal of A; that implies [1] that:

- A believes that (to perform) x is possible;
- A believes that (to perform) x is preferable;
- A believes that Not (performed) x;

1b) A believes that B is able to perform/bring it about that x;
1c) (A believes that) A has the goal that B performs/brings it about that x;
1d) A believes that B will perform x in time (or A believes that B is internally committed to perform x in time).
1e) A has the goal ('relativized' to 1d) of not performing x by itself.

In Weak Delegation A exploits B's activities while B might be unaware of this.
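Conditions 1a-1e can be checked mechanically over a belief base. The sketch below is our own flat-string encoding of the conditions, not the paper's logic; the predicate names are illustrative assumptions.

```python
# Check the belief conditions 1a-1e for weak delegation over a flat set of
# belief strings held by the would-be delegating agent A (our encoding).
def weakly_delegates(beliefs: set, task: str, delegee: str) -> bool:
    required = {
        f"goal({task})",                          # 1a: x is a goal of A ...
        f"possible({task})",                      #     believed possible,
        f"preferable({task})",                    #     preferable,
        f"not_yet_performed({task})",             #     and not yet achieved
        f"able({delegee},{task})",                # 1b: B is able to perform x
        f"goal_that_performs({delegee},{task})",  # 1c: A wants B to perform x
        f"will_perform({delegee},{task})",        # 1d: A expects B to do it in time
        f"goal_not_self_perform({task})",         # 1e: A intends not to do it himself
    }
    return required <= beliefs

beliefs_of_A = {
    "goal(open_door)", "possible(open_door)", "preferable(open_door)",
    "not_yet_performed(open_door)", "able(B,open_door)",
    "goal_that_performs(B,open_door)", "will_perform(B,open_door)",
    "goal_not_self_perform(open_door)",
}
```

Dropping any single belief (say, 1b) makes the check fail, which mirrors the conjunctive reading of the definition.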

Weak Adoption ("to take care of")

Given two agents A and B, and a task x, to assert that B weakly-adopts x for A means that:
2a) B believes that x is a goal or subgoal of A;
2b) B believes that B is able to perform/bring it about that x;

2c) (B believes that) B has the goal to perform x for A; that implies that:

- B believes that (to perform) x is possible;
- B believes that (to perform) x is preferable for A;
- B believes that Not (performed) x;

2d) B believes that B will perform x in time (or B believes that B is internally committed to perform x in time).
2e) B believes that A will not perform x by itself.

Notice that this help can be completely unilateral and spontaneous from B (without any request from A), and/or even ignored by A.

Delegation-Adoption (Contract)

In Strict Delegation, the contractor knows that the client is relying on him and accepts the task; in Strict Adoption, the helped agent knows about the adoption and accepts it. In other words, Strict Delegation requires Strict Adoption, and vice versa: they are two facets of a unitary social relation that we will call "delegation-adoption" or "contract".
Given two agents A (the client) and B (the contractor), and a task x, to assert that there is a delegation-adoption relationship between A and B for x means that: (1a) (2a) (1b) (2b) (1c) (2c) (1d) (2d) (1e) (2e).
3a) A and B believe that the other agent believes that x is a goal or subgoal of A;
3b) A and B believe that the other agent believes that B is able to perform/bring it about that x;
3f) A and B believe that A's goal is that B performs x for A;
3g) A and B believe that B is socially committed with A to perform x for A [2];
3h) A is socially committed with B to not performing x by himself;
3i) A and B mutually believe in their reciprocal commitments.

From: AAAI Technical Report WS-96-02. Compilation copyright © 1996, AAAI (www.aaai.org). All rights reserved.


Levels of delegation and adoption

In the following we will consider only Strict Delegation and Adoption based on implicit or explicit request/offer.

Levels of delegation

Corollary of cognitive asymmetry:
- in asking for an action, the client has in mind at least one of its results (postulate I), while,
- in asking for a result, the client might be unaware of any specific action needed to achieve it, even being aware that his request is an implicit request of action.
When x is an action a, it can be: an elementary action or a complex action (plan).
The object of the delegation/adoption process can be a practical or domain action as well as a meta-action, that is, an action about plans, such as searching, choosing, problem solving and so on. For example, if x=a and a is a complex action, also the meta-actions of searching in some plan library and selecting a specific decomposition for a are delegated. If the delegation is about a result x=g, also the meta-action of choosing among possible plans for g is delegated. Both cases belong to open delegation: from the client's point of view, the contractor has to decide about the specific actions to carry out. On the contrary, an executive delegation does not foresee any decision about the delegation object.
Again, another possible delegation regards the control on the actions themselves.

In short, one can distinguish among at least the following types of delegation:
- pure executive delegation vs open delegation;
- delegation vs non delegation of the control over the action;
- domain task delegation vs planning task delegation (meta-actions);
- delegation to perform vs delegation to delegate.

Other levels of delegation will be analyzed later in relation with the levels of help, entitlement, and the formalization of actions and plans.

Levels of adoption

1) simple help: the contractor adopts exactly what has been delegated by the client (simple or complex action, etc.);
2) overhelp: the contractor goes beyond what has been delegated by the client without changing the client's plan;
3) critical help: the contractor fulfils the results of the requested plan/action, but modifies that plan;
4) hyper-critical help: the contractor adopts goals or interests of the client that the client himself did not take into account: by doing so, the contractor realizes neither the action/plan nor the results that were delegated.

The contractor can modify the delegated task for several reasons:
- Impossibility: x cannot be done or its preconditions do not hold;
- Inability: the contractor is not able to do x;
- Inappropriateness: unlike the client, the contractor believes that x is not useful for the goal; the contractor thinks that the intended task cannot produce the expected results (sometimes, the client's plan is even self-defeating);
- Optimization: according to the contractor there is a better way to achieve the client's goals and/or interests;
- Conflict: the contractor thinks that x could damage other goals or interests of the client: those that the client did not take into account in his planning;
- Personal Preference: the contractor subordinates his adoption to his own preferences or interests which are in

conflict with the task. In this work we will not consider this case: we just consider fully cooperative cases; if the contractor changes or refuses the task, this is just for the wellbeing of the client.

The contractor's critiques of the delegation (request) can be aimed to safeguard various goods of the client:
a) the expected result of the requested action;
b) some higher goal of that action in that plan;
c) some other active goal of the client;
d) some goal of the client that the client itself did not consider;
e) a client's interest;
f) a goal of the role the client is holding;
g) a goal/interest of a third agent the client is representing;
h) some goal of the organization the client is acting in/for.

Although (f), (g), (h) are very important points for CSCW and organizations, we will not analyze them here. Let us just put this question: when a software agent B helps another software agent A which acts on behalf of a user, should it care only about A's request or also about the user's interests?

Plan Ontology

In this section we will introduce a formal representation of agents and actions, and in particular a theory of the relationships between actions and results, which will be the basis for a more precise analysis of delegation and adoption levels. As in [3,4] we consider the mapping from Kautz' plan hierarchies [5] to context-free grammars.

Basic Notions

Let Act = {a1, ..., an} be a finite set of actions, and let Agt = {A1, ..., Am, B, C, ...} be a finite set of agents. Each agent has an action repertoire, a plan library, resources, goals, beliefs, motivations, interests.
The general plan library is Π = Πa ∪ Πd, where Πa is the rule set corresponding with the abstraction hierarchy (is-a relation) and Πd is the rule set corresponding with the decomposition hierarchy (part-of relation).
As usual, for each action there are: body, preconditions, constraints, results.
We will call a a composed action (plan) in Π if there is in Πd a rule: a --> a1, ..., an. The actions a1, ..., an are called component actions of a.
We will call a an abstract action (plan) in Π if there is in Πa a rule: a --> aj. aj is called a specialized action of a.
An action a' is called an elementary action in Π if:
1) there are no rules in Πd like: a' --> a1, ..., an; and
2) there are no rules in Πa like: a' --> aj.
We will call BAct (Basic Actions) the finite set of elementary actions in Π; BAct ⊆ Act.
We will call CAct (Complex Actions) the set of actions in Act which do not belong to BAct: CAct = Act - BAct.
Given a1, a2 and Π, we will say that a1 dominates a2 (or a2 is dominated by a1) if there is a set of rules (r1, ..., rm) in Π such that: (a1 = Lr1) ∧ (a2 ∈ Rrm) ∧ (Lri ∈ Rri-1), where Lrj and Rrj are, respectively, the left and the right part of the rule rj, and 2 ≤ i ≤ m.
We will say that a1 dominates a2 at level k if the set (r1, ..., rm) includes k rules.
We will call ActA the set of actions known by A; ActA ⊆ Act. The set of the non-reducible actions (through decomposition or specification) included in ΠA (A's plan library) is composed of two subsets: the set of actions that A conceives as elementary actions (BActA) and the set of actions that A conceives as plans but for which he has no reduction rules (NRActA: Non Reduced actions). Then BActA ⊆ Act, but it is possible that BActA ⊄ BAct. In fact, given an elementary action, an agent may or may not know the body of that action. We will call skill set of an agent A,
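The grammar-like plan library can be sketched directly as rule tables. This is our own encoding under stated assumptions (the action names and the single-alternative rule dictionaries are illustrative, not from the paper); it shows how BAct, CAct and the dominance relation fall out of the decomposition and abstraction rules.

```python
# Decomposition rules (Πd, part-of) and abstraction rules (Πa, is-a),
# each mapping an action to its components / specializations.
DECOMP = {"make_dinner": ["make_sauce", "make_ravioli"]}
ABSTR = {"make_sauce": ["make_tomato_sauce", "make_pesto"]}
ACT = {"make_dinner", "make_sauce", "make_ravioli",
       "make_tomato_sauce", "make_pesto"}

def elementary(actions, decomp, abstr):
    """BAct: actions with no decomposition and no specialization rule."""
    return {a for a in actions if a not in decomp and a not in abstr}

def dominates(a1, a2, decomp, abstr):
    """a1 dominates a2 if a chain of rules rewrites a1 down to a2."""
    children = decomp.get(a1, []) + abstr.get(a1, [])
    return a2 in children or any(dominates(c, a2, decomp, abstr)
                                 for c in children)

BACT = elementary(ACT, DECOMP, ABSTR)   # basic actions
CACT = ACT - BACT                       # complex actions
```

The recursion in `dominates` mirrors the rule chain (r1, ..., rm) of the definition, one rule application per level.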

Page 3: Levels of Help, Levels of Delegation and Agent Modeling · Levels of help, levels of delegation and agent modeling Cristiano Castelfranchi, Rind Falcone IP-CNR, Group of "Artificial

SA, the actions in BActA whose body is known by A (the action repertoire of A). SA ⊆ BActA. ∪SAi (on all Ai ∈ Agt) ⊆ BAct; then it is possible that for some action a ∈ BAct there is no agent Aj ∈ Agt such that a ∈ SAj.
Act = ∪ActAi (on all Ai ∈ Agt): for each action a in Act there is at least one agent Ai ∈ Agt such that Ai knows about a.

To execute an action a means:
- to execute the body of a, if a is an elementary action;
- to execute the body of each elementary action to which a can be reduced (through the rules in Π), if a is not an elementary action.
If an agent A does not know how to execute an elementary action a, then it does not know how to execute any plan a' in ΠA with: i) (dominate a' a); ii) all reduction rules of a' in ΠA containing a itself.
A simplifying assumption is: given any two agents A and B and any elementary action a, with a ∈ SA and a ∈ SB, the body of a is the same for A and B: an elementary action cannot be performed in different ways.

From the previous assertions it follows that an action a can be an elementary action for a given agent A and a plan for another agent B. This is true when:
((a ∈ BActA) ∧ (a ∈ CActB)) ∨ ((a ∈ BActA) ∧ (a ∈ NRActB)) ∨ ((a ∈ SA) ∧ (a ∈ NRActB))
Again, the same plan a could have, for different agents, different reduction rules.

Agents execute actions to achieve goals: they look up in their memory the actions fitting the goals, select and execute them.
We will say that an agent A has complete executable know-how of an action a if either a ∈ SA or in ΠA there is a set of rules (r1, ..., rm) able to transform a into a1, ..., ak with ai ∈ SA for each 1 ≤ i ≤ k. The operator CEK(A,a) returns (r1, ..., rm); then CEK(A,a) ≠ ∅ (∅ is the empty set) when A has complete executable know-how of a.
It is possible that CEK(A,a) returns several sets of rules, one for each different way to reduce a into elementary actions of A (all included in SA).
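The CEK test can be rendered as a simple recursion. This sketch is ours, under a simplifying assumption the paper does not make (one reduction rule per action, so the function returns a boolean rather than the rule sets); the action names are illustrative.

```python
# Sketch of the complete-executable-know-how test: agent A has CEK of an
# action a if a is in A's skill set, or A's rules reduce a entirely to
# actions of which A has CEK.
def has_cek(action, skills, rules):
    """rules: dict mapping each reducible action to one list of
    sub-actions (a single reduction per action, for simplicity)."""
    if action in skills:
        return True
    if action not in rules:          # not a skill and not reducible
        return False
    return all(has_cek(sub, skills, rules) for sub in rules[action])

skills_A = {"boil_water", "add_pasta"}
rules_A = {"cook_pasta": ["boil_water", "add_pasta"]}
```

With several alternative reductions per action, the boolean would become an enumeration of rule sets, matching the remark that CEK(A,a) may return several sets of rules.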

Fixed some world states c, we will call R(a,c) the operator that, when applied to an action a and to c, returns the set of the results produced by a (when executed alone). We will assume that, changing the world states in which an action is applied, its results change and the name of the action itself changes. Then R(a,c) may be denoted with R(a), because c is defined in a unique way.
We will call c the conditions of an action; they represent the preconditions plus the constraints of the action itself. An action a can be executed if the preconditions and constraints of a are satisfied.
We can distinguish two kinds of conditions: Execution conditions Ec(a) and Success conditions Sc(a). If the former are satisfied then a can be executed; if the latter are satisfied then a, when executed, will succeed.
We will call Pc the operator that, when applied to any action a, returns the set of the preconditions of a (either c ∈ Ep(a), the set of preconditions included in Ec(a), or c ∈ Sp(a), the set of preconditions included in Sc(a)): Pc(a) = Ep(a) ∪ Sp(a).
Given a multi-agent world, we will call constraint a condition c of a (either c ∈ Ec(a) or c ∈ Sc(a)) such that no agent has in its action repertoire an a' with c ⊆ R(a'). Cn(a) is the set of constraints of a. If ∃c | (c ∈ Cn(a)) ∧ (c = false), then a is not executable with success.
We will call P(a) the conditions of a: P(a) = Pc(a) ∪ Cn(a).

An action a realizes g when the world states characterizing g are a subset of a's results: g ⊆ R(a).

Theory of action results

We will call RA(a) the results that A believes a will produce when executed. In our model RA(a) might (or might not) correspond with R(a). When an action has been executed, each agent in Agt has the same perception of its results: exactly R(a). This simplification produces a transparent world - a unique point of view - with regard to the action results for all the agents.
We will assume that for each action a (with (a ∈ ActA) ∧ (a ∈ ActB)) this default belief holds:
(Bel A (RA(a)=RB(a)=R(a))) ∧ (Bel B (RB(a)=RA(a)=R(a)))
We will call relevant results of an action for a goal (set of world states) the subpart of the results of that action which corresponds with the goal; more formally, given a and g, we define the operator Rr such that:
Rr(a,g) = {gi | gi ∈ g} if g ⊆ R(a), = ∅ otherwise.
Then, the same action used for different goals has different relevant results. The agents "memorize" one (or more) goals for which they make use of the actions. It is possible that different agents associate different customary goals with the same actions. We introduce the operator Ga (Goal association), such that Ga(A,a) returns the customary goal g associated by A with a: Ga(A,a) = g.
When an agent A uses an action a to achieve a subpart of the results (g') of that action which is not included in the action's customary goal (g), then the agent makes improper use of the action:
g' ⊆ R(a), g' ⊄ Ga(A,a) = g
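The Rr operator above translates almost verbatim into code. A minimal sketch in our own notation (the example result sets are illustrative assumptions):

```python
# Relevant results Rr(a, g): the subpart of an action's results matching
# the goal, empty when the action does not realize the goal at all.
def rr(results_of_a: frozenset, goal: frozenset) -> frozenset:
    """Rr(a,g) = g if g is a subset of R(a), else the empty set."""
    return goal if goal <= results_of_a else frozenset()

# An action with two results; only the part matching the goal is relevant.
R_a = frozenset({"door_open", "noise_made"})
```

Calling `rr(R_a, ...)` with different goals shows how the same action has different relevant results for different goals.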

We will assume that to each plan a goal is associated (plan goal). This goal is the goal that the plan constructor has frozen in the plan structure itself.
In a plan, then, the goal associated with the plan corresponds with the relevant results of that plan towards that goal: Ga(A,a) = Rr(a,g) for each A ∈ Agt.

Let us suppose that a is a component (or specialized) action of a' for g (Rr(a',g) ≠ ∅); we define pertinent results of a in a' for g, Pr(a,a',g), the results of a useful for that plan a' towards the goal g; they correspond with a subset of R(a) such that:
1) if a is a component action of a':
Pr(a,a',g) = {ri | (ri ∈ R(a)) ∧ ((ri ∈ Rr(a',g)) ∨ ((ri = P(aj)) ∧ (dominate-level-1 a' aj)))}
2) if a is a specialized action of a':
Pr(a,a',g) = {ri | (ri ∈ R(a)) ∧ (∃rj | (rj ∈ Rr(a',g)) ∧ (ri is a specialization of rj))}
Let us define temporary results of an action a in a plan a' the results of a that are not results of a': Tr(a,a') = {ri | (ri ∈ R(a)) ∧ (ri ∉ R(a'))}.

We define transitory results (or pertinent temporary results) of an action a in a plan a' towards the goal g:
TRr(a,a',g) = Tr(a,a') ∩ Pr(a,a',g)
They correspond with those results of a that enable another action aj in a' but that are not results of a' towards the goal g:
TRr(a,a',g) = {ri | (ri ∈ R(a)) ∧ (ri ∉ R(a')) ∧ (ri = P(aj)) ∧ (dominate-level-1 a' aj)}

Let us define relevant results of a in a' towards g:
Rr(a,a',g) = {ri | (ri ∈ R(a)) ∧ (ri ∈ Rr(a',g))}
We can also write: Rr(a,a',g) = Pr(a,a',g) - TRr(a,a',g), where TRr(a,a',g) ⊆ Tr(a,a').


The pertinent results of an action a in a' represent the real reason for which a is in a'.

We will call two actions:
- synonymous actions: when they have the same reduction rules and the same conditions (then they necessarily have the same results);
- equifinal actions: when they have different reduction rules but they have the same results.

If a plan a' is used in an improper way by the agent A, for example to achieve the goal g1 instead of g, where:
g1 = {gi | (gi ∈ R(a')) ∧ (gi ∉ Rr(a',g))}
then the pertinent results of the component (or specialized) actions in a' are not relevant results towards g1. It follows that, if an agent uses a plan in an improper way, some actions included in the plan (in the case of a composition rule) are superfluous actions: i.e. their results are all and only pertinent results.

Another point is that the results associated by an agent A with an "isolated" elementary action a1, Ga(A,a1), might not correspond with the pertinent results of a1 in a' towards g, that is to say: Ga(A,a1) ≠ Pr(a1,a',g); in some cases it could even be that Ga(A,a1) ∩ Pr(a1,a',g) = ∅.

We will say that there is a goal/subgoal (structural) relationship between two goals gi and gj if there are two actions ai, aj such that: gi ⊆ Pr(ai,aj,gj).
The goal/subgoal (deep) relationship between goals expresses the deep reasons why it is possible to reduce an action in a particular way. An agent able to individuate these reasons is an agent that has a causal view of the actions [6].
There are four kinds of relationships; given two goals gk, gj, we have:
- gk implies gj;
- gk causes gj;
- ∃gi | (gi and gk) cause gj;
- ∃a ∈ Act | gk enables gj, that is to say (gk = P(a)) ∧ (gj ⊆ R(a)).

Control Action

For each action a, it is possible to plan another action - a meta-action that we will call Control - such that its result is to verify: a) that a has been executed; b) that the results of a correspond with the expected results; c) how a has been executed (this applies only if a is a complex action).
Given an action a, the control action over a is an action of an agent A aimed to match the expected results of a with its actual results:
(Control A a) = (Verification by A that (RA(a) = R(a)))
In fact, verifying (RA(a) = R(a)) permits satisfying: point (a), because a necessary result of a is that its body has been executed; obviously point (b); and point (c), because controlling how an action has been executed corresponds to controlling points (a) and (b) relative to its component actions.
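The Control meta-action reduces to a single comparison of result sets. A minimal sketch in our own encoding (result names are illustrative assumptions):

```python
# Control over an action a: match the results A expected, RA(a), against
# the actual results R(a) observed after execution.
def control(expected_results: frozenset, actual_results: frozenset) -> bool:
    # A necessary result of a is that its body has been executed, so this
    # single comparison covers points (a) and (b); applied recursively to
    # component actions it covers point (c) as well.
    return expected_results == actual_results
```

A delegating client can run `control` over a delegated task's results when control has not itself been delegated away.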

Motivations and Interests of the Agents

Each agent A has a set of non-instrumental goals: we will call motivations these goals and GA the set containing them.
Each time an agent is pursuing a plan through the rules in its plan library (see [6] to distinguish between pursuing and knowing a plan), the root of the tree in the plan is in GA, i.e. the plan is motivated.
It is possible that a motivation g (g ∈ GA) is not a root of any tree producible through the rules in ΠA (in other words, the agent does not always know a plan to achieve a motivation). Vice versa, it is possible that some of the roots of the trees producible through the rules in ΠA are not included in GA (in other words, the plans known by an agent are not necessarily motivated for that agent).
We will call active plans of an agent the plans in its working memory: those plans that the agent is considering to execute or deciding whether they should be executed.

When A uses an action a towards a goal g, it wants the relevant results of that action for that goal: Rr(a,g).

We will call interests of an agent A [7], IA, the set of world states that, if true, allow the achievement either of some motivation or of the results of some action A is using. We can say that if A believes i ∈ IA, then A wants i. The interests of A are in part motivations of A (the interests of which A is aware) and in part potential motivations of A (the interests of which A is unaware).

Delegation

Delegation is a "social action" [7], and also a meta-action, since its object is an action.
We define the Delegation action with 4 parameters: (Delegates A B x d), where A, B ∈ Agt, x=(a,g), d=deadline. This means that A delegates to B the task x with the deadline d. In the following we will put aside both the deadline of x, and the fact that in delegating x, A (very often) implicitly delegates also the realization of a's preconditions (which normally implies some problem-solving and/or planning).

Kinds of delegation in relation to the task

Depending on the representation of x in A's knowledge, we can characterize various types of delegation:
- pure executive delegation: when either a ∈ SA or a ∈ BActA, or g is the relevant result of a (and a ∈ SA or a ∈ BActA);
- open delegation: when either a ∈ CActA or a ∈ NRActA; and also when g is the relevant result of a (and a ∈ CActA or a ∈ NRActA).
When A delegates not only x but also the rules that decompose a into actions, we have delegation-with-rules: (Delegates A B (x; r1, ..., rm)), where r1, ..., rm are the reduction rules of a into a1, ..., an, and ri ∈ ΠA for each 1 ≤ i ≤ m. There are two subcases: one executive, when for each aj with 1 ≤ j ≤ n, (aj ∈ SA) ∨ (aj ∈ BActA); the other open, where ∃aj, with 1 ≤ j ≤ n, such that (aj ∈ CActA) ∨ (aj ∈ NRActA).
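The executive/open distinction can be read off the client's action sets directly. A sketch in our own encoding (the sample sets and action names are illustrative assumptions):

```python
# Classify a delegation by how the delegated action is represented in the
# client A's knowledge: executable or elementary-for-A -> executive;
# a plan still to be reduced (complex or non-reduced) -> open.
def delegation_type(action, s_a, bact_a, cact_a, nract_a):
    if action in s_a or action in bact_a:
        return "pure executive"
    if action in cact_a or action in nract_a:
        return "open"
    raise ValueError("action unknown to the client")

S_A = {"press_button"}        # A knows the body (skill set)
BACT_A = {"turn_key"}         # A conceives it as elementary
CACT_A = {"start_car"}        # A conceives it as a plan with rules
NRACT_A = {"fix_engine"}      # A conceives it as a plan, no rules known
```

The same test applied to a delegated result x=g would classify the delegation through the action A has in mind for g, mirroring the parenthesized clauses above.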

Notice that x can belong either to the domain or to the meta-domain of planning tasks.

Implicit aspects of delegation produce various possible misunderstandings among the agents. When an agent leaves implicit parts of its request (delegation), in fact she is also delegating the task of reconstructing those implicit parts.

To delegate an action a, the client has to know at least some results of a, since the request/expectation of an action necessarily implies the expectation of at least one of its results (see Postulate I).

Levels of adoption within and beyond delegation

Since Adoption can be unilateral and spontaneous, or can be "critical", it can go beyond the request and the delegation of the client. In other words, B can adopt some of A's goals independently of A's delegation or request. This creates an interesting problem: what kind (or level) of goals can B adopt beyond the possible request-delegation of A?
This problem is well known in conversation theory [8]. For example, what in the question-answering domain people


call "over-answering" is exactly a sub-case of this problem. By definition an "over-answering" is an answer to a certain request (of information) that goes beyond the required information (e.g. "What time will the train from Roma arrive?" "It will arrive at 4 p.m. on rail 4"). Moreover, the additional information is supposed to be useful for the client, i.e. it is supposed to satisfy some other of her current goals: the "over-answer" should be "cooperative" (in the Gricean sense), not irrelevant. Thus, the over-answering contractor should have a model of the plan and goals of the other agent.
Exactly the same problem can be found in any kind of "cooperation" or help (goal adoption): also when A asks B for a practical action, B can "over-answer".
Going beyond the request opens different possibilities (depending on what kind of goal B is adopting): a theory is needed about these levels of adoption that characterizes different kinds of "over-answering" [9] and different helping relations and roles among agents [7].
In our view, modeling and reasoning about the other agent are not necessary only in case of belief or goal conflicts (as in our "critical" and "hyper-critical help"), but also when there is full agreement and cooperation that goes beyond the delegation (we call this case "overhelp").

Levels of adoption/help

We identify several levels of help of B, starting from the general condition: (Delegates A B τ=(α,g)) with dominating α', where A delegates τ within τ'=(α',g').

i) conservative help:
ia) simple help: (Adopts B τ);
ib) subhelp: (Adopts B τ1) ∧ (dominates α α1); in other words, B does not satisfy the delegated task. Example in conversation: A: "What time is it?", B: "I don't know". A's subgoal that B answers is satisfied, but the goal (to know the time) is not;
ic) overhelp: (Adopts B τ1) ∧ (dominates α1 α) ∧ (dominates-or-equal α' α1). Example in conversation: A: "What time is it?", B: "Be calm, it is 5 pm and our meeting is at 6 pm, we are in time". Both the delegated action (to inform about the time) and the higher, non-delegated results (plan) (to know whether we are late or not; to not be anxious) are adopted and satisfied by the contractor. Practical example: A asks B to prepare the sauce for the ravioli she will prepare for dinner, and B prepares both the sauce and the ravioli.
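The three conservative cases reduce to how B's adopted task sits in the plan hierarchy relative to the delegated one. A minimal sketch (not the paper's formalism; the `dominates` relation, the toy hierarchy, and all names are illustrative assumptions):

```python
# Hedged sketch: classify B's "conservative help" by comparing the delegated
# task with the task B actually adopts, via a dominance (plan-hierarchy)
# relation. dominates(a, b) means action a is higher in the plan than b.

def classify_conservative_help(delegated, adopted, dominates):
    """Return 'simple', 'sub', or 'over' help, else None."""
    if adopted == delegated:
        return "simple"          # ia) (Adopts B tau)
    if dominates(delegated, adopted):
        return "sub"             # ib) B satisfies only a sub-part of tau
    if dominates(adopted, delegated):
        return "over"            # ic) B adopts a dominating (higher) task
    return None

# Toy plan hierarchy for the ravioli example: dinner > sauce > chop
hierarchy = {("dinner", "sauce"), ("sauce", "chop"), ("dinner", "chop")}
dom = lambda a, b: (a, b) in hierarchy

print(classify_conservative_help("sauce", "sauce", dom))   # simple help
print(classify_conservative_help("sauce", "chop", dom))    # subhelp
print(classify_conservative_help("sauce", "dinner", dom))  # overhelp
```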

ii) critical help: (Adopts B g); since (Adopts B g), it is sufficient for B to find in ActB an action α1, whatever it is, such that g ⊆ RB(α1). Critical help holds in the following cases:
a) (CEK(B,α)=∅) ∨ (g⊄RB(α)) ∨ (P(α)=false); that is, B either is not able to execute α or, on the basis of his knowledge of action results, guesses that g is not among the results of α, or the conditions of α are not true (and he is unable to realize them). Correspondingly, he must guess that there is an action α1 such that: (CEK(B,α1)≠∅) ∧ (g⊆RB(α1)) ∧ (P(α1)=true); in other words, he finds another way to realize g, using another action α1 such that: B is able to realize it, the new action contains g among its results, and its conditions are satisfied.
b) B thinks that the other results of α (beyond g) are in conflict with other goals - in plan or off plan - or interests of the client. On the other side, he thinks that there is an action α1 with: (CEK(B,α1)≠∅) ∧ (g⊆RB(α1)) ∧ (P(α1)=true), and the results of α1 are not in conflict with other goals or interests of the client.

c) There is again the case of optimization, where the conditions in (a) are all false but there is an action α1 such that g is reached in a more profitable way (relative to some criterion).
iii) critical overhelp: it is a mixed case in which there are overhelp and critical help at the same time. Given (Adopts B g'):
a) Pr(α,α',g')=∅ and at the same time (∃α1∈ActB | Pr(α1,α',g')≠∅ ∧ CEK(B,α1)≠∅ ∧ P(α1)=true). In other words, there are no pertinent results of α in α', but there exists at least an action α1 which is pertinent in α' towards g'. This means that α is useless for τ'. It is even possible that it is noxious: i.e. that R(α) produces results that contradict those intended with τ'. A is delegating to B a plan that in B's view is wrong or self-defeating.
b) Pr(α,α',g')≠∅ ∧ CEK(B,α)≠∅ ∧ P(α)=true and in addition (∃α1∈ActB | CEK(B,α1)≠∅ ∧ P(α1)=true ∧ Pr(α1,α',g')≠∅); moreover:

b1) R(α1) achieves the goals internal to the plan (i.e. g') in a better way (maximization). Example: A asks B "to buy second class train tickets for Naples" (action α) for her plan "to go to Naples cheaply" (action α'). B adopts A's goal "to go to Naples cheaply" (goal g') replacing the whole plan (α') with another plan: "go with Paul by car!".
b2) R(α1) achieves not only the goals of the plan (i.e. g') but also other goals of A external to that plan (e.g. g"), other motivations of A: (g'⊆R(α1)) ∧ (g"⊆R(α1)). Example: A asks B "to buy second class train tickets for Naples" (action α) for her plan "to go to Naples cheaply" (action α'). B adopts A's goal "to go to Naples cheaply" (goal g') replacing the whole plan (α') with another (α1), "to go with Paul by car", that also satisfies another motivation of A - one she did not consider or satisfy in her plan - but B knows: "to travel with friends".
b3) R(α1) achieves not only the goals of the plan but also some interests (i) of A: (g'⊆R(α1)) ∧ (i⊆R(α1)). Example: A asks B "to buy second class train tickets for Naples" (action α) for her plan "to go to Naples cheaply" (action α'). B adopts A's goal "to go to Naples cheaply" (goal g') replacing the whole plan (α') with another (α1), "to go to Naples by bus", that satisfies an interest i of A: not risking meeting Paul, whom she ignores to be on the same train.

iv) hypercritical help: (Adopts B g1), where g1 is an interest (or an off-plan goal) of A more important than g' (we leave this notion intuitive here). Since there is a conflict between the result R(α) (and/or the result R(α')) and some interest of A, to adopt g1 would imply not obtaining R(α) (or R(α')). There are subcases:
a) (B knows that there is a conflict between g1 and g') and (B believes that g1 is better than g' for A) and (A does not know that there is a conflict between g1 and g'); g1 is an off-plan goal of A;
b) (B knows that there is a conflict between g1 and g') and (B believes that g1 is better than g' for A) and (A knows that there is a conflict between g1 and g') and (A believes that g' is better than g1); A and B have different evaluations about the importance of g' and g1;
c) (B knows that there is a conflict between g1 and g') and (B believes that g1 is better than g' for A); g1 is an ignored interest of A, hence A ignores the conflict with g'.
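The substitution at the heart of critical help - case (a) above - can be sketched as a search over B's repertoire for an action whose know-how, results and preconditions all fit. A minimal illustration (not the authors' formalism; the `Action` fields stand in for CEK(B,α), R(α) and P(α), and all names are assumptions):

```python
# Hedged sketch of the "critical help" substitution: when B cannot (or should
# not) execute the delegated action, he searches his repertoire Act_B for an
# alternative alpha_1 that still achieves the delegated goal g.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    results: frozenset          # stands in for R(alpha)
    executable: bool = True     # stands in for CEK(B, alpha) != empty set
    preconds_true: bool = True  # stands in for P(alpha) = true

def critical_help(delegated, goal, repertoire):
    """Return the delegated action if it works, else a substitute from Act_B."""
    if delegated.executable and goal in delegated.results and delegated.preconds_true:
        return delegated                       # no criticism needed
    for alt in repertoire:                     # case (a): find alpha_1 with
        if alt.executable and goal in alt.results and alt.preconds_true:
            return alt                         # g in R(alpha_1), CEK != 0, P = true
    return None

train = Action("buy-train-ticket", frozenset({"reach-naples"}), executable=False)
car = Action("go-by-car", frozenset({"reach-naples", "travel-with-friends"}))
print(critical_help(train, "reach-naples", [car]).name)  # go-by-car
```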

Agent modeling

In this section we will analyse the main aspects of agent modeling on the basis of the delegation-adoption relationship. In particular, we examine which model the client will have of the contractor since (Delegates A B τ), and,


vice versa, which model the contractor will have of the client since (Adopts B A τ). In both cases, depending on the beliefs about τ, specific goals or competencies are attributed to the other agent. Such a modeling of the other agent is constructed on different bases:
a) previous experience of the other's behaviour (if B once was able to do τ, A assumes that B is able now; if i once was an interest of A, by default it will be so again);
b) an explicit declaration of the agent (B declared to A to be able to do τ; A explicated to B the goals that motivate τ);
c) an implicit communication of the agent (B declared to A that he intends to do τ, which implies that he is able to do τ; A asks B to do τ, which implicitly informs B that A is not able to do τ);
d) attributions to the category or role the agent belongs to [10] (if B belongs to a class of agents that are able to do τ then also B is able to do τ; if A belongs to a class of agents that have the motivation g then also A has such a goal g).
We will not examine how the agent models the "willingness" of the contractor [11].
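The four bases (a)-(d) can be read as a fall-through attribution procedure. A minimal sketch (not from the paper; all parameter names and the stereotype encoding are illustrative assumptions, in the spirit of stereotype-based user modeling [10]):

```python
# Hedged sketch: does one agent believe the other is able to do an action?
# Checks the four bases in the text: (a) direct experience, (b) explicit
# declaration, (c) implicit communication, (d) category/role stereotype.

def believed_able(agent, action, experience, declarations, intentions,
                  stereotypes, roles):
    # (a) previous experience: the agent once did the action
    if (agent, action) in experience:
        return True
    # (b) explicit declaration: the agent declared being able to do it
    if (agent, action) in declarations:
        return True
    # (c) implicit communication: declaring the intention to do x implies ability
    if (agent, action) in intentions:
        return True
    # (d) stereotype: members of the agent's role/class can do it by default
    return any(action in stereotypes.get(r, set())
               for r in roles.get(agent, set()))

# Toy usage: B's role grants the ability by stereotype.
print(believed_able("B", "parse-plan", set(), set(), set(),
                    {"planner": {"parse-plan"}}, {"B": {"planner"}}))  # True
```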

It is possible to predict different needs of agent modeling from the two different structural positions in the contract relationship:
- Basically, the client should model the abilities and reliability of the contractor; he will only exceptionally use plan and intention recognition.
- Conversely, the contractor in general does not need to model the client's capabilities, while (in deep cooperation) he needs to model the client's plans and goals quite well; thus he will apply plan recognition (PR) and intention recognition.
This asymmetry has, of course, some exceptions:
- sometimes the contractor should model the client's abilities in order to better realize his help (consider overhelp due to the necessity of realizing tasks that the client did not delegate but is not really able to do (example: ravioli); consider underhelp due to the fact that the client is able to do part of τ, and B leaves her to do it);
- sometimes it is useful for the client to apply PR to the contractor's activity, for example for monitoring his task execution when she did not also delegate the control (we do not consider the monitoring problem here [12]). PR is also necessary in weak delegation, where A must exploit B's autonomous activity and intentions without any agreement.

[Table 1, flattened by the scan and only partially recoverable. For each row, the first column gives A's modeling of τ; the remaining columns give the corresponding possible models of B. Recoverable row headings: τ=α, with r1,…,rm the reduction rules for α into α1,…,αn, (α∈CActA)∧((αi∈SA)∨(αi∈BActA)); τ=g with (∃α∈ActA | g⊆Ga(A,α)); τ=g with (∀α∈ActA | ¬(g⊆Ga(A,α))). Recoverable models of B include: α∈SB; CEK(B,α)≠∅; (αi∈SB)∧((r1,…,rm)→CEK(B,α)); (α∈SB)∧(g⊆Ga(B,α)); and the sub-delegation cases (Delegates B C α), (Delegates B C g), and B reduces-and-executes (part-of α) while (Delegates B C (rest-of α)), with α=(part-of α)∪(rest-of α).]

Table 1

Modeling the "client"

For "simple adoption" the contractor does not need to model the client: he just has to understand the task (request). Indeed, as we said, deeper levels of cooperation require going beyond the request. Thus modeling the client's plans, goals, motivations and interests is necessary. Above, the various adoption levels were illustrated; at each level it was already apparent which aspects of the client the contractor should model: whether he has to ascribe her certain goals or interests, and recognize her plans, in order to adopt them.

A very important problem is the following: how can the contractor recognize the client's plans in which the delegation (τ) is inserted? The ideal recognition of the active plan and

motivations of the client would be possible on the basis of the true plan library of the client herself. Lacking this, the contractor works on his own plan library to recognize the plans of the other! In other words, the contractor practically and by default assumes that his plan library is shared. The contractor is normally supposed to be more expert about the delegated task (to have a richer or more correct plan library) than the client (how/why delegate otherwise?), or to have more "local" and specific knowledge. This is in general true for the plans dominated by τ, not for the plans which dominate τ. It follows that in certain cases the contractor could have expertise problems in understanding the client's higher plans (e.g. soldier-general relationships); and, vice versa, that the client could have some problems in monitoring the executive plans of a



more expert or "local" contractor. In a complete theory of delegation types, one should distinguish between delegation of tasks in which the client is as expert as (or more expert than) the contractor, and delegation of tasks to an expert agent.

Modeling the "contractor": trust

The client's model of the contractor has mainly to do with his competence and willingness [11]. In particular, the model of competencies is relative to domain competencies, meta or planning competencies, control capabilities, and capabilities for sub-delegation; the model of willingness, instead, is in relation with goals, commitments, and reliability. When A attributes to B the capability to sub-delegate to the agent C, A attributes (and delegates) to B also the capability of modeling C.
Given (Delegates A B τ), A should believe about B that:
a) when τ=α:
- either CEK(B,α)≠∅,
- or CEK(B,α)=∅, but B can sub-delegate to an agent C the part of the task he is unable to execute or decompose,
- or - in both cases, CEK(B,α)≠∅ or CEK(B,α)=∅ - B can sub-delegate α to an agent C;
b) when τ=g:
- either there is in ActB an action α such that g⊆R(α) [going back to case a) about α],
- or there is no action α in ActB such that g⊆R(α), but (Delegates B C g),
- or - in both cases, whether such an α exists or not - B can delegate g to C.
For a more detailed analysis see Table 1. In this table, for each row, the first column indicates the modeling of τ by A, and the following columns indicate the corresponding possibilities for the modeling of B. Cases with τ=(α,g) are omitted because their results are just the combination of the cases with τ=α and τ=g.
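The client's minimal trust conditions for the two cases τ=α and τ=g can be sketched directly. A minimal illustration (not the paper's formalism; the `Contractor` class and its fields are illustrative placeholders for CEK, R(α) and the sub-delegation capability):

```python
# Hedged sketch of A's minimal beliefs about contractor B given
# (Delegates A B tau), following cases a) and b) in the text.
from dataclasses import dataclass

@dataclass
class Contractor:
    know_how: set          # actions with CEK(B, alpha) != empty set
    results: dict          # alpha -> R(alpha), B's known action results
    can_subdelegate: bool  # B may pass (part of) the task to some agent C

def trust_for_action(b, alpha):
    # tau = alpha: either B can execute alpha, or B can sub-delegate it
    return alpha in b.know_how or b.can_subdelegate

def trust_for_goal(b, g):
    # tau = g: some action in Act_B yields g, or B can delegate g to C
    return any(g in r for r in b.results.values()) or b.can_subdelegate

b = Contractor({"make-sauce"}, {"make-sauce": {"sauce-ready"}},
               can_subdelegate=False)
print(trust_for_action(b, "make-sauce"))   # True
print(trust_for_goal(b, "ravioli-ready"))  # False: no action, no sub-delegation
```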

Competencies and abilities

As we said, a fundamental role is played by the modeling of the abilities and expertise of the other agent. In particular, in our approach the knowledge of the plan library of the other agent is critical. In our model, knowledge about actions and rules (plans) can be distributed among the agents in a hierarchical way.

We mean that there are plans and/or actions that can beascribed to any agent (universal competence); others thatcan be ascribed to sub classes or categories of agents (likeroles in organizations) that have specialized expertise orskills (class competence); others that pertain only to certainagents (personal competence).

This is a classical approach in User Modeling [10] that can be applied to actions and plan libraries in agent modeling. Notice that such a plan ascription is fundamental both for modeling the contractor's competencies and for recognizing the client's plans.

Conflicts between delegation and adoption

In this section we do not consider the possible conflicts arising from the delegation of a control action.

Delegation and Entitlement

Postulate: If A delegates B the task τ, then A, implicitly or explicitly, entitles B to τ.

We will say that B is entitled by A to τ through the delegation (Delegates A B τ) when there is common (to A and B) knowledge that A is committed to not oppose, to not be astonished, etc., if B pursues τ [2].

An agent B can be entitled beyond delegation. Let us define over-entitlement as the case in which an agent A (or the role or the organization of B) entitles the agent B to an overhelp, or to a critical help, or to a critical overhelp or, finally, to a hypercritical help.

The over-entitlement can be:
- explicit (an explicit request from A to B to take care of more than the delegated task);
- implicit: in τ (for example, the delegated task is obviously incomplete or inadequate); in the role of B (the delegation implies a completion in the role of B).

Discrepancy and Conflict

There is correspondence between delegation and adoption when: (Delegates A B τ) ∧ (Adopts B A τ), and the agents A and B have the same interpretation of τ. When there is discrepancy (that is to say, the above correspondence does not hold), delegation conflicts may arise.

There is a delegation conflict each time:
- B interprets the delegation in a different way from A's interpretation (misunderstanding), or
- there is discrepancy and B adopts beyond the entitlement.
However, we are considering only "cooperative conflicts": conflicts originating from the cooperation between A and B; conflicts that B could produce because of his own goals are neglected. Also neglected are conflicts that arise because A believes B to be non-cooperative about the task.

Correspondence between delegation and adoption is not always the best help by B to A, nor, sometimes, the wanted help. In the great majority of cases good help needs planning ability and overhelp or critical help.

It is a general case that A delegates τ to B but entitles B beyond τ itself, and A has the expectation that B will do it.

Starting from the different cases of delegation, we will describe the main conflict classes:
- conflict on the kind of task: the client thinks she is delegating a certain kind of task (for example, a purely executive task) while on the contrary she is delegating a different kind of task (for example, an open task);
- conflict on the possibility to sub-delegate τ (or a subpart of τ);
- conflict on the meta-level delegated competencies;
- conflict on the contractor's entitlement: this conflict is present when the client thinks she has entitled the contractor more (or less) than is deducible from the delegation or from explicit or implicit entitlement.

[Table 2, flattened by the scan and only partially recoverable. Each row contrasts the client's point of view on τ (first column) with the corresponding contractor's point of view and the resulting outcome. Recoverable fragments: τ=α with α∈SB∧(α∈CActB) vs. α∉SB∧(α∈NRActB)∧(α∈SA) and ¬∃C | (Delegates B C α): conflict on the task / error of delegation; τ=α with α∈SB∧(α∈CActB) vs. α∉SB∧(α∈NRActB)∧(α∈BActA) and ¬∃C | (Delegates B C α): conflict on the task / error of delegation; τ=α with α∈SB vs. α∈NRActA: conflict on the task.]

Table 2


Table 2 considers the conflicts on the tasks (pov means point of view). We will now consider the different kinds of conflict on the basis of the various cases of adoption shown above.
c1: this is the conflict arising in the subhelp case, where the client delegates a task and the contractor adopts only a subpart of it.
In overhelp, several possibilities of conflict can arise, starting from (Delegates A B τ) [meaning that the delegation of τ is in the context of a wider task τ', with (dominates α' α): we will call this situation (Delegates A B τ (in τ τ'))]:

c2: There is a conflict when (Adopts B A τ1 (in τ τ1)) ∧ (dominates τ1 τ) ∧ (not-dominates τ' τ1) ∧ (CEK(B,α1)≠∅), either because of a misunderstanding (for example, the contractor supposes that in some interaction the client communicated information about τ1, or in the contractor's model of the client it is supposed that g1 is a client's goal, that is to say g1∈GA, etc.) or because of a difference of knowledge (for example, in the plan library of the contractor there is no reduction rule from α' into α, etc.). In this case the "over-part" of the plan that B adopts could be superfluous or even deleterious.

Superfluous, if the over-part is not useful in α', but it does not lose any useful result of α for α'; more formally: Pr(α,α',g')⊆R(α1). For example: A asks B: "What time is it?" and B answers: "Quiet, it is 17:00, the conference will start at 19:00, so we will arrive in due time". But actually A asked B the time because she had a telephone appointment.

Deleterious, when the over-part not only is not useful in α', but also loses some useful result of α for α'; more formally: Pr(α,α',g')⊄R(α1), and since Pr(α,α',g') = Rr(α,α',g) ∪ TRr(α,α',g), then ∃r | ((r∈Rr(α,α',g) [r is a relevant result of α in α' towards g]) ∨ (r∈TRr(α,α',g) [r is a transitory result of α in α' towards g])) ∧ r∉R(α1). Then r is a temporary result of α in α1: r∈Tr(α,α1). For example: A asks B to make-pesto; B (believing that A wants to realize the plan make-spaghetti-pesto) makes spaghetti-pesto. Actually A wants to make-tagliatelle-pesto. Another case is when, in c2, the action α is also changed.

c3: In overhelp, even if there is no conflict as in c2, an entitlement conflict is possible.

c4: A special misunderstanding without conflict is: (Adopts B A τ1 (in τ τ1) ∧ (in τ1 τ2)) ∧ (not-dominates τ' τ2) ∧ (dominates τ2 τ1 τ) ∧ (dominates τ' τ1 τ) ∧ (CEK(B,α1)≠∅). In this case, the contractor (B) adopts an overplan α1 of α, which he considers useful for an extra overplan α2 not considered by the client (A). In this operation B adopts the overplan α1, which is useful in the overplan α' too.

c5: In critical help there is a conflict when RB(α) does not correspond to g; then, when the contractor changes α into α1, this substitution is chosen starting from the results of α, RB(α), instead of from g. B is finding an action α1 in ActB such that: CEK(B,α1)≠∅, with RB(α)⊆RB(α1).

c6: In hypercritical help there is a conflict when A's preferences are different from the preferences A must have according to B. Given g1∈GA, g2∈GA: A prefers g1 to g2, while B thinks that g2 must be preferred to g1 by A.
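The c6 preference conflict is simply a reversed ordering between A's actual preference and the preference B ascribes to her. A minimal sketch (illustrative names only, not the paper's formalism):

```python
# Hedged sketch of the c6 "hypercritical" preference conflict: A's preference
# order over two of her goals is the reverse of the order B believes A
# ought to have.

def preference_conflict(a_prefers, b_thinks_a_should_prefer):
    """Each argument is an ordered pair (preferred, dispreferred)."""
    g1, g2 = a_prefers
    return b_thinks_a_should_prefer == (g2, g1)

print(preference_conflict(("go-cheaply", "travel-with-friends"),
                          ("travel-with-friends", "go-cheaply")))  # True
```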

Conclusions

We attempted to show that:
i) There are several levels of cooperation - more or less "deep" and helpful - and several levels of task delegation.

ii) These levels are related to the hierarchical structure of plans or tasks.
iii) There is a non-arbitrary correspondence between levels of delegation and levels of adoption; we called this relation "contract".
iv) A "deep" cooperation needs understanding of the plans, goals, and interests of the other agent or user.
v) A task delegation needs a representation of the practical and cognitive abilities (and resources) of the delegate; we called this representation "trust".
vi) There is a fundamental distinction between the delegation/adoption of: a domain task (practical action), a planning or problem solving action, or a control action.
vii) There could be several conflicts due either to some misunderstanding of the delegation, or to a mismatch between delegation and help, and vice versa.

Acknowledgements

We would like to thank Fiorella De Rosis for her precious remarks.

References
[1] Cohen, P. & Levesque, H., Intention is Choice with Commitment. Artificial Intelligence, 42(3), 1990.
[2] Castelfranchi, C., Commitment: from intentions to groups and organizations. In Proceedings of ICMAS'96, S. Francisco, June 1996, AAAI-MIT Press.
[3] Falcone, R., Castelfranchi, C., CHAPLIN: A Chart-based Plan Recognizer. Proceedings of the Thirteenth International Conference of Avignon, Avignon, France, 24-28 May, 1993.
[4] Vilain, M., Getting Serious about Parsing Plans: A Grammatical Analysis of Plan Recognition. In Proc. of IJCAI, 190-197, Boston, 1990.
[5] Kautz, H. A., A Formal Theory of Plan Recognition. PhD thesis, University of Rochester, 1987.
[6] Pollack, M., Plans as complex mental attitudes. In Cohen, P.R., Morgan, J. and Pollack, M.E. (eds), Intentions in Communication, MIT Press, USA, pp. 77-103, 1990.
[7] Conte, R. & Castelfranchi, C., Cognitive and Social Action, UCL Press, London, 1995.
[8] Chu-Carroll, J., Carberry, S., A Plan-Based Model for Response Generation in Collaborative Task-Oriented Dialogues. In Proceedings of AAAI-94, 1994.
[9] Poggi, I., Castelfranchi, C., Parisi, D., Answers, replies and reactions. In H. Parret, M. Sbisa', J. Verschueren (eds.), Possibilities and Limitations of Pragmatics. Studies in Language Companion Series (Vol. 7), Amsterdam: J. Benjamins, 1981.
[10] Rich, E., User Modeling via stereotypes. Cognitive Science, 3:329-354, 1984.
[11] Miceli, M., Cesta, A., Strategic Social Planning: Looking for Willingness in Multi-Agent Domains. In Proceedings of the Fifteenth Annual Conference of the Cognitive Science Society (pp. 741-746), 1993.
[12] Castelfranchi, C., Falcone, R. (1995), "To say and do": virtual actions in the structure and recognition of discourse plans with regard to practical plans. INTERACT '95, Lillehammer, Norway, 27-29 June, 1995.
