
1 A Cognitive Framework for Delegation to an Assistive User Agent
Karen Myers and Neil Yorke-Smith
Artificial Intelligence Center, SRI International

2 Overview
- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

3 CALO: Cognitive Assistant that Learns and Organizes
- CALO supports a high-level knowledge worker
  - Understands the “office world”, your projects and schedule
  - Performs delegated tasks on your behalf
  - Works with you to complete tasks
- Stays with you (and learns) over long periods of time
  - Learns to anticipate and fulfill your needs
  - Learns your preferred way of working
- Tracks execution of project tasks
- Helps manage time and commitments
- Performs tasks in collaboration with the user

4 CALO Year 2

5 Overview
- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

6 Delegation May Lead to Conflicts
- Focus on delegation of tasks from the user to CALO
  - Not on tasks to be performed in collaboration
  - One aspect of CALO's role as intelligent assistant
- CALO cannot act if there are conflicts over actions
- Conflicts in tasks:
  - “purchase this computer on my behalf”
  - “register me for the Fall Symposium”
- Conflicts in guidance:
  - “always ask for permissions by email”
  - “never use email for sensitive purchases”

7 Conflicts in User's Desires
- “I wish to be thin”
- “I wish to eat chocolate”
- But Richard Waldinger's scotch mocha brownies are full of calories → conflict between incompatible desires
- The user's desires conflict with each other
- Humans seem to have no problem with such conflicts
- CALO must recognize and respond appropriately

8 Other Types of Conflicts
- Current and new commitments
  - Currently CALO is undertaking tasks to:
    - Purchase an item of computer equipment
    - Register the user for a conference
  - Now the user tasks CALO to register for a second conference
  - The set of new goals is logically consistent and coherent
  - But it is infeasible because of insufficient discretionary funds
- Commitments and advice
  - The user tasks CALO to schedule a visitor's seminar in the best conference room
  - Existing advice: “Never change a booking in the auditorium without consulting me”
  - The new goal and the existing advice are inconsistent

9 The BDI Framework
- CALO's ability to act is based on the BDI framework
  - Beliefs = informational attitudes about the world
  - Desires = motivational attitudes on what to do
  - Intentions = deliberative commitments to act
- Realized in the SPARK agent system
  - A hierarchical, procedural reasoning framework
- BDI components in SPARK are represented as:
  - Facts (beliefs)
  - Intentions (goals/intentions)
  - Desires are not represented
  - Procedures are plans to achieve intentions
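To make this representation concrete, here is a minimal Python sketch, not SPARK's actual API; all names and fields are invented. It holds facts as beliefs, a list of intentions, and procedures as plans, with desires deliberately absent, mirroring the slide's point.

```python
# Minimal sketch (not SPARK's actual API) of the slide's BDI components:
# facts for beliefs, intentions being pursued, and procedures acting as plans.
from dataclasses import dataclass, field


@dataclass
class Procedure:
    """A plan: a named recipe that can achieve a goal."""
    achieves: str                                  # goal it can achieve
    steps: list[str] = field(default_factory=list)


@dataclass
class AgentState:
    beliefs: dict[str, object] = field(default_factory=dict)    # facts
    intentions: list[str] = field(default_factory=list)         # goals being executed
    procedures: list[Procedure] = field(default_factory=list)   # plan library
    # Desires are deliberately absent: SPARK does not represent them directly.


state = AgentState(
    beliefs={"discretionary_funds": 2000},
    intentions=["purchase_laptop"],
    procedures=[Procedure("purchase_laptop", ["get_quote", "submit_order"])],
)
print(state.beliefs["discretionary_funds"])
```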

10 Desires vs. Goals
- Both are motivational attitudes
- Desires may be neither coherent (with beliefs) nor consistent (with each other)
- Goals must be both
- Desires are ‘wishes’; goals are ‘wants’
  - “I wish to be thin and I wish to eat chocolate”
  - “I want to have another of Richard's brownies”
- Desires lead to goals
  - CALO's primary desire: satisfy its user
  - Secondary desires → goals to do what the user asks

11 ‘BDI’ Agents are Really ‘BGI’
- Decision theory emphasizes B and D
- AI agent theory emphasizes B and I
- In most BDI literature, ‘Desires’ and ‘Goals’ are confounded
- In practice, the focus is on:
  - goal and then intention selection
  - option generation
  - plan execution and scheduling
- Focus has been much less on (though vital for CALO):
  - deliberating over desires
  - goal generation
  - advisability

12 The Problem with BGI
- When Desires and Goals are unified into a single motivational attitude:
  - Can't support conflicting D/G (and D/B)
  - Hard to express goal generation
  - Hard to diagnose and resolve conflicts
    - Between D/G and I, and between G, I, and plans
  - Hard to handle conflicts in advice
- How can CALO make sense of the user's taskings in order to act upon them?
- How can CALO recognize and respond to (potential) conflicts?

13 Overview
- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

14 Cognitive Models for Delegation
[Diagram: Beliefs, Desires, and Goals of the user (B_user, D_user, G_user) and of the agent (B_agent, D_agent, G_C, G_A), linked by alignment, delegation, refinement, decision making, and goal adoption; Candidate Goals become Adopted Goals so that the agent believes it should do the assigned tasks and satisfy all tasks.]

15 Delegative BDI Agent Architecture
[Diagram: the agent's Beliefs (B), Desires (D), Candidate Goals (G_C), Adopted Goals (G_A), and Intentions (I), with Goal Advice (A_G) and Execution Advice (A_E) from the user; execution and sub-goaling feed back failures, conflicts, and revisions.]

16 Overview
- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

17 Requirements on Goal Adoption
- Self-consistency: G_A must be mutually consistent
- Coherence: G_A must be mutually consistent relative to the current beliefs B
- Feasibility: G_A must be mutually satisfiable relative to current intentions I and available plans
  - Includes resource feasibility
- Reasonableness: G_A should be mutually ‘reasonable’ with respect to current B and I
  - Common-sense check: did you really mean to purchase a second laptop computer today?
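The four requirements can be read as predicates over the candidate goal set, the beliefs B, and the intentions I. The sketch below is illustrative only and is not the CALO/SPARK implementation; goals are encoded as plain dicts with invented fields ("excludes", "cost", "requires_true") so the checks can run.

```python
# Illustrative sketch of the four goal-adoption requirements as predicates.

def self_consistent(goals):
    """Self-consistency: no goal names another candidate as incompatible."""
    names = {g["name"] for g in goals}
    return all(not (set(g.get("excludes", [])) & names) for g in goals)

def coherent(goals, beliefs):
    """Coherence: no goal depends on a belief that is currently false."""
    return all(beliefs.get(p, True) for g in goals for p in g.get("requires_true", []))

def feasible(goals, intentions, beliefs):
    """Feasibility: jointly satisfiable, including resource feasibility."""
    needed = sum(g.get("cost", 0) for g in goals + intentions)
    return needed <= beliefs.get("discretionary_funds", 0)

def reasonable(goals, intentions):
    """Common-sense check, e.g. flag a second laptop purchase on the same day."""
    active = {g["name"] for g in intentions}
    return all(g["name"] not in active for g in goals)

beliefs = {"discretionary_funds": 3000}
intentions = [{"name": "purchase_laptop", "cost": 1500}]
candidates = [{"name": "attend_aaai", "cost": 1200}]
print(self_consistent(candidates), coherent(candidates, beliefs),
      feasible(candidates, intentions, beliefs), reasonable(candidates, intentions))
```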

18 Responding to Conflicting Desires
- The goal adoption process should admit:
  - Adopting, suspending, or rejecting candidate goals
  - Modifying adopted goals and/or intentions
  - Modifying beliefs (by acting to change the world state)
- Example: the user desires to attend a conference in Europe but lacks sufficient discretionary funds
  - Shorten a previously scheduled trip,
  - Cancel the planned purchase of a new laptop, or
  - Apply for a travel grant from the department

19 Combined Commitment Deliberation
- Goal adoption: Adopted Goals ⊆ Candidate Goals (⊆ Desires)
- Intention reconsideration
- Extended agent life-cycle
  - Non-adopted Candidate Goals
  - Execution problems with Adopted Goals
- Propose a combined commitment deliberation mechanism
  - Based on the agent's deliberation over its mental states
  - Bounded rationality: as far as the agent believes and can compute

20 BDI Control Cycle
[Diagram: a cycle in which the agent identifies changes to its mental state, decides on a response (commitment deliberation), and performs actions as the world state changes.]
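A minimal sketch of how this cycle could be looped in code, assuming a toy dict-based mental state and a placeholder deliberation policy; the real commitment deliberation is what the following slides describe.

```python
# Sketch of the extended BDI control cycle: observe, deliberate, act.

def observe(state, events):
    """Identify changes to the mental state from world-state changes."""
    state["beliefs"].update(events)
    return state

def deliberate(state):
    """Commitment deliberation (placeholder policy: adopt every candidate
    goal that has not already been adopted)."""
    new = [g for g in state["candidate_goals"] if g not in state["adopted_goals"]]
    state["adopted_goals"].extend(new)
    return state

def act(state):
    """Perform actions for adopted goals (stubbed as printing)."""
    for g in state["adopted_goals"]:
        print("working on:", g)

state = {"beliefs": {}, "candidate_goals": ["attend_aaai"], "adopted_goals": []}
for events in [{"funds": 3000}, {"funds": 1800}]:   # two cycles of world changes
    state = observe(state, events)
    state = deliberate(state)
    act(state)
```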

21 Mental State Transitions
- Current mental state S = (B, G_C, G_A, I)
  - D is omitted, assuming a single “satisfy user” desire
- Outcome of deliberation is a new state S'
- Possible new transitions:
  - Expansion: adopt an additional goal
    - No modification to existing goals or intentions
  - Revocation: drop an adopted goal + intention
    - To enable a different goal in the future
  - Proactive: create a new candidate goal and adopt it
    - To enable a current candidate goal in the future
- Plus standard BGI transitions
  - E.g. drop an intention due to plan failure
[Inset: the observe / decide / act cycle with commitment deliberation]
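The three new transition types can be pictured as functions from a mental state S = (B, G_C, G_A, I) to a successor S'. The sketch below uses an invented dict encoding, not the paper's formalism.

```python
# Hedged sketch of the three new transition types over S = (B, G_C, G_A, I).
from copy import deepcopy

def expansion(S, goal):
    """Adopt an additional goal; existing goals and intentions untouched."""
    S2 = deepcopy(S)
    S2["G_A"].append(goal)
    return S2

def revocation(S, goal, intention):
    """Drop an adopted goal and its intention, e.g. to free resources."""
    S2 = deepcopy(S)
    S2["G_A"].remove(goal)
    S2["I"].remove(intention)
    return S2

def proactive(S, new_goal):
    """Create a new candidate goal and adopt it (e.g. apply for a travel grant)."""
    S2 = deepcopy(S)
    S2["G_C"].append(new_goal)
    S2["G_A"].append(new_goal)
    return S2

S = {"B": {}, "G_C": ["c1", "c2"], "G_A": ["g1"], "I": ["i1"]}
print(proactive(S, "apply_for_grant")["G_A"])   # ['g1', 'apply_for_grant']
```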

22 Goal and Intention Attributes
Goals:
- User-specified value/utility (can be time-varying)
- User-specified priority
- User-specified deadline
- Estimated cost to achieve
- Level of commitment so far (for adopted goals)
Intentions:
- Implied value/utility
- Cost of change (deliberative effort, loss of utility, delay)
- Level of commitment
- Level of effort so far (e.g. estimated % complete)
- Estimated cost to complete
- Estimated probability of success
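One way to picture these attributes is as plain record types. The field names below paraphrase the slide's bullets and are not the implemented CALO schema.

```python
# Sketch of goal and intention attributes as dataclasses (illustrative only).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class GoalAttrs:
    value: Callable[[float], float]      # user-specified utility, possibly time-varying
    priority: int                        # user-specified priority
    deadline: Optional[float] = None     # user-specified deadline
    estimated_cost: float = 0.0          # estimated cost to achieve
    commitment_so_far: float = 0.0       # only meaningful for adopted goals

@dataclass
class IntentionAttrs:
    implied_value: float                 # utility implied by the goal it serves
    change_cost: float                   # deliberative effort + lost utility + delay
    percent_complete: float = 0.0        # level of effort so far
    cost_to_complete: float = 0.0        # estimated remaining cost
    prob_success: float = 1.0            # estimated probability of success

g = GoalAttrs(value=lambda t: 10.0, priority=2, deadline=100.0, estimated_cost=1200.0)
i = IntentionAttrs(implied_value=10.0, change_cost=3.0, percent_complete=0.4)
print(g.value(0.0), i.percent_complete)
```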

23 Making the Best Decision
- S → S' transition as a multi-criteria optimization
  - Maximize (minimize) some combination of criteria over S
  - Can be simple or complex
  - Bounded rationality
  - Simple default strategy, customizable by the user
- Advice acts as constraints → a constrained (soft) multi-criteria optimization problem
  - “Don't drop any intention > 70% complete”
- The assistive agent can consult the user if there is no clear best S'
  - “Should I give up on purchasing a laptop, in order to satisfy your decision to travel to both conferences?”
- Learn and refine a model of the user's preferences
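A toy illustration, with invented criteria and weights, of treating the choice of S' as a constrained multi-criteria optimization: advice filters options as hard constraints, then a weighted score ranks the rest.

```python
# Sketch: pick the successor state S' by constrained multi-criteria scoring.

def admissible(option, advice):
    """Advice as hard constraints, e.g. never drop an intention > 70% complete."""
    return all(rule(option) for rule in advice)

def score(option, weights):
    """Weighted combination of criteria; could equally be a soft-constraint model."""
    return sum(weights[k] * option.get(k, 0.0) for k in weights)

options = [
    {"name": "expansion",  "utility": 5.0, "change_cost": 0.0, "max_dropped_pct": 0.0},
    {"name": "revocation", "utility": 6.0, "change_cost": 2.0, "max_dropped_pct": 0.8},
]
advice = [lambda o: o["max_dropped_pct"] <= 0.7]
weights = {"utility": 1.0, "change_cost": -0.5}

viable = [o for o in options if admissible(o, advice)]
print(max(viable, key=lambda o: score(o, weights))["name"])   # -> "expansion"
```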

24 Example
- Candidate goals:
  - c1: “Purchase a laptop”
  - c2: “Attend AAAI”
- Adopted goals and intentions:
  - g1 with intention i1: “Purchase a high-end laptop using general funds”
  - g2 with intention i2: “Attend AAAI and its workshops, staying in conference hotel”
- New candidate goal from the user:
  - c3: “Attend AAMAS” (high priority)
- Mental state S = (B, {c1, c2, c3}, {g1, g2}, {i1, i2})

25 Example (cont.)
- CALO finds it cannot adopt c3: {g1, g2, g3} has resource contention (insufficient general funds)
- Options include:
  1. Do not adopt c3 (don't attend AAMAS)
  2. Drop c1 or c2 (the laptop purchase or AAAI attendance)
  3. Modify g2 to attend only the main AAAI conference
     - But changing i2 incurs a financial penalty
  4. Adopt a new candidate goal c4 to apply for a departmental travel grant
- Advice prohibits option 2

26 Example (cont.)
- CALO builds an optimization problem and solves it
  - The problem constructed and the solution method employed both depend on the agent's nature
    - E.g. ignore the % of the intention completed; take no more than 10 ms to solve
- Finds the best is a tie between options 3 and 4
- The agent's strategy (based on user guidance) is to consult the user over which to do
- The user instructs CALO to do both options
- New mental state S' = (B', {c1, c2, c3, c4}, {g1, g'2, g3, g4}, {i1, i'2})
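For concreteness, the example of slides 24-26 can be encoded as data: advice removes option 2, options 3 and 4 tie, the user is consulted, and both are applied to form S'. The field names and scores below are invented; only the shape of the outcome follows the slides.

```python
# The slides' worked example rendered as data (illustrative encoding).
S = {"B": {"general_funds": "insufficient_for_all"},
     "G_C": ["c1_laptop", "c2_aaai", "c3_aamas"],
     "G_A": ["g1_laptop", "g2_aaai_full"],
     "I":   ["i1_purchase_laptop", "i2_attend_aaai_full"]}

options = {
    1: {"desc": "do not adopt c3",                  "score": 2.0, "allowed": True},
    2: {"desc": "drop c1 or c2",                    "score": 4.0, "allowed": False},  # forbidden by advice
    3: {"desc": "modify g2: main conference only",  "score": 5.0, "allowed": True},
    4: {"desc": "adopt c4: apply for travel grant", "score": 5.0, "allowed": True},
}

viable = {k: v for k, v in options.items() if v["allowed"]}
best = max(v["score"] for v in viable.values())
chosen = [k for k, v in viable.items() if v["score"] == best]
if len(chosen) > 1:
    print("tie between options", chosen, "- consult the user")
# Per the slides, the user instructs CALO to pursue both tied options.
S_prime = {"B": dict(S["B"]),
           "G_C": S["G_C"] + ["c4_travel_grant"],
           "G_A": ["g1_laptop", "g2_aaai_main_only", "g3_aamas", "g4_travel_grant"],
           "I":   ["i1_purchase_laptop", "i2_attend_aaai_main_only"]}
print(sorted(chosen), S_prime["G_A"])
```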

27 Overview
- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

28 Summary
- CALO acts as the user's intelligent assistant
- Classical BDI framework inadequate
  - Implemented BDI systems lack formal grounding
- Proposed delegative BDI agent framework
  - Separate Desires and Goals
  - Separate Candidate and Adopted Goals
  - Incorporate user guidance and preferences
  - Combined commitment deliberation for goal adoption and intention reconsideration
- Enables the reasoning necessary for an agent such as CALO
- Implemented by extending the SPARK agent framework

29 Related Work
- BOID framework [Broersen et al.]
  - Different types of agents based on B/D/G/I conflict resolution strategies
- BDGI CTL logic [Dastani et al.]
  - Merging desires into goals
- Intention reconsideration [Schut et al.]
- Collaborative problem solving [Levesque and Cohen; Allen and Ferguson]
- Social norms and obligations [Dignum et al.]

30 Future Work
- Extend goal reasoning to consider resource feasibility (in progress)
- Proactive goal anticipation and adoption
- Collaborative human-CALO problem solving
  - Beyond (merely) completing user-delegated tasks
- Multi-CALO coordination and teamwork
- Learning as part of CALO's extended life-cycle
More information: http://calo.sri.com/

