Agent Communications BDI Definitions

Agent Communications: BDI Definitions
CPSC 601.68/CPSC 599.68
Rob Kremer, Department of Computer Science, University of Calgary
Based on: Michael Wooldridge. Reasoning about Rational Agents. MIT Press, Cambridge, Mass., 2000. Chapter 2.
09/11/2018

Plan

Plan
    Preconditions:  ℘ Predicate
    Postconditions: ℘ Predicate
    Actions:        Seq Action

hd:   Plan → Action    // returns head(Plan.Actions)
tail: Plan → Plan      // returns {Pre, Post, tail(Actions)}
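As a sketch, the Plan record and its hd/tail accessors could be rendered in Python like this (the Predicate and Action type aliases are illustrative stand-ins, not definitions from the slides):

```python
from dataclasses import dataclass
from typing import Callable, List

# Illustrative stand-ins for the slide's Predicate and Action types.
Predicate = Callable[[], bool]
Action = str

@dataclass
class Plan:
    pre: List[Predicate]    # Preconditions
    post: List[Predicate]   # Postconditions
    actions: List[Action]   # Actions: a sequence of actions

def hd(plan: Plan) -> Action:
    """hd: Plan -> Action -- returns head(Plan.Actions)."""
    return plan.actions[0]

def tail(plan: Plan) -> Plan:
    """tail: Plan -> Plan -- returns {Pre, Post, tail(Actions)}."""
    return Plan(plan.pre, plan.post, plan.actions[1:])
```

Note that tail() leaves the pre- and postconditions untouched and only consumes the action sequence, matching the slide's comment.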

BDI Agent

BDIagent
    B: ℘ Bel
    D: ℘ Des
    I: ℘ Int

getNextPercept: → Percept
brf:     ℘ Bel × Percept → ℘ Bel
options: ℘ Bel × ℘ Int → ℘ Des
filter:  ℘ Bel × ℘ Des × ℘ Int → ℘ Int
plan:    ℘ Bel × ℘ Int → Plan
execute: Plan

A simple BDI agent

B := B0; I := I0;   /* B0 are initial beliefs; I0 are initial intentions */
while true do {
    p := getNextPercept();
    B := brf(B, p);
    D := options(B, I);
    I := filter(B, D, I);
    π := plan(B, I);
    execute(π)
}

Stuck to a plan: if the plan fails (becomes unsound), then the agent really should change the plan.
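A minimal runnable sketch of this loop in Python. The deliberation functions (brf, options, filter, plan) are trivial stubs invented for illustration, not the book's definitions, and "while true" is bounded by the list of percepts so the example terminates:

```python
def run_simple_bdi(percepts):
    """Run the simple BDI loop once per percept. Note that each plan is
    executed blindly from start to finish: the agent never reconsiders."""
    B, I = set(), set()     # B0, I0: initial beliefs and intentions
    executed = []           # record of every action 'executed'

    def brf(B, p):          # stub belief revision: accumulate percepts
        return B | {p}

    def options(B, I):      # stub desires: want to 'handle' every belief
        return {f"handle_{b}" for b in B}

    def filter_(B, D, I):   # stub filter: commit to all current desires
        return D

    def plan(B, I):         # stub planner: one action per intention
        return sorted(I)

    for p in percepts:
        B = brf(B, p)
        D = options(B, I)
        I = filter_(B, D, I)
        pi = plan(B, I)
        executed.extend(pi)  # execute(pi): run the whole plan, no checks
    return executed
```

Even this toy version shows the weakness the slide points out: once `plan()` returns, nothing in the inner execution can react to a changed world.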

BDI agent that reacts if the plan is unsound

B := B0; I := I0;   /* B0 are initial beliefs; I0 are initial intentions */
while true do {
    p := getNextPercept();
    B := brf(B, p);
    D := options(B, I);
    I := filter(B, D, I);
    π := plan(B, I);
    while not empty(π) do {
        a := hd(π); execute(a); π := tail(π);
        if not sound(π, I, B) then
            π := plan(B, I)
    }
}

This deals with unsound plans, but won't drop intentions: what if we have unexpected early success, or conditions arise that make our intentions useless or unattainable? We might want to change our intentions.

BDI agent that may drop intentions

B := B0; I := I0;   /* B0 are initial beliefs; I0 are initial intentions */
while true do {
    p := getNextPercept();
    B := brf(B, p);
    D := options(B, I);
    I := filter(B, D, I);
    π := plan(B, I);
    while not (empty(π) or succeeded(I, B) or impossible(I, B)) do {
        a := hd(π); execute(a); π := tail(π);
        if not sound(π, I, B) then
            π := plan(B, I)
    }
}

But if the environment changes, we may need to reconsider intentions.
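The inner execution loop above, with its succeeded/impossible guards and soundness check, can be sketched as a Python function. All of the predicates and the replanner are passed in as stubs (assumptions for illustration):

```python
def execute_until_done(pi, I, B, succeeded, impossible, sound, replan):
    """Execute the plan pi action by action, stopping early if the
    intentions I have already succeeded or become impossible (given
    beliefs B), and replanning whenever the remaining plan is unsound."""
    executed = []
    while pi and not succeeded(I, B) and not impossible(I, B):
        a, pi = pi[0], pi[1:]      # a := hd(pi); pi := tail(pi)
        executed.append(a)         # execute(a)
        if not sound(pi, I, B):
            pi = replan(B, I)      # pi := plan(B, I)
    return executed
```

Plugging in a `succeeded` that fires immediately shows the early-exit behaviour: the agent drops a plan it no longer needs instead of grinding through it.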

Cautious agent that reconsiders after each action

B := B0; I := I0;   /* B0 are initial beliefs; I0 are initial intentions */
while true do {
    p := getNextPercept();
    B := brf(B, p);
    D := options(B, I);
    I := filter(B, D, I);
    π := plan(B, I);
    while not (empty(π) or succeeded(I, B) or impossible(I, B)) do {
        a := hd(π); execute(a); π := tail(π);
        p := getNextPercept();
        B := brf(B, p);
        D := options(B, I);
        I := filter(B, D, I);
        if not sound(π, I, B) then
            π := plan(B, I)
    }
}

But this agent might spend almost all of its time considering intentions and no time actually doing useful things.

BDI agent that attempts to strike a balance

B := B0; I := I0;   /* B0 are initial beliefs; I0 are initial intentions */
while true do {
    p := getNextPercept();
    B := brf(B, p);
    D := options(B, I);
    I := filter(B, D, I);
    π := plan(B, I);
    while not (empty(π) or succeeded(I, B) or impossible(I, B)) do {
        a := hd(π); execute(a); π := tail(π);
        if reconsider(I, B) then {
            D := options(B, I);
            I := filter(B, D, I);
        }
        if not sound(π, I, B) then
            π := plan(B, I)
    }
}

This is only useful if reconsider() is a lot cheaper than options() and filter().
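The cost trade-off can be made concrete with a small counting sketch: a cheap reconsider() test runs after every action, but the expensive deliberation (options + filter) only runs when that test fires. The step indices in `reconsider_at` are an invented stand-in for whatever heuristic a real reconsider() would use:

```python
def run_balanced_inner_loop(plan_actions, reconsider_at):
    """Walk through one plan, counting cheap reconsider() checks versus
    full deliberations. 'reconsider_at' is an illustrative set of step
    indices at which the cheap test decides deliberation is worthwhile."""
    cheap_checks = 0
    full_deliberations = 0
    pi = list(plan_actions)
    step = 0
    while pi:
        pi.pop(0)                    # a := hd(pi); execute(a); pi := tail(pi)
        cheap_checks += 1            # reconsider(I, B): cheap, every step
        if step in reconsider_at:
            full_deliberations += 1  # options() + filter(): expensive, rare
        step += 1
    return cheap_checks, full_deliberations
```

If reconsider() were as expensive as deliberation itself, the counts would be irrelevant: the agent would pay the full price every step anyway, which is exactly the slide's point.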