Do software agents know what they talk about?


1 Do software agents know what they talk about?
Agents and Ontology
Dr. Patrick De Causmaecker, Nottingham, March 2005

2 Reactive and hybrid agents

3 Reactive agents
Symbolic representations and their manipulation do not work
Intelligent behaviour is linked to the environment in which the agent resides
Intelligent behaviour emerges from simple actions

4 The “subsumption” architecture (Brooks)
No explicit representations
No abstract reasoning
Intelligence emerges from complexity.

5 The architecture
Two characteristics:
(1) Decisions through execution of tasks (behaviours as finite state machines): no symbolic representations or reasoning, just situation -> action
(2) The different behaviours can fire simultaneously: hierarchy in priority layers
The see(…) function is still present, but performs no complex transformations on the sensor signals.

6 Action selection
R: the set of behaviour rules
<: the priority (inhibition) relation on R

    function action(p : P) : A
        var fired : ℘(R)
        var selected : A
    begin
        fired <- {(c, a) | (c, a) in R and p in c}
        for each (c, a) in fired do
            if there is no (c', a') in fired such that (c', a') < (c, a) then
                return a
            end-if
        end-for
        return null
    end function action
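As an illustration, here is a minimal Python sketch of this selection scheme, assuming behaviours are (condition, action) pairs carrying an integer priority where a lower number inhibits a higher one; the names Behaviour and select_action are illustrative, not from the slides.

    from typing import Callable, List, Optional

    class Behaviour:
        """One situation -> action rule of the subsumption architecture."""
        def __init__(self, name: str, condition: Callable[[dict], bool],
                     action: str, priority: int):
            self.name = name
            self.condition = condition   # c: does the current percept satisfy the rule?
            self.action = action         # a: the action proposed when the rule fires
            self.priority = priority     # lower number = higher priority (inhibits the others)

    def select_action(percept: dict, rules: List[Behaviour]) -> Optional[str]:
        # fired <- all rules whose condition covers the current percept
        fired = [b for b in rules if b.condition(percept)]
        for b in fired:
            # select b only if no other fired behaviour takes precedence over it
            if not any(other.priority < b.priority for other in fired):
                return b.action
        return None   # no behaviour fired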

7 Steels
[figure slide; no further recoverable text]

8 [figure slide; no recoverable text]

9 Luc Steels: the Mars explorer
Find stones on Mars. We do not know where the precious stones are, but they come in heaps. We have a number of autonomous vehicles that can drive around and pick up stones to bring them to the mother ship. There are obstacles that make communication virtually impossible.

10 Symbolic?
Luc Steels says this is completely unrealistic. His method:
Apply a gradient field that indicates the direction of the mother ship (radio signal)
Use radioactive crumbs that can be dropped, detected and picked up by the vehicles.
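As a toy illustration of how such an environment might be modelled, a grid sketch in which the gradient is a signal that grows towards the mother ship and crumbs are per-cell counters; the class name, grid representation and Manhattan-distance gradient are assumptions, not Steels' actual simulation.

    class MarsGrid:
        """Toy grid world: gradient signal from the base plus radioactive crumb counters."""
        def __init__(self, width: int, height: int, base: tuple):
            self.width, self.height = width, height
            self.base = base          # position of the mother ship
            self.crumbs = {}          # (x, y) -> number of crumbs on that cell

        def gradient(self, pos: tuple) -> int:
            # Stronger (less negative) closer to the base; agents only compare neighbouring cells.
            return -(abs(pos[0] - self.base[0]) + abs(pos[1] - self.base[1]))

        def up_gradient_step(self, pos: tuple) -> tuple:
            # One step towards the base: the in-bounds neighbour with the strongest signal.
            x, y = pos
            neighbours = [(nx, ny) for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                          if 0 <= nx < self.width and 0 <= ny < self.height]
            return max(neighbours, key=self.gradient)

        def drop_crumbs(self, pos: tuple, n: int = 2) -> None:
            self.crumbs[pos] = self.crumbs.get(pos, 0) + n

        def pick_up_crumb(self, pos: tuple) -> bool:
            if self.crumbs.get(pos, 0) > 0:
                self.crumbs[pos] -= 1
                return True
            return False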

11 The agents
Rules:
(1) if detect an obstacle then change direction
(2) if carrying samples and at the base then drop samples
(3) if carrying samples and not at the base then travel up gradient
(4) if detect a sample then pick sample up
(5) if true then move randomly
Priority levels: (1) < (2) < (3) < (4) < (5)
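For illustration, the five rules written with the Behaviour/select_action sketch given under the action-selection slide; the percept keys (detect_obstacle, carrying_samples, at_base, detect_sample) are assumed names.

    # The five Mars-explorer rules as prioritised behaviours; (1) takes precedence over (5).
    mars_rules = [
        Behaviour("avoid", lambda p: p["detect_obstacle"], "change_direction", 1),
        Behaviour("drop", lambda p: p["carrying_samples"] and p["at_base"], "drop_samples", 2),
        Behaviour("return", lambda p: p["carrying_samples"] and not p["at_base"], "travel_up_gradient", 3),
        Behaviour("pick_up", lambda p: p["detect_sample"], "pick_sample_up", 4),
        Behaviour("wander", lambda p: True, "move_randomly", 5),
    ]

    percept = {"detect_obstacle": False, "carrying_samples": True,
               "at_base": False, "detect_sample": False}
    print(select_action(percept, mars_rules))   # -> "travel_up_gradient"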

12 Co-operation
Rule 3 changes to rule 6:
(6) if carrying samples and not at the base then drop two crumbs and travel up gradient
Rule 8 is added:
(8) if sense crumbs then pick up 1 crumb and travel down gradient
Priority: (1) < (2) < (6) < (4) < (8) < (5)
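The same sketch with rule 3 replaced by rule 6 and rule 8 inserted ahead of random movement; sense_crumbs is again an assumed percept key.

    coop_rules = [
        Behaviour("avoid", lambda p: p["detect_obstacle"], "change_direction", 1),
        Behaviour("drop", lambda p: p["carrying_samples"] and p["at_base"], "drop_samples", 2),
        # rule 6 replaces rule 3: leave a crumb trail while returning to the base
        Behaviour("return", lambda p: p["carrying_samples"] and not p["at_base"],
                  "drop_two_crumbs_and_travel_up_gradient", 3),
        Behaviour("pick_up", lambda p: p["detect_sample"], "pick_sample_up", 4),
        # rule 8: follow an existing trail away from the base, thinning it as you go
        Behaviour("follow_trail", lambda p: p["sense_crumbs"],
                  "pick_up_one_crumb_and_travel_down_gradient", 5),
        Behaviour("wander", lambda p: True, "move_randomly", 6),
    ]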

13 Similar development
Pattie Maes: Agent Network Architecture (ANA)

14 Limitations of reactive agents
The local environment must be rich. How can non-local information be modelled?
The agents have a short-term view. How can the agents learn?
How do we design systems with emergent behaviour? What if complexity requires many layers?
-> evolving agents, artificial life

15 Hybrid agents
[figure: layered agent architectures, with layers 1 to n between perception input and action output]

16 TouringMachines
[figure: TouringMachines architecture with sensor input, a perception subsystem, the modelling, planning and reactive layers, a control subsystem, an action subsystem and action output]

17 The TouringMachine
Reactive layer: e.g. avoid obstacles
Planning layer: proactive behaviour using a library of plan schemata
Modelling layer: represents objects and other agents in the system; predicts conflicts and postulates objectives for the planning layer
The control subsystem decides which of the layers is active at a given moment.
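A schematic sketch of that control idea, not Ferguson's actual implementation: each layer proposes an action for the current percept and a simple control policy decides whose proposal is acted on (class, method and percept names are all assumptions).

    from typing import Optional

    class TouringMachineSketch:
        """Illustrative three-layer agent with a simple control policy."""

        def reactive_layer(self, percept: dict) -> Optional[str]:
            # Situation -> action rules, e.g. obstacle avoidance.
            return "avoid_obstacle" if percept.get("obstacle_close") else None

        def planning_layer(self, percept: dict) -> Optional[str]:
            # Proactive behaviour drawn from a library of plan schemata (stubbed here).
            return "follow_current_plan"

        def modelling_layer(self, percept: dict) -> Optional[str]:
            # Models objects and other agents; on a predicted conflict it posts a new objective.
            return "replan_around_conflict" if percept.get("predicted_conflict") else None

        def control(self, percept: dict) -> str:
            # Control subsystem: decide which layer is in charge at this moment.
            # Here the reactive layer pre-empts the modelling layer, which pre-empts the planner.
            for layer in (self.reactive_layer, self.modelling_layer, self.planning_layer):
                proposal = layer(percept)
                if proposal is not None:
                    return proposal
            return "do_nothing"

In the real TouringMachines the control subsystem mediates between the layers with context-dependent control (censor and suppressor) rules rather than a fixed ordering as used in this sketch.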

