CMPE539 AGENT COMMUNICATION

CMPE539 AGENT COMMUNICATION Dr. ADNAN ACAN

Communication Agents communicate; this is one of the defining characteristics of a multiagent system. The communication is taken to have a certain form (syntax), to carry a certain meaning (semantics), and to be influenced by various circumstances of the communication (pragmatics).

Communication “Doing by talking” I: cheap talk Consider the Prisoner’s Dilemma game: Suppose now that the prisoners are allowed to communicate before they play; will this change the outcome of the game?
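The Prisoner’s Dilemma payoff matrix appears only as an image on the slide. As a hedged stand-in, the sketch below assumes standard Prisoner’s Dilemma payoffs (the specific numbers are an assumption, not taken from the slide) so that the dominance argument on the next slide can be checked concretely: D is a best response to anything the other agent says or does.

```python
# Assumed Prisoner's Dilemma payoffs (illustrative numbers, not the slide's matrix).
# payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
payoffs = {
    ("C", "C"): (-1, -1),
    ("C", "D"): (-4, 0),
    ("D", "C"): (0, -4),
    ("D", "D"): (-3, -3),
}

def row_best_response(col_action):
    """Row player's best action given the column player's action."""
    return max(["C", "D"], key=lambda a: payoffs[(a, col_action)][0])

# D is the best response whatever the other prisoner announces or plays,
# so pre-play talk cannot change the outcome.
assert row_best_response("C") == "D"
assert row_best_response("D") == "D"
```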

Communication Intuitively, the answer is no. Regardless of the other agent’s action, the given agent’s best action is still D; the other agent’s talk is indeed cheap. Furthermore, regardless of his true intention, it is in the interest of a given agent to get the other agent to play C; his talk is not only cheap, but also not credible. Now, let’s consider the coordination game:

Communication Here, if the row player declares “I will play U” prior to playing the game, the column player should take this seriously. Indeed, this utterance by the row player is both self-committing and self-revealing. A declaration of intent is self-committing if, once uttered, and assuming it is believed, the optimal course of action for the player is indeed to act as declared. For example, if the column player believes the utterance “I will play U,” then his best response is to play L. But then the row player’s best response is indeed to play U.
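The coordination game matrix is likewise only an image on the slide. A minimal sketch, assuming an illustrative pure coordination payoff matrix (the numbers are an assumption), verifies the best-response chain just described: believing “I will play U” makes L the column player’s best response, and U is then indeed the row player’s best response.

```python
# Assumed pure coordination game (illustrative payoffs, not the slide's matrix).
payoffs = {
    ("U", "L"): (1, 1),
    ("U", "R"): (0, 0),
    ("D", "L"): (0, 0),
    ("D", "R"): (1, 1),
}

def col_best_response(row_action):
    return max(["L", "R"], key=lambda a: payoffs[(row_action, a)][1])

def row_best_response(col_action):
    return max(["U", "D"], key=lambda a: payoffs[(a, col_action)][0])

# The utterance "I will play U" is self-committing: if believed, the column
# player plays L, and playing U is then optimal for the row player.
assert col_best_response("U") == "L"
assert row_best_response("L") == "U"
```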

Communication In contrast, an utterance is self-revealing if, assuming that it is uttered with the expectation that it will be believed, it is uttered only when indeed the intention was to act that way. In our example, a row player intending to play D will never announce the intention to play U, and so the utterance is self-revealing. Babbling equilibrium: every cheap talk game has a babbling equilibrium; that is, there is always an equilibrium in which one party sends a meaningless signal and the other party ignores it. An equilibrium that is not a babbling equilibrium is called a revealing equilibrium.

Communication It might seem that self-commitment and self-revelation are inseparable, but this is an artifact of the pure coordination nature of the game. In such games the utterance creates a so-called focal point, a signal on which the agents can coordinate their actions. Let’s now consider the Stag Hunt game, whose payoff matrix (Player 1 as the row player, Player 2 as the column player) is shown on the slide:

Communication Consider the message of Player 1 “I plan to hunt stag.” It is not self-revealing; Player 1 would like Player 2 to believe this, even if she does not actually plan to hunt stag. However, it is self-committing; if Player 1 were to think that Player 2 believes her, then Player 1 would actually prefer to hunt stag. There is however the question of whether Player 2 would believe the utterance, knowing that it is not self-revealing on the part of Player 1.

Communication Consider the following payoff matrix of the Stag Hunt game: If x is less than 7, then the message “I plan to hunt stag” is possibly credible. However, if x is greater than 7, then that message is not at all credible, because it is in Player 1’s best interest to get Player 2 to hunt stag, no matter what Player 1 actually intends to play.
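The parameterized payoff matrix is shown only on the slide. The sketch below assumes an illustrative Stag Hunt in which x is Player 1’s payoff for hunting hare while Player 2 hunts stag; every number other than the threshold 7 mentioned above is an assumption. It checks that the message “I plan to hunt stag” loses credibility exactly when Player 1 prefers Player 2 to hunt stag regardless of her own plan.

```python
def player1_payoffs(x):
    # Assumed illustrative payoffs for Player 1; only the threshold 7 comes
    # from the text above, the remaining numbers are made up.
    # keys: (player1_action, player2_action)
    return {
        ("stag", "stag"): 9, ("stag", "hare"): 0,
        ("hare", "stag"): x, ("hare", "hare"): 7,
    }

def message_possibly_credible(x):
    """Credible only if Player 1 does NOT prefer Player 2 to hunt stag
    regardless of Player 1's own intended action."""
    u = player1_payoffs(x)
    prefers_stag_always = all(
        u[(a, "stag")] > u[(a, "hare")] for a in ("stag", "hare")
    )
    return not prefers_stag_always

assert message_possibly_credible(5)        # x < 7: possibly credible
assert not message_possibly_credible(10)   # x > 7: not credible
```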

Communication II. “Talking by doing”: signaling games Definition (Signaling game) A signaling game is a two-player game in which Nature selects a game to be played according to a commonly known distribution, player 1 is informed of that choice and chooses an action, and player 2 then chooses an action without knowing Nature’s choice, but knowing player 1’s choice. Consider the following signaling game. Nature chooses with equal probability one of the two zero-sum normal-form games:

Communication II. “Talking by doing”: signaling games Recall that player 1 knows which game is being played, and will choose his message first (U or D), and then player 2, who does not know which game is being played, will choose his action (L or R). What should player 1 do?

Communication II. “Talking by doing”: signaling games Note that in the leftmost game (U,R) is an equilibrium in dominant strategies, and in the rightmost game (D,L) is an equilibrium in dominant strategies. Since player 2’s preferred action depends entirely on the game being played, and he is confident that player 1 will play his dominant strategy, his best response is R if player 1 chooses U, and L if player 1 chooses D. If player 2 plays in this fashion, we can calculate the expected payoff to player 1 as E(u1) = (0.5)*1 + (0.5)*2 = 1.5.

Communication II. “Talking by doing”: signaling games If player 1 always chooses D, regardless of what game he is playing, then his expected payoff turns out to be independent of player 2’s strategy. We calculate the expected payoff to player 1 as follows, assuming that player 2 plays L with probability p and R with probability (1 − p): E(u1) = (0.5)(3p + 0(1 − p)) + (0.5)(2p + 5(1 − p)) = 2.5. Thus, player 1 has a higher expected payoff if he always chooses the message D.
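The two zero-sum matrices are only images on the slides, but the player 1 payoff entries used in the two calculations above are stated explicitly. The sketch below uses just those entries (which cells they belong to is inferred from the calculations, so treat the layout as an assumption) to reproduce the comparison: revealing the game yields an expected payoff of 1.5, while always announcing D yields 2.5 no matter how player 2 mixes.

```python
# Revealing play: (U,R) in the left game pays player 1 a payoff of 1,
# (D,L) in the right game pays 2 (entries taken from the calculation above).
revealing_expected = 0.5 * 1 + 0.5 * 2   # = 1.5

def always_D_expected(p):
    """Expected payoff to player 1 when he always sends D and player 2 plays
    L with probability p and R with probability 1 - p. The D rows pay (3, 0)
    in the left game and (2, 5) in the right game, as in the formula above."""
    left = 3 * p + 0 * (1 - p)
    right = 2 * p + 5 * (1 - p)
    return 0.5 * left + 0.5 * right

# 2.5 for every p: hiding the information beats revealing it (2.5 > 1.5).
assert all(abs(always_D_expected(p / 10) - 2.5) < 1e-12 for p in range(11))
assert always_D_expected(0.3) > revealing_expected
```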

Communication II. “Talking by doing”: signaling games The example highlights an interesting property of signaling games. Although player 1 has privileged information, it may not always be to his advantage to exploit it. This is because by exploiting the advantage, he is effectively telling player 2 what game is being played and thereby losing his advantage. Thus, in some cases player 1 can receive a higher payoff by ignoring his information.

Communication III. “Doing by talking” II: speech-act theory The traditional view of communication is that it is the sharing of information. Speech-act theory, due to the philosopher J. L. Austin, embodies the insight that some communications can instead be viewed as actions, intended to achieve some goal. Speech-act theory distinguishes between three different kinds of speech acts:

Communication III. “Doing by talking” II: speech-act theory The locutionary act is merely the emission of a signal carrying a certain meaning. When I say “there’s a car coming your way,” the locution refers to the content transmitted. Locutions establish a proposition, which may be true or false. The illocutionary act, in this case, is a warning. In general, an illocution is the invocation of a conventional force on the receiver through the utterance. Other illocutions can be making a request, telling a joke, or, indeed, simply informing. While the illocution captures the intention of the speaker, the perlocutionary act is the bringing about of an effect on the hearer as a result of the utterance.

Communication III. “Doing by talking” II: speech-act theory Although the illocutionary and perlocutionary acts may seem similar, it is important to distinguish between an illocutionary act and its perlocutionary consequences. Illocutionary acts do something in saying something, while perlocutionary acts do something by saying something. Perlocutionary acts include scaring, convincing, and saddening.

Communication Agent communication languages Perhaps the most widespread use of speech act theory within the field of computer science is for communication between software applications. Increasingly, computer systems are structured in such a way that individual applications can act as agents (e.g., with the popularization of the Internet and electronic commerce), each with its own goals and planning mechanisms. In such a system, software applications must communicate with each other and with their human users to enlist the support of other agents to achieve goals, to commit to helping another agent, to report their own status, to request a status report from another, and so on.

Communication Several proposals have been made for artificial languages to serve as the medium for this inter-application communication. A relatively simple example is the Knowledge Query and Manipulation Language (KQML), which was developed in the early 1990s. KQML incorporates some ideas from speech-act theory:
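The slide’s list of KQML features is not reproduced in the transcript. As a hedged illustration of the speech-act flavor of the language, the sketch below assembles a KQML-style message: the outer performative (here ask-one, one of the standard KQML performatives) names the illocutionary force, while parameters such as :sender, :receiver and :content carry the propositional content. The agent names and the query are invented for illustration.

```python
# Hedged illustration of a KQML-style message: the performative names the
# speech act, the parameters carry sender, receiver and content.
def kqml_message(performative, **params):
    fields = " ".join(f":{key} {value}" for key, value in params.items())
    return f"({performative} {fields})"

msg = kqml_message(
    "ask-one",
    sender="agentA",           # invented agent names
    receiver="agentB",
    content="(price ibm ?p)",  # invented query content
    ontology="stocks",
)
print(msg)
# (ask-one :sender agentA :receiver agentB :content (price ibm ?p) :ontology stocks)
```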

Communication KQML is no longer an influential standard, but the ideas of structured interactions among software agents that are based in part on speech acts live on in more modern protocols defined on top of abstract markup languages such as XML and the so-called Semantic Web. We have described how speech act theory can be used in communication between software applications. Some authors have also proposed to use it directly in the development of software applications, that is, as part of a programming language itself.

Communication This new programming paradigm has been termed rational programming. Just as object-oriented programming shifted the programming paradigm from writing procedures to creating objects, rational programming shifts the paradigm from creating informational objects to creating motivational agents. The motivational agents created by rational programming must act in the world, and because the agents are not likely to have a physical embodiment, their actions consist of sending and receiving signals; in other words, their actions will be speech acts. Of course, as shown in the previous section, it is possible to construct communicating agents within existing programming paradigms. However, by incorporating speech acts as primitives, rational programming constructs make such programs more powerful, easier to create, and more readable.

Communication Elephant2000 is a programming language described by McCarthy that explicitly incorporates speech acts. The following is an example statement from Elephant2000, taken from a hypothetical travel agency program (the statement itself is shown on the slide). The intuitive reading of this statement is “if a passenger has requested to reserve a seat on a given flight, and that flight is not full, then make the reservation.”
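Since the Elephant2000 statement itself is not in the transcript, here is a minimal sketch in ordinary Python of the rule described by the intuitive reading; this is not Elephant2000 syntax, and every name in it (handle_request, is_full, and so on) is a hypothetical stand-in.

```python
# Not Elephant2000 syntax: a plain-Python sketch of "if a passenger has
# requested a seat on a flight and the flight is not full, make the reservation".
def is_full(flight, reservations, capacity=3):
    return len(reservations.get(flight, [])) >= capacity

def handle_request(passenger, flight, reservations):
    """Accept the reservation request if the flight is not full."""
    if not is_full(flight, reservations):
        reservations.setdefault(flight, []).append(passenger)
        return "accepted"
    return "declined"

reservations = {}
print(handle_request("alice", "TK101", reservations))  # accepted
```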

Communication Agent-Oriented Programming (AOP) is a separate proposal that is similar to Elephant2000 in several respects. It too embraces speech acts as the form and meaning of the communication among agents. The most significant difference from Elephant2000 is that AOP also embraces the notion of mental state, consisting of beliefs and commitments. Thus the result of an inform speech act is a new belief. AOP is not actually a single language, but a general design that allows multiple languages; one particular simple language, Agent0, was defined and implemented.
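As a hedged sketch of the mental-state idea (the class and method names are invented; this is not Agent0 syntax), the fragment below shows the stated effect of an inform speech act: the receiving agent acquires a new belief.

```python
# Illustrative only: an inform speech act results in a new belief
# in the receiving agent's mental state (beliefs plus commitments).
class Agent:
    def __init__(self, name):
        self.name = name
        self.beliefs = set()
        self.commitments = []

    def receive_inform(self, sender, fact):
        # The effect of an inform: the agent now believes the fact.
        self.beliefs.add(fact)

a = Agent("a")
a.receive_inform("b", "printer_free")
assert "printer_free" in a.beliefs
```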

Communication The following is an example statement in Agent0, taken from a print server application (the statement itself is shown on the slide). The approximate reading of this statement is “if you get a request to free the printer for five minutes at a future time, if you are not committed to finishing a print job within ten minutes of that time, and if you believe the requester to be friendly, then accept the request and tell them that you did.”
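The Agent0 statement itself is only on the slide; the sketch below renders the approximate reading above in plain Python. It is not Agent0 syntax, and every name in it is a hypothetical stand-in.

```python
# Plain-Python sketch of the print-server rule paraphrased above (not Agent0 syntax).
def conflicts(commitments, t, window=10):
    """True if any committed print job finishes within `window` minutes of time t."""
    return any(abs(finish - t) <= window for finish in commitments)

def handle_free_printer_request(requester, t, beliefs, commitments):
    """Accept a request to free the printer for five minutes at time t if no
    print job is committed to finish within ten minutes of t and the requester
    is believed to be friendly; on acceptance, also inform the requester."""
    if not conflicts(commitments, t) and ("friendly", requester) in beliefs:
        return "accept", ("inform", requester, f"printer freed for 5 minutes at {t}")
    return "refuse", None

beliefs = {("friendly", "agentB")}
commitments = [20]  # one print job committed to finish at time 20
print(handle_free_printer_request("agentB", t=60, beliefs=beliefs, commitments=commitments))
```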

Aggregating Preferences: Social Choice