Commonsense Reasoning and Argumentation 14/15 HC 13: Dialogue Systems for Argumentation (1) Henry Prakken 25 March 2015.


Why do agents need argumentation?
- For their internal reasoning: reasoning about beliefs, goals, intentions, etc. is often defeasible.
- For their interaction with other agents: information exchange involves explanation; collaboration and negotiation involve conflicts of opinion and persuasion.

Overview
- Dialogue systems for argumentation
- Inference vs. dialogue
- Use of argumentation in MAS
- General ideas
- Two systems (1)

[Argument graph built up stepwise over a sequence of slides, each step labelled with its speech act (claim, why, since, concede, retract):
- Claim: "We should lower taxes"; after a why-challenge, defended since "Lower taxes increase productivity" and "Increased productivity is good".
- Counterclaim: "We should not lower taxes", since "Lower taxes increase inequality" and "Increased inequality is bad".
- The inequality premise is attacked by the claim "Increased inequality is good", since "Increased inequality stimulates competition" and "Competition is good".
- The productivity premise is attacked by the claim "Lower taxes do not increase productivity", since "USA lowered taxes but productivity decreased"; it is backed by "Prof. P says that …", which is in turn attacked by "Prof. P is not objective", since "Prof. P has political ambitions" and "People with political ambitions are not objective".
- The final slides add a concede and a retract move.]

Types of dialogues (Walton & Krabbe)

Dialogue type         Dialogue goal             Initial situation
Persuasion            resolution of conflict    conflict of opinion
Negotiation           making a deal             conflict of interest
Deliberation          reaching a decision       need for action
Information seeking   exchange of information   personal ignorance
Inquiry               growth of knowledge       general ignorance
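This taxonomy is pure data, so it can be carried around in code as a small lookup table. A minimal Python sketch; the names and structure are my own, not from the lecture:

```python
# A minimal encoding of Walton & Krabbe's dialogue typology.
# Names and structure are illustrative, not part of the lecture material.
from dataclasses import dataclass

@dataclass(frozen=True)
class DialogueType:
    goal: str               # the goal of the dialogue as a whole
    initial_situation: str  # what triggers this type of dialogue

DIALOGUE_TYPES = {
    "persuasion":          DialogueType("resolution of conflict",  "conflict of opinion"),
    "negotiation":         DialogueType("making a deal",           "conflict of interest"),
    "deliberation":        DialogueType("reaching a decision",     "need for action"),
    "information seeking": DialogueType("exchange of information", "personal ignorance"),
    "inquiry":             DialogueType("growth of knowledge",     "general ignorance"),
}

if __name__ == "__main__":
    for name, t in DIALOGUE_TYPES.items():
        print(f"{name}: goal = {t.goal}; initial situation = {t.initial_situation}")
```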

Example

P: I offer you this Peugeot for $10.000.
O: I reject your offer.
P: Why do you reject my offer?
O: Since French cars are no good.
P: Why are French cars no good?
O: Since French cars are unsafe.
P: Why are French cars unsafe?
O: Since magazine Meinwagen says so.
P: Meinwagen is biased, since German car magazines usually are biased against French cars.
O: I concede that German car magazines usually are biased against French cars, but Meinwagen is not, since it has a very high reputation.
P: Why does Meinwagen have a very high reputation? …
O: OK, I retract that French cars are no good. Still, I cannot pay $10.000; I offer $8.000.
P: OK, I accept your offer.


Inference vs. dialogue
Dialogue systems for argumentation have:
- a communication language (well-formed utterances)
- a protocol (which utterances are allowed at which point?)
- termination and outcome rules
Argument games are a proof theory for a logic, but real argumentation dialogues have real players:
- distributed information
- richer communication languages
- dynamics

Standards for argumentation formalisms
- Logical argument games: soundness and completeness with respect to some semantics of an argumentation logic.
- Dialogue systems: effectiveness with respect to the dialogue goal and fairness with respect to the participants' goals. For argumentation:
  - dialogue goal = rational resolution of conflicts of opinion
  - participants' goal = to persuade
- Argumentation is often instrumental to other dialogue types: does argumentation promote the goals of e.g. negotiation or deliberation?

Some properties of dialogue systems that can be studied
- Correspondence of the outcome with the players' beliefs:
  - If the union of the participants' beliefs justifies p, can/will agreement on p result? ('completeness')
  - If the participants agree on p, does the union of their beliefs justify p? ('soundness')
- Disregarding vs. assuming the participants' personalities

Game for grounded semantics unsound in distributed settings

Knowledge bases: Paul: p, r; Olga: s, t (after P2, Olga has also learnt r).
Inference rules: p ⇒ q; s ⇒ ¬q; r ⇒ ¬s; r, t ⇒ ¬p.

P1: q since p
O1: ¬q since s
P2: ¬s since r
O2: ¬p since r, t

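The unsoundness claim can be checked mechanically: the union of the two knowledge bases yields the four arguments above, and the argument for q is not in the grounded extension. A minimal sketch, with the arguments and attacks hand-encoded for this specific example (the encoding, including treating rebuttal as symmetric, is my own simplification):

```python
# Sketch: grounded extension of the argumentation framework induced by
# Paul's and Olga's combined knowledge bases (hand-encoded).
# A1 = "q since p", B1 = "not-q since s", A2 = "not-s since r", B2 = "not-p since r,t".
ARGS = {"A1", "B1", "A2", "B2"}
ATTACKS = {
    ("B1", "A1"),  # not-q rebuts q
    ("A1", "B1"),  # q rebuts not-q
    ("A2", "B1"),  # not-s undermines B1's premise s
    ("B2", "A1"),  # not-p undermines A1's premise p
}

def grounded(args, attacks):
    """Least fixed point of the defence operator F(S) = {a | S defends a}."""
    def defended(s):
        out = set()
        for a in args:
            attackers = {b for (b, x) in attacks if x == a}
            # a is defended by s if every attacker of a is attacked from s
            if all(any((c, b) in attacks for c in s) for b in attackers):
                out.add(a)
        return out
    s = set()
    while True:
        nxt = defended(s)
        if nxt == s:
            return s
        s = nxt

print(grounded(ARGS, ATTACKS))  # {'A2', 'B2'}: A1, the argument for q, is out
```

Since A1 is defeated by the unattacked B2, q is not justified by Paul ∪ Olga; yet Olga can only move O2 after learning r from Paul's P2, which is exactly why the game's outcome in a distributed setting can come apart from the union of the players' beliefs.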

Example 1

Knowledge bases: Paul: r; Olga: s.
Inference rules: p ⇒ q; r ⇒ p; s ⇒ ¬r.

Paul ∪ Olga does not justify q, but the players could still agree on q.

If Olga is credulous (she concedes everything for which she cannot construct a defensible or justified counterargument):
P1: q since p
O1: concede p, q

If Olga is sceptical (she challenges everything for which she cannot construct a defensible or justified argument):
P1: q since p
O1: why p?
P2: p since r
O2: ¬r since s


Example 2

Knowledge bases: Paul: p, q; Olga: p, q → ¬p.
Inference rules: modus ponens, …

Paul ∪ Olga does not justify p, but the players will agree on p if they are conservative, that is, if they stick to their beliefs whenever possible:
P1: claim p
O1: concede p

A possible solution, for open-minded agents who are prepared to critically test their beliefs:
P1: claim p
O1: what about q?
P2: claim q
O2: ¬p since q, q → ¬p

Problem: how to ensure relevance?


Dialogue game systems in more detail
- A dialogue purpose
- Participants (with roles)
- A topic language Lt, with a logic
- A communication language Lc, with a protocol:
  - move legality rules
  - effect rules for Lc ("commitment rules")
  - turntaking rules
  - termination and outcome rules
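Read as an implementation checklist, these ingredients map onto a program skeleton. A minimal sketch in Python; all names are illustrative rather than taken from any particular system:

```python
# Skeleton of a dialogue game system, following the checklist above.
# All names are illustrative; this is not a specific system from the literature.
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, Optional

class SpeechAct(Enum):              # the communication language Lc
    CLAIM = "claim"
    WHY = "why"
    SINCE = "since"
    CONCEDE = "concede"
    RETRACT = "retract"

@dataclass
class Move:
    speaker: str                    # a participant (with a role)
    act: SpeechAct
    content: object                 # a sentence or argument from the topic language Lt
    target: Optional[int] = None    # index of the move this one replies to

@dataclass
class Dialogue:
    moves: list = field(default_factory=list)
    commitments: dict = field(default_factory=dict)  # commitment store per participant

# The remaining ingredients are functions over the dialogue state:
Protocol = Callable[[Dialogue, Move], bool]      # move legality rules
Turntaking = Callable[[Dialogue], str]           # whose turn is it?
Termination = Callable[[Dialogue], bool]         # has the dialogue ended?
Outcome = Callable[[Dialogue], object]           # e.g. who won, what was agreed
```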

Effect rules
Effect rules specify commitments:
- "Claim p" and "Concede p" commit the speaker to p
- "p since Q" commits to p and to Q
- "Retract p" ends the commitment to p
- ...
Commitments are used for:
- determining the outcome
- enforcing 'dialogical consistency'
- ...
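The commitment rules above translate directly into an update function on commitment stores. A sketch under the same caveat (speech-act names and the exact rule set are illustrative; real systems differ in details):

```python
# Sketch of effect ("commitment") rules: each speech act updates the
# speaker's commitment store, following the rules listed above.
def apply_effects(commitments: dict, speaker: str, act: str, content) -> None:
    store = commitments.setdefault(speaker, set())
    if act in ("claim", "concede"):
        store.add(content)                 # "Claim p" / "Concede p" commit to p
    elif act == "since":
        conclusion, premises = content
        store.add(conclusion)              # "p since Q" commits to p ...
        store.update(premises)             # ... and to all of Q
    elif act == "retract":
        store.discard(content)             # "Retract p" ends commitment to p

# Example: after "claim safe" and "safe since {airbag, airbag -> safe}",
# the speaker is committed to safe, airbag, and airbag -> safe.
cs: dict = {}
apply_effects(cs, "P", "claim", "safe")
apply_effects(cs, "P", "since", ("safe", {"airbag", "airbag -> safe"}))
print(cs["P"])
```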

Public semantics for dialogue protocols
- Public semantics: can protocol compliance be externally observed?
- Commitments are a participant's publicly declared standpoints, so not the same as beliefs!
- Only commitments and dialogical behaviour should count for move legality:
  - "Claim p is allowed only if you believe p" vs.
  - "Claim p is allowed only if you are not committed to ¬p and have not challenged p"
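The contrast in the last two bullets can be made concrete: a rule with a public semantics is computable from the dialogue record alone, while a mentalistic rule needs access to an agent's private beliefs. A small illustrative sketch:

```python
# Two legality rules for "claim p". Only the first has a public semantics.
def claim_legal_public(p, commitments, challenged):
    # Public: any observer can evaluate this from the dialogue record alone.
    # "Claim p is allowed only if you are not committed to ~p
    #  and have not challenged p."
    return ("~" + p) not in commitments and p not in challenged

def claim_legal_private(p, beliefs):
    # Mentalistic: "claim p only if you believe p". Compliance cannot be
    # externally observed, so this rule has no public semantics.
    return p in beliefs
```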

More and less strict protocols
- Single vs. multi-move: one or more moves per turn allowed
- Single vs. multi-reply: one or more replies to the same move allowed
- Deterministic: no choice from the legal moves
- Deterministic in Lc: no choice from the speech act types
- Only replies to moves from the previous turn allowed?

Two systems for persuasion dialogue
- Parsons, Wooldridge & Amgoud, Journal of Logic and Computation 13 (2003)
- Prakken, Journal of Logic and Computation 15 (2005)

PWA: languages, logic, agents
- Lc: Claim p, Why p, Concede p, Claim S (with p ∈ Lt and S ⊆ Lt)
- Lt: propositional
- Logic: argumentation logic
  - Arguments: (S, p) such that S ⊆ Lt, S is consistent, and S propositionally implies p
  - Defeat: (S, p) defeats (S', p') iff ¬p ∈ S' and level(S) ≥ level(S')
  - Semantics: grounded
- Assumptions on agents:
  - have a knowledge base KB ⊆ Lt
  - have an assertion and an acceptance attitude
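A sketch of this argument and defeat machinery, restricted (my simplification) to a topic language of literals and implications between literals, so that forward chaining suffices for the propositional-implication test:

```python
# Sketch of PWA-style arguments over a restricted topic language:
# formulas are literals ("safe", "~safe") or pairs (premise, conclusion)
# read as implications between literals.
from itertools import combinations

def closure(facts):
    """Literals derivable from facts by modus ponens over the pair-formulas."""
    lits = {f for f in facts if isinstance(f, str)}
    rules = [f for f in facts if isinstance(f, tuple)]
    changed = True
    while changed:
        changed = False
        for prem, concl in rules:
            if prem in lits and concl not in lits:
                lits.add(concl)
                changed = True
    return lits

def neg(l):
    return l[1:] if l.startswith("~") else "~" + l

def consistent(lits):
    return not any(neg(l) in lits for l in lits)

def arguments_for(kb, p):
    """All (S, p) with S a consistent subset of kb whose closure contains p."""
    kb = list(kb)
    for n in range(1, len(kb) + 1):
        for s in combinations(kb, n):
            lits = closure(s)
            if p in lits and consistent(lits):
                yield (set(s), p)

def defeats(a1, a2, level):
    """(S, p) defeats (S', p') iff ~p is in S' and level(S) >= level(S')."""
    (s1, p1), (s2, _) = a1, a2
    return neg(p1) in s2 and level(s1) >= level(s2)

kb = {"airbag", ("airbag", "safe"), "newspaper", ("newspaper", "~safe")}
print(next(arguments_for(kb, "safe")))  # ({'airbag', ('airbag', 'safe')}, 'safe')
```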

Assertion/Acceptance attitudes
Relative to the speaker's own KB plus the hearer's commitments:
- Confident/Credulous agent: can assert/accept P iff she can construct an argument for P
- Careful/Cautious agent: can assert/accept P iff she can construct an argument for P and no stronger argument for ¬P
- Thoughtful/Skeptical agent: can assert/accept P iff she can construct a justified argument for P
If the attitudes are part of the protocol, then the protocol has no public semantics!
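The three attitude pairs differ only in which test they run. A sketch where the predicates has_argument, has_stronger_counter and has_justified are assumed to be supplied by the underlying argumentation logic (they are parameters here, not real APIs):

```python
# Sketch of the three assertion attitudes; the acceptance attitudes
# (credulous/cautious/skeptical) run the same tests when deciding
# whether to concede the other player's claim.
def can_assert(attitude, p, has_argument, has_stronger_counter, has_justified):
    # All tests are relative to the speaker's own KB plus the hearer's commitments.
    if attitude == "confident":    # an argument for p exists
        return has_argument(p)
    if attitude == "careful":      # ... and no stronger argument for ~p exists
        return has_argument(p) and not has_stronger_counter(p)
    if attitude == "thoughtful":   # a justified argument for p exists
        return has_justified(p)
    raise ValueError(f"unknown attitude: {attitude}")

# Demo with stub predicates:
print(can_assert("careful", "safe",
                 has_argument=lambda p: True,
                 has_stronger_counter=lambda p: False,
                 has_justified=lambda p: False))  # True
```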

PWA: protocol
1. W claims p;
2. B concedes p if allowed by its attitude; if not, claims ¬p if allowed by its attitude, or else challenges p;
3. If B claims ¬p, then go to 2 with the players' roles reversed and ¬p in place of p;
4. If B has challenged p, then:
   a. W claims S, an argument for p;
   b. go to 2 for each s ∈ S in turn;
5. B concedes p if allowed by its attitude, or the dialogue terminates without agreement.

Also:
- no player repeats its own moves;
- if the 'indicated' move cannot be made (i.e., would repeat a move), the dialogue terminates.

Outcome: do the players agree at termination?
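Steps 1–5 can be read as a recursive control loop. A rough sketch under several simplifying assumptions of my own: agents are plain dicts of decision functions, step 4b treats each premise of S as a sub-claim, and the repetition rule is implemented with a per-player move log (which guarantees termination for finite knowledge bases):

```python
# Rough sketch of the PWA protocol (steps 1-5 above). Agents are dicts
# with keys 'name', 'can_accept', 'can_assert' (their attitude tests)
# and 'argument_for' (returns a set of premises for a claim, or None).
def pwa_dialogue(W, B, p, neg):
    said = {W["name"]: set(), B["name"]: set()}   # "no player repeats its own moves"

    def fresh(agent, move):
        if move in said[agent["name"]]:
            return False                          # indicated move would repeat:
        said[agent["name"]].add(move)             # the dialogue terminates
        return True

    def run(w, b, claim):                         # returns the agreed claim, or None
        if not fresh(w, ("claim", claim)):        # step 1 (and 4a, for premises)
            return None
        if b["can_accept"](claim):                # step 2: concede if attitude allows
            return claim
        if b["can_assert"](neg(claim)) and ("claim", neg(claim)) not in said[b["name"]]:
            return run(b, w, neg(claim))          # step 3: roles reversed
        if not fresh(b, ("why", claim)):          # step 2: otherwise challenge
            return None
        S = w["argument_for"](claim)              # step 4a: an argument for claim
        if S is None or any(run(w, b, s) != s for s in S):
            return None                           # step 4b: handle each premise in turn
        return claim if b["can_accept"](claim) else None  # step 5

    agreed = run(W, B, p)
    return agreed if agreed else "termination without agreement"
```

Run with the airbag/newspaper knowledge bases below and suitable attitude tests, such a driver reproduces the overall shape of the three example dialogues that follow.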

The agents' KBs
P: airbag; airbag → safe
O: newspaper; newspaper → ¬safe

PWA: example dialogue (1)
P is thoughtful/skeptical; O is careful/cautious.

P1: claim safe (P's commitments: + safe)
O1: concede safe


PWA: example dialogue (2)
P is thoughtful/skeptical; O is thoughtful/skeptical.

P1: claim safe (P: + safe)
O1: why safe
P2: claim {airbag, airbag → safe} (P: + airbag, airbag → safe)
O2: why airbag
P3: claim {airbag}
O3: why airbag → safe
P4: claim {airbag → safe}


PWA: example dialogue (3)
P is thoughtful/skeptical; O is confident/skeptical.

P1: claim safe (P: + safe)
O1: claim ¬safe (O: + ¬safe)
P2: why ¬safe
O2: claim {newspaper, newspaper → ¬safe} (O: + newspaper, newspaper → ¬safe)
P3a: why newspaper
O3a: claim {newspaper}
P3b: why newspaper → ¬safe
O3b: claim {newspaper → ¬safe}


PWA: characteristics
- Protocol:
  - multi-move (if step 4a is breadth-first)
  - (almost) unique-reply
  - deterministic in Lc
- Dialogues:
  - short (no stepwise construction of arguments, no alternative replies)
  - only one side develops arguments
- Logic is used by single agents: to check attitudes and to construct arguments
- Commitments are used for attitudes and outcome