1
Methods of Argument Evaluation CRRAR University of Windsor October 9, 2014 Douglas Walton
2
Abstract Even though tools for identifying and analyzing arguments are now in wide use in the field of argumentation studies, so far there is a paucity of resources for evaluating real arguments, aside from using only deductive logic or Bayesian rules that apply to inductive arguments. In this paper it is shown that recent developments in artificial intelligence in the area of computational systems for modeling defeasible argumentation reveal a different approach that is currently making interesting progress. It is shown using two examples how these systems provide a practical method of argument evaluation that can be applied to everyday conversational arguments and legal arguments.
3
Bayesian Argument Evaluation The law widely known as Bayes' rule of conditional probability can be derived from the rules for conjunction and negation. Pr(A|B) = (Pr(B|A) × Pr(A)) / Pr(B) This method uses the subjective beliefs of a single agent. Using this rule, the probability of A given B can be calculated from the probability of B given A, along with the prior values of A and B. For example, if a source is thought to be an expert, a statement it claims to be true will increase the probability of the conclusion, whereas if the source is thought to be biased, it will decrease the probability of the conclusion.
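The rule above can be sketched in a few lines of code. The numbers below are illustrative assumptions only, not values from any case discussed in the talk.

```python
# A minimal sketch of Bayes' rule: Pr(A|B) = Pr(B|A) * Pr(A) / Pr(B).
def posterior(pr_b_given_a, pr_a, pr_b):
    """Probability of A given B, computed from the likelihood and the priors."""
    return pr_b_given_a * pr_a / pr_b

# Hypothetical values: prior that conclusion A is true, prior that the
# expert asserts B, and probability that the expert asserts B when A holds.
print(posterior(0.9, 0.5, 0.6))  # approximately 0.75
```

Raising the likelihood Pr(B|A), as when the source is trusted as an expert, raises the posterior; lowering it, as when the source is thought biased, lowers the posterior.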
4
The Conjunction Fallacy According to the Bayesian conjunction rule, the probability of the conjunction of two statements A and B can be no greater than the probability of either A or B individually. But Tversky and Kahneman (1982) gave the following example to groups of respondents. Linda is a 31-year-old outspoken and very bright bank teller who majored in philosophy. As a student she was concerned with issues of social justice, and she participated in antinuclear demonstrations. Which of the two statements is more probable: (1) Linda is a bank teller, or (2) Linda is a bank teller and is active in the feminist movement? Most of those asked chose (2) as their answer. This was taken to suggest either that the respondents were illogical, according to the standard conjunction rule for statistical reasoning, or that the conjunction rule must somehow be wrong.
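The conjunction rule can be checked numerically. The probabilities below are illustrative assumptions for the Linda case, not data from the Tversky and Kahneman study.

```python
# Sketch of the conjunction rule: Pr(A and B) = Pr(A) * Pr(B|A), which can
# never exceed Pr(A) alone, since Pr(B|A) <= 1.
pr_teller = 0.05                 # hypothetical Pr(Linda is a bank teller)
pr_feminist_given_teller = 0.4   # hypothetical Pr(feminist | bank teller)

pr_conjunction = pr_teller * pr_feminist_given_teller
print(pr_conjunction <= pr_teller)  # True: (2) cannot be more probable than (1)
```

Whatever values are chosen, the conjunction comes out no more probable than either conjunct, which is why the respondents' preference for (2) was taken to be a fallacy.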
5
The DefLog System Bart Verheij in 1999 began to build an automated argument assistant called ArguMed that helps a user to construct an argument diagram in order to analyze and evaluate a given argument (Verheij, 2003, 320). The development of this system was guided by his parallel research on the logical system called DefLog (Verheij, 2003) that represents the underlying structure of the reasoning displayed by ArguMed. ArguMed (http://www.ai.rug.nl/~verheij/aaa/argumed3.htm) is available at no cost on the Internet. The ultimate conclusion, called the issue in ArguMed, appears in a text box at the top of the diagram, and the premises supporting it form arguments that can be linked together, and that lead by arrows to the conclusion. The user can input statements directly into the boxes shown on the screen in a way comparable to other argument mapping tools.
6
An Example of an ArguMed Map
7
The System ASPIC+ ASPIC+ is built on a logical language containing a contrariness function, a set of strict and defeasible inference rules, and an ordering of this set of inference rules (Prakken, 2010). Suppose that we have the goal of increasing productivity because we think that increasing productivity is good. And suppose we think that reducing taxes is the way to increase productivity. Then we should conclude that taxes should be reduced. However we also accept the premises that reducing taxes increases inequality, that increasing inequality is bad, and that equality is more important than productivity. Then we should conclude that we should not reduce taxes. The second argument defeats the first one, even though the conclusion of the first argument was acceptable before the second argument was considered.
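The tax example can be sketched as a toy preference comparison. This is a simplification for illustration, not the ASPIC+ machinery itself; the rule names and the ordering are assumptions made for this example.

```python
# Toy sketch of preference-based defeat in the style of ASPIC+: the argument
# built from the higher-ranked defeasible rule defeats the other.
from dataclasses import dataclass

@dataclass
class Argument:
    conclusion: str
    rule: str  # name of the defeasible rule the argument relies on

# Assumed ordering: equality is more important than productivity.
preference = {"productivity_rule": 1, "equality_rule": 2}

a1 = Argument("reduce taxes", "productivity_rule")
a2 = Argument("do not reduce taxes", "equality_rule")

def winner(x, y):
    """Return the argument whose rule is preferred."""
    return x if preference[x.rule] > preference[y.rule] else y

print(winner(a1, a2).conclusion)  # do not reduce taxes
```

The point of the example survives the simplification: a conclusion that was acceptable on its own can be withdrawn once a preferred counterargument is brought in.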
8
The Carneades Argumentation System (CAS) Thomas F. Gordon (Fraunhofer FOKUS) There are over 60 software packages to assist with argument diagramming (Scheuer et al., 2010), based on informal logic models of argument. Carneades provides a number of software tools, including argument diagramming, based on a formal, computational model of argument. Carneades is named after the Greek skeptical philosopher (Gordon and Walton, 2006) and is open source software, available at http://carneades.github.com/.
9
Choosing a Scheme in CAS
10
Argument from Expert Opinion Major Premise: E is an expert in domain D. First Minor Premise: E asserts that A is true. Second Minor Premise: A is within D. Conclusion: A may tentatively be accepted as true.
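The scheme can be treated as a template with variables E, D, and A that get bound in a particular case. The representation below is a sketch, not the internal format of any of the systems; the bindings are taken from the Vermeer example discussed later in the talk.

```python
# Sketch: the argument from expert opinion as a premise/conclusion template.
scheme = {
    "name": "argument from expert opinion",
    "premises": [
        "{E} is an expert in domain {D}.",
        "{E} asserts that {A} is true.",
        "{A} is within {D}.",
    ],
    "conclusion": "{A} may tentatively be accepted as true.",
}

def instantiate(scheme, **bindings):
    """Fill the scheme's variables with the particulars of a case."""
    fill = lambda s: s.format(**bindings)
    return [fill(p) for p in scheme["premises"]] + [fill(scheme["conclusion"])]

for line in instantiate(scheme, E="Alice", D="art history",
                        A="the painting is a genuine Vermeer"):
    print(line)
```

The same template instantiated with a different expert and the opposite claim produces the con argument, which is how the deadlock in the Vermeer example arises.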
11
Argument Diagrams in CAS In Carneades argument maps, statement nodes are shown as propositions in text boxes. Argument nodes are displayed as circles, with a + or – sign inside the circle, to distinguish pro and con arguments, respectively. Premises and conclusions are visualized as lines and arrows, respectively, connecting statement and argument nodes.
12
Linked Argument
13
yEd Diagramming Tool (GraphML) CAS is still under development, and not yet easy for novices to use, so here we use a diagramming tool that anyone can easily use. The system yEd is available for free on the Internet and can be downloaded at http://www.yworks.com/en/products_yed_about.html. yEd is compatible with CAS.
14
The Vermeer Example In a forensic investigation of some potentially valuable fine art, the dispute is about whether a particular painting is a genuine Vermeer. One party to the dispute, the proponent, claims that the painting is a genuine Vermeer, citing expert opinion evidence. She says that judging a Vermeer painting to be genuine falls under the field of art history, and that Alice, an expert in art history, says that the painting is a genuine Vermeer. The other party to the dispute, the respondent, denies that the painting is a genuine Vermeer, and advances an argument to support his contention. He agrees that judging a Vermeer painting to be genuine falls under the field of art history, but cites the opinion of another expert in art history, Bob, who has claimed that the painting is not a genuine Vermeer. Here we have a pair of arguments, each one being an argument from expert opinion, that are deadlocked.
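The deadlock can be sketched as a comparison of equally weighted pro and con arguments. The equal weights are an assumption made for illustration; none of the systems is committed to these particular numbers.

```python
# Sketch of the deadlock: one pro and one con argument from expert opinion,
# equal in weight, leave the claim undecided.
pro_weight = 0.5   # Alice's argument that the painting is a genuine Vermeer
con_weight = 0.5   # Bob's argument that it is not

if pro_weight > con_weight:
    status = "accepted"
elif con_weight > pro_weight:
    status = "rejected"
else:
    status = "undecided"

print(status)  # undecided
```

Breaking the deadlock requires further evidence bearing on one of the arguments, for example an attack on one expert's credibility, which is what the later steps of the example supply.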
15
Vermeer Step 1
16
Undercutters and Rebutters All three AI systems are based on the distinction (Pollock, 1995) between two kinds of refutations called rebutting defeaters and undercutting defeaters (often referred to as rebutters versus undercutters). A rebutter gives a reason for denying a claim by arguing that the claim is false, whereas an undercutting defeater attacks the inferential link between the claim and the reason supporting it (Pollock, 1995, 40).
17
Famous Example of an Undercutter Pollock used a famous example (1995, 41) to illustrate his distinction. According to the example, if I see an object that looks red to me, that is a reason for thinking it is red, but if I know that the object is illuminated by a red light, that is an undercutter, because red objects look red in red light too. This is not a reason for thinking the object is not red, but it is a reason for undercutting the argument that it is red. Despite the attacking argument, the object may be red, for all we know. In the three AI systems there are three kinds of basic counterarguments: undercutters, rebutters and premise attacks.
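Pollock's red-light example can be sketched with two flags. This is a simplification for illustration, not the evaluation algorithm of any one of the three systems: the point is that the undercutter disables the inference without supporting the opposite conclusion.

```python
# Sketch of Pollock's example: the undercutter does not support "the object
# is not red"; it only disables the inference from "looks red" to "is red".
looks_red = True
illuminated_by_red_light = True   # the undercutting circumstance

# The inference "looks red, therefore red" applies only if not undercut.
inference_applies = looks_red and not illuminated_by_red_light

conclusion_is_red = inference_applies   # no longer derivable
conclusion_is_not_red = False           # the undercutter gives no reason for this

print(conclusion_is_red, conclusion_is_not_red)  # False False
```

A rebutter, by contrast, would set the opposite conclusion to true; the contrast between the two flags is exactly Pollock's distinction.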
18
Vermeer Step 2
19
Vermeer Step 3
20
Vermeer Step 4
21
The Antiphon Example: Part 1 The following ancient Greek case was analyzed as an example of plausible reasoning in (Walton, Tindale and Gordon, 2014, 90). A slave identified the killer of a man who had been murdered, before himself dying of blows suffered during the assault. The slave had been accompanying the man after both of them had returned from a banquet. Before dying, the slave identified a known enemy of the murdered man as the perpetrator. In court, the prosecutor used the following arguments from plausibility. He argued that professional criminals would not have killed this man, because the victims were found still wearing their cloaks. He argued that it is not plausible that someone from the banquet killed him, since the killer would have been identified by his fellow guests. He argued that it is not plausible that the man was killed because of a quarrel, because people would not quarrel in the dead of night and in a deserted spot. In this part of his argument, the prosecutor set up three hypotheses offering different explanations of who committed the murder, and argued that each of them is implausible.
22
The Antiphon Example: Part 2 The prosecutor produced additional evidence indicating that the defendant identified by the slave was the murderer (Walton, Tindale and Gordon, 2014, 91). This factual evidence was that in the past the murdered man had brought several lawsuits against the defendant and the defendant had lost all of them at great personal expense. The prosecutor argued that the defendant bore a grudge against the victim and that for this reason it was natural for him to plot against the victim and to kill him. To sum up his case, the prosecutor argued “Who is more likely to have attacked him than an individual who had already suffered great injuries at his hands and could expect to suffer greater ones still?” (Diels and Kranz, 87 B1: 2.1.5).
23
Antiphon Step 1
24
Antiphon Step 2
25
Antiphon Step 3
26
Antiphon Step 4
27
Outcome The outcome of the complete evaluation is that all the arguments in the case, once marshaled together in the way shown in steps 1-4, provide enough evidence to prove the conclusion that it is plausible that D murdered V.
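A proof standard check in the style of the proof burdens and standards work of Gordon and Walton (2009) can be sketched as follows. The particular weights are illustrative assumptions, not values from the Antiphon case file.

```python
# Sketch of a "preponderance of the evidence" proof standard: the conclusion
# is proved if some applicable pro argument outweighs the strongest con
# argument. (A simplification of the Carneades treatment of proof standards.)
def preponderance(pro_weights, con_weights):
    best_con = max(con_weights, default=0.0)
    return any(w > best_con for w in pro_weights)

pro = [0.7, 0.6]   # e.g. the grudge motive, the slave's dying identification
con = [0.4]        # e.g. residual doubt about the slave's reliability

print(preponderance(pro, con))  # True: the conclusion meets the standard
```

Stricter standards, such as beyond reasonable doubt, would raise the threshold the pro arguments must clear, which is why the standard of proof chosen for a case can change the outcome of the evaluation.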
28
Strength of Arguments
29
Breaking Deadlocks
30
Six Basic Skills Required Applying the general method of argument evaluation outlined in this talk is comparatively easy. It requires only six basic skills: (1) a knowledge of argumentation schemes, (2) an ability to represent the structure of a given argument using an argument diagram in the form of a directed graph, (3) the ability to apply the schemes to the diagram, (4) the ability to fill in gaps in the diagram created by implicit premises or conclusions, (5) the ability to use the device of the audience, applied to the given case, to judge which premises are accepted by the audience, and (6) the ability to recognize and apply differing standards of proof to the argumentation in a given case, which may be needed in some instances, as in the Antiphon example.
31
A Practical Method When working on an example argument to be evaluated, a practical method is to start by drawing a rough argument diagram with pencil and paper. Then later use a mapping tool, or even simple graph drawing software, to build a more refined version that can be stored, reused, sent to others and later modified. There are drawing tools available that are easy to learn and use, such as Microsoft Visio, yEd, Gliffy, LucidChart, and so forth, and some are free. These tools can be used to draw quite a presentable argument diagram compatible with the computational argument evaluation systems currently being built in artificial intelligence as described in this talk.
32
A Few References
Gordon, T. F. (2010). The Carneades Argumentation Support System. In Dialectics, Dialogue and Argumentation, ed. C. Reed and C. W. Tindale. London: College Publications.
Gordon, T. F. and Walton, D. (2009). Proof Burdens and Standards. In Argumentation in Artificial Intelligence, ed. I. Rahwan and G. Simari. Berlin: Springer-Verlag, 239-260.
Modgil, S. and Prakken, H. (2014). The ASPIC+ Framework for Structured Argumentation: A Tutorial. Argument and Computation, 5(1), 31-62.
Pollock, J. L. (1995). Cognitive Carpentry. Cambridge, Mass.: The MIT Press.
Prakken, H. and Sartor, G. (1997). Argument-based Extended Logic Programming with Defeasible Priorities. Journal of Applied Non-classical Logics, 7, 25-75.
Scheuer, O., Loll, F., Pinkwart, N. and McLaren, B. M. (2010). Computer-supported Argumentation: A Review of the State of the Art. Computer-Supported Collaborative Learning, 5(1), 43-102.
Verheij, B. (2005). Virtual Arguments: On the Design of Argument Assistants for Lawyers and Other Arguers. The Hague: Asser Press.
Walton, D., Tindale, C. W. and Gordon, T. F. (2014). Applying Recent Argumentation Methods to Some Ancient Examples of Plausible Reasoning. Argumentation, 28(1), 85-119.