1 Issue-based Dialogue Management Thesis
2 overview of thesis contents
1. Introduction
2. Basic issue-based dialogue management
3. Grounding issues
4. Addressing unraised issues
5. Action-oriented and negotiative dialogue
6. Conclusions
3 1. Introduction; goals of thesis
explore and implement issue-based dialogue management
– starting from Ginzburg's theory and other relevant theories (Lewis, Allwood, Clark, Traum, Sidner, …)
– adapt to dialogue system and implement
– extend theory (incl. accommodation, action-oriented dialogue, negotiation)
separate general and domain-dependent phenomena
– theoretical: general theory of dialogue
– practical: minimize effort for adapting to new domains
incrementally extending system to handle increasingly complex types of dialogue
– theoretical: clarifies relation between dialogue types
– practical: reuse of update rules
4 2. Issue-based dialogue management
enquiry-oriented dialogue (database search)
basis:
– Ginzburg's Dialogue Gameboard (DGB) and
– related DGB update protocols
moves: ask, answer, greet, quit
raising and addressing issues
– incl. short answers, e.g. "yes", "no", "paris", "in april"
dialogue plans
sample domain: travel agency
extensions:
– reraising issues
– handling multiple issues
5 Semantics
"FOL" without quantifiers, disjunction, conjunction
Questions
– Y/N-questions: ?P, where P is a proposition
– wh-questions: ?x.p(x) (p is a predicate)
– alt-questions: {?P1, …, ?Pn}
Content of short answers
– individual markers: paris, april, …
– yes, no
6 Semantics, cont'd
Q-A relations (adapted from Ginzburg)
– resolves(A,Q): A resolves Q
  dest-city(paris) resolves ?x.dest-city(x)
– relevant(A,Q): A is relevant to Q (about Q)
  not(dest-city(paris)) is relevant to ?x.dest-city(x), but does not resolve it
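For illustration, here is a minimal executable sketch of these two relations, assuming a toy term encoding (propositions as tuples, wh-questions as ("wh", predicate), y/n-questions as ("yn", proposition)); it illustrates the definitions above and is not the thesis implementation.

```python
# Toy encoding: ("dest-city", "paris") is a proposition, ("not", P) its negation,
# ("wh", "dest-city") is ?x.dest-city(x), ("yn", P) is ?P.

def positive(prop):
    """Strip an outer negation, returning (is_positive, core_proposition)."""
    if prop[0] == "not":
        return False, prop[1]
    return True, prop

def relevant(answer, question):
    """answer is about question (aboutness, adapted from Ginzburg)."""
    _, core = positive(answer)
    if question[0] == "wh":                      # ?x.p(x): any p(a) or not(p(a)) is about it
        return core[0] == question[1]
    if question[0] == "yn":                      # ?P: P or not(P) is about it
        _, qcore = positive(question[1])
        return core == qcore
    return False

def resolves(answer, question):
    """answer resolves question: relevant and informative enough."""
    pos, core = positive(answer)
    if question[0] == "wh":                      # only positive instances resolve ?x.p(x)
        return pos and core[0] == question[1]
    if question[0] == "yn":                      # both P and not(P) resolve ?P
        return relevant(answer, question)
    return False

# dest-city(paris) resolves ?x.dest-city(x); not(dest-city(paris)) is only relevant
assert resolves(("dest-city", "paris"), ("wh", "dest-city"))
assert relevant(("not", ("dest-city", "paris")), ("wh", "dest-city"))
assert not resolves(("not", ("dest-city", "paris")), ("wh", "dest-city"))
```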
7 basic infostate
PRIVATE:
  PLAN: stack(Action)
  AGENDA: OpenQueue(Action)
  BEL: set(Prop)
SHARED:
  COM: set(Prop)
  QUD: stack(Question)
  LU:
    SPEAKER: Speaker
    MOVES: OQueue(Move)
+ module interface variables
  INPUT: String
  LATEST-MOVES: Set(Move)
  LATEST-SPEAKER: Speaker
  NEXT-MOVES: Set(Move)
  OUTPUT: String
+ resource interface variables
  LEXICON: Lexicon
  DOMAIN: Domain
  DATABASE: Database
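A rough Python rendering of this infostate, assuming plain lists and sets stand in for the stack, queue and set types on the slide; it is only an illustration of the structure, with the module and resource interface variables omitted.

```python
from dataclasses import dataclass, field

@dataclass
class LatestUtterance:
    speaker: str = ""                              # LU.SPEAKER
    moves: list = field(default_factory=list)      # LU.MOVES : OQueue(Move)

@dataclass
class Private:
    plan: list = field(default_factory=list)       # PLAN : stack(Action), index 0 = top
    agenda: list = field(default_factory=list)     # AGENDA : OpenQueue(Action)
    bel: set = field(default_factory=set)          # BEL : set(Prop)

@dataclass
class Shared:
    com: set = field(default_factory=set)          # COM : set(Prop)
    qud: list = field(default_factory=list)        # QUD : stack(Question), index 0 = top
    lu: LatestUtterance = field(default_factory=LatestUtterance)

@dataclass
class InfoState:
    private: Private = field(default_factory=Private)
    shared: Shared = field(default_factory=Shared)
```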
8 sample dialogue plan
findout(?x.transport(x))
findout(?x.dest-city(x))
findout(?x.depart-city(x))
findout(?x.dept-month(x))
findout(?x.dept-day(x))
raise({?class(economy), ?class(business)})
consultDB(?x.price(x))
respond(?x.price(x))
9 Answer integration
integrateAnswer: before an answer can be integrated by the system, it must be matched to a question on QUD
pre:
  in($/SHARED/LU/MOVES, answer(A))
  fst($/SHARED/QUD, Q)
  $DOMAIN: about(A, Q)
eff:
  !DOMAIN: combine(Q, A, P)
  add(/SHARED/COM, P)
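A hedged, dict-based sketch of what integrateAnswer does, with about and combine stubbed in place of the DOMAIN resource; the stubs and the toy values are assumptions for illustration only.

```python
def about(answer, question):
    # domain stub: a short answer is about ?x.p(x) if it is a known value
    return question[0] == "wh" and answer in {"paris", "london", "april"}

def combine(question, answer):
    # domain stub: ?x.dest-city(x) + paris -> dest-city(paris)
    return (question[1], answer)

def integrate_answer(infostate):
    """pre: latest move is answer(A), Q is first on QUD, A is about Q
       eff: add combine(Q, A) to SHARED.COM"""
    moves = infostate["shared"]["lu"]["moves"]
    qud = infostate["shared"]["qud"]
    for move in moves:
        if move[0] == "answer" and qud:
            a, q = move[1], qud[0]
            if about(a, q):
                infostate["shared"]["com"].add(combine(q, a))

# usage
state = {"shared": {"lu": {"moves": [("answer", "paris")]},
                    "qud": [("wh", "dest-city")],
                    "com": set()}}
integrate_answer(state)
assert state["shared"]["com"] == {("dest-city", "paris")}
```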
10 basic dialogue with updates
U: "price information please"; raises price issue
– if user asks Q, push respond(Q) on AGENDA
– if respond(Q) on AGENDA and PLAN empty, find plan for Q and load to PLAN
– if findout(Q) first on PLAN, ask Q
S: "where do you want to go?"
U: "Paris"
– if LM=answer(A) and A about Q, add P=Q[A] to SHARED.COM
– if P in SHARED.COM and Q topmost on QUD and P resolves Q, pop QUD
– if P in SHARED.COM and P fulfils goal of findout(Q) and findout(Q) on PLAN, pop PLAN
11 basics cont'd
…
S: "Do you want economy class or business class?"
U: "economy class"
– if consultDB(Q) on PLAN, consult database for answer to Q; store result in PRIVATE.BEL
– if Q on QUD and P in PRIVATE.BEL s.t. P resolves Q, answer(P)
S: "The price is £123"
12 Information sharing across plans
ISIS does not keep track of when propositions were added, or which plan was being executed
so information sharing is determined by question sharing across plans
plan for VISA question:
  findout(?x.dest-city(x))
  findout(?x.citizenship(x))
– shares a question with the plan for ?x.price(x)
so if the visa issue is raised after the price issue, there is no need to ask for the destination again
13 dealing with multiple open issues
if user asks Q, push Q on QUD and load plan for dealing with Q
if user asks Q' while system is dealing with Q, throw out plan for Q but Q remains on QUD; load plan for Q'
when Q' is resolved, Q topmost on QUD will trigger reloading plan for dealing with Q
– general rule: if SHARED.COM contains info resolving Q, don't ask Q
– so any resolved questions in plan will be thrown out
14 Sample dialogue
U: I want price information [raise ?x.price(x)]
S: Where do you want to go?
U: London
S: When do you want to travel?
QUD =
U: Do I need a Visa? [raise ?visa]
– irrelevant followup to U's question -> remove it (not assumed grounded); push new issue raised by U
QUD =
– load plan for dealing with visa-issue
S: Where are you travelling from?
U: Gothenburg
S: No, you don't need a Visa.
15 Sample dialogue, cont'd
S: No, you don't need a Visa.
– visa-issue resolved, so pop off QUD
QUD =
– PLAN empty, so reload plan for dealing with ?x.price(x)
– throw out all questions which have already been resolved; raise the first unresolved question on plan
S: When do you want to leave?
U: April [answer dept-month(april)]
QUD =
S: What day do you want to leave?
…
16 pros and cons of this solution
possibly inefficient since system does not keep track of where it was in the plan before it was thrown out
no way of knowing which issue is currently being dealt with, and thus whether a new plan should be loaded
– possible solution: guarantee that plan always concerns topmost Q on ISSUES
alternative solution is to keep unfinished plans around
– PLAN : Stackset(Pair(Question, Plan))
– however, going through the plan again may be useful in case e.g. some information was removed when dealing with the embedded issue
– possible solution is to keep track of overlap between plans
  – if Q' is raised while dealing with Q, and the plans overlap, throw out plan for Q and reload it later
  – otherwise, keep partially executed plan for Q around
17 3. Grounding issues
feedback types
– action level: contact, perception, understanding, acceptance/integration
– polarity: positive, negative, eliciting (interrogative)
grounding issues
– do I have contact with other DP? what did S say? what did S mean? does H accept what was said/meant?
update strategies
– optimistic (non-cautious, cautious)
– pessimistic
feedback and grounding for a dialogue system
18 Feedback polarity
polarity: positive, negative, interrogative
Examples
– "I don't understand": negative
– "Do you mean that the destination is Paris?": eliciting
– "To Paris.": positive
– "Pardon": negative
19 ICM dialogue moves
Interactive Communication Management
– feedback
– sequencing
– (turntaking)
icm:Level{*Polarity}{:Args}
– icm:sem*pos:String – "I heard you say 'londres'"
– icm:und*neg – "Sorry, I don't understand"
– icm:und*int:AltQ – "Do you mean x or y?"
– icm:und*pos:P – "To Paris."
– icm:acc*neg:Q – "Sorry, I can't answer Q"
– icm:acc*pos – "Okay"
– icm:reraise:Q – "Returning to the issue of Q" [sequencing ICM]
20 Realisation of ICM dialogue moves
Form:
– declarative: "I didn't hear what you said."; "The destination city is Paris."
– interrogative: "What did you say?"; "Do you want to go to Paris?"
– imperative: "Please repeat your latest utterance!"
– elliptical interrogative: "Paris?", "To Paris or from Paris?"; elliptical declarative: "To Paris."
eliciting is always interrogative (possibly elliptical)
21 Implicit feedback
Clark: "relevant followup" to U
– what is relevant?
simple cases for followups to questions:
– answer to question
– "subquestion"
– feedback concerning question
in general, complex inference and knowledge may be needed (implicatures)
– irrelevant followup counts as negative feedback
What about no followup at all?
– in reaction to ask-move or interrogative feedback, counts as negative
– in reaction to answer or positive feedback, counts as positive
22 System feedback for user utterances
contact
– negative ("I didn't hear anything from you.")
perception
– negative: fb-phrase ("Pardon?", "I didn't hear what you said")
– positive: repetition ("I heard 'to paris'")
understanding
– negative: fb-phrase ("I don't quite understand")
– positive: reformulation ("To Paris.")
– eliciting/neutral: reformulation ("To Paris, is that correct?", "To Paris?")
acceptance/integration
– negative: fb-phrase with reformulation ("Sorry, I cannot answer Q", "Sorry, Paris is not a valid destination city.")
– positive: fb-word ("okay.")
23 User feedback for system utterances
contact: -
perception
– negative: fb-phrase ("Pardon?", "I didn't hear what you said")
understanding: -
acceptance/integration
– negative: fb-phrase ("I don't know", "Never mind")
– positive: fb-word ("okay.")
24 Feedback: action levels and associated meta-issues
assume A uttered U to B
– A and B are faced with a number of issues
contact: do A and B have contact?
perception:
– A: does B perceive U (correctly)?
– B: did A say anything? / what did A say? / did A say V?
understanding:
– A: does B understand U (correctly)?
– B: what did A mean? / did A mean C?
acceptance:
– A: does B accept U?
– B: should I accept U?
25 Grounding and action levels
"To ground a thing … is to establish it as part of common ground well enough for current purposes." (Clark)
grounding applies to all action levels
U is grounded on level L = the answer to the grounding issue on level L is positively resolved
grounding assumptions correspond to information state updates in system
– contact, perception: not explicitly modeled
– understanding: SHARED.LU.MOVES
– acceptance: SHARED.QUD, SHARED.COM
26 Grounding update strategies
strategic questions:
– When should U be assumed to be grounded on level L?
  as soon as it has been uttered (of course, the hearer cannot assume grounding until grounding wh-issues have some answer, e.g. "what did A say?")
  if B does not give negative feedback
  when B gives positive feedback
  when B has given eliciting feedback which has been confirmed by A
– What to do if the grounding assumption turns out to be mistaken?
optimism on level L:
– assume U is grounded on level L as soon as U has been uttered
27 Grounding update strategies, cont'd
optimism on level L:
– assume U is grounded on level L as soon as U has been uttered
cautious optimism:
– make sure the optimistic assumption can easily be retracted
pessimism:
– don't assume U grounded until there has been some positive feedback (or at least no negative feedback)
28 Meta-issue: understanding
Ginzburg's meaning question
– ?x.meaning(LU,x)
– "What's the meaning of LU?"
understanding issue
– for speaker who uttered LU with move type m, content c
– or hearer who interpreted LU
– ?und(m(c))
– "Is m(c) a correct interpretation of LU?"
29 Optimistic approach to grounding
assumption that answers to grounding questions are positive
for system utterances
– need to deal with cases where user indicates the optimistic assumption is wrong
– at least for perception and acceptance levels
for user utterances
– need to indicate failure, and on which action level
– if the system fails to understand or accept, don't modify SHARED
30 optimistic understanding update
[control flow: input -> interpret -> update -> select -> generate -> output]
PRIVATE:
  PLAN: stackset(Action)
  AGENDA: Queue(Action)
  BEL: set(Prop)
  TMP: (same type as SHARED)
SHARED:
  COM: set(Prop)
  QUD: stack(Question)
  LU:
    SPEAKER: Speaker
    MOVES: OQueue(Move)
LATEST-MOVES: Set(Move)
LATEST-SPEAKER: Speaker
31 Meta-issue: acceptance
Ginzburg's protocols for acceptance
– LM = ask Q -> consider ?MAX-QUD(Q)
  if yes, push Q on QUD
  otherwise, address ?MAX-QUD(Q)
– LM = assert P -> consider ?MAX-QUD(?P)
  if yes, consider ?P
    if yes, add P to FACTS
    otherwise, address ?P
  otherwise, address ?MAX-QUD(?P)
32 rejections
?MAX-QUD(Q) is answered "no"
– inability to answer Q: "Sorry, I can't answer that question"
– unwillingness to answer Q: "I don't want to discuss that"
?MAX-QUD(?P) is answered "no"
– unwillingness to discuss whether ?P: "I don't want to discuss that"
– other reasons?
?P answered "no"
– "Sorry, I don't agree.", "You're wrong!", "That's impossible!"
a rejection may lead to argumentation
33 problematic cases
S: "Where do you want to go?"
U1: "Nowhere"
U2: "I don't know"
U3: (silence) OR "I want first class!"
do these count as rejections?
– U1: negative answer? presupposition failure? rejection?
– U2: rejection? but not as definite as "No comment!"
– U3: rejection? in any case, irrelevant followup
34 optimistic acceptance update
PRIVATE:
  PLAN: stackset(Action)
  AGENDA: stack(Action)
  BEL: set(Prop)
  TMP: (same type as SHARED)
SHARED:
  COM: set(Prop)
  QUD: stack(Question)
  LU:
    SPEAKER: Speaker
    MOVES: assocSet(Move)
35 choice of strategies in system
system utterances
– optimistically assumed to be grounded on all levels
– negative feedback on perception or acceptance levels -> backtrack to saved state
user utterances
– if problem on any level, give negative fb
– if OK on all levels, update strategy and feedback determined by recognition score S
  S > 0.9: optimistic update, icm:acc*pos
  0.9 >= S > 0.8: optimistic update, icm:acc*pos, icm:und*pos:Content
  0.8 >= S > 0.5: pessimistic update, icm:und*int:Content; if ?und(Content) receives answer "yes", assume Content grounded
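A small sketch of how these score thresholds could select the update strategy and ICM moves for an interpretable user utterance; the move strings follow the icm:Level*Polarity notation, and the behaviour below 0.5 is an assumption not stated on the slide.

```python
def choose_feedback(score, content):
    """Return (update_strategy, icm_moves) for a user utterance that was
    interpretable on all action levels, given recognition score S."""
    if score > 0.9:
        return "optimistic", ["icm:acc*pos"]
    if score > 0.8:
        return "optimistic", ["icm:acc*pos", f"icm:und*pos:{content}"]
    if score > 0.5:
        # pessimistic: only assume grounding once ?und(content) is answered "yes"
        return "pessimistic", [f"icm:und*int:{content}"]
    # assumption: below 0.5, treat as a perception problem and ask for a repeat
    return "reject", ["icm:per*neg"]

print(choose_feedback(0.85, "dest-city(paris)"))
# ('optimistic', ['icm:acc*pos', 'icm:und*pos:dest-city(paris)'])
```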
36 optimistic acceptance
assume positive answer to acceptance issue
we don't need to represent these issues explicitly
– no use representing them unless we can handle argumentation subdialogues to resolve disagreement
for system utterances
– need to deal with cases where user indicates optimistic assumption is wrong
– so far, only for system questions
for user utterances
– need to indicate when optimistic assumption is wrong
– both questions (sys has no plan) and propositions (invalid database parameter)
37 4: Addressing Unraised Issues
ISSUES and QUD
answer integration
question accommodation (to QUD)
issue accommodation (to ISSUES)
reraising issues
multiple issues: modified account
information sharing across plans
reaccommodation (reraising by accommodation)
transitive reaccommodation and reraising
38 problem with QUD
If QUD = <q1, q2> and q1 is resolved, q2 is available for resolution of short answers
– takes no account of how many turns since q2 was raised
– but short answers a long distance away from the question are not as easily processed as an adjacent answer
39 ISSUES and QUD
We extend Ginzburg's DGB by adding ISSUES of type Stack(Question)
ISSUES contains all raised but unresolved questions
– ISSUES determines relevance of user answers
QUD used for resolving short answers
– questions drop off QUD after N turns
– a short answer to a question that's on ISSUES but not QUD requires QUD accommodation!
40 short answer integration
If
– LM=answer(A); A is a short answer
– Q topmost on QUD
– A about Q
then
– P = Q[A]
– add P to SHARED.COM
QUD downdate: if Q topmost on QUD and P in SHARED.COM s.t. resolves(P,Q), pop QUD
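An executable sketch of short-answer integration plus the QUD downdate rule, using the same toy term encoding as the earlier sketches; about and resolves are domain stubs assumed for illustration.

```python
def about(a, q):             # domain stub: short answer a is about any wh-question
    return q[0] == "wh"

def resolves(p, q):          # p(a) resolves ?x.p(x)
    return q[0] == "wh" and p[0] == q[1]

def integrate_short_answer(state, latest_move):
    kind, a = latest_move
    qud, com = state["qud"], state["com"]
    if kind == "answer" and qud and about(a, qud[0]):
        q = qud[0]
        com.add((q[1], a))                  # P = Q[A], add P to SHARED.COM

def downdate_qud(state):
    qud, com = state["qud"], state["com"]
    while qud and any(resolves(p, qud[0]) for p in com):
        qud.pop(0)                          # pop resolved topmost question

state = {"qud": [("wh", "dest-city")], "com": set()}
integrate_short_answer(state, ("answer", "paris"))
downdate_qud(state)
assert state == {"qud": [], "com": {("dest-city", "paris")}}
```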
41 full answer ("assertion") integration
If
– LM=answer(A); A is a proposition
– Q in ISSUES
– A about Q
then
– add A to SHARED.COM
Issue downdate: if Q on ISSUES and P in SHARED.COM s.t. P resolves Q, remove Q from ISSUES
42 issue accommodation (PLAN -> ISSUES)
If
– LM=answer(A)
– no Q in ISSUES s.t. about(A,Q)
then
– find findout(Q) in PLAN s.t. about(A,Q)
– push Q on ISSUES
used when a previously unraised question (available in plan) is answered using a short or full answer
43 question accommodation (ISSUES -> QUD)
If
– LM=answer(A)
– no Q in QUD s.t. about(A,Q)
then
– find Q in ISSUES s.t. about(A,Q)
– push Q on QUD
– raise Q in ISSUES (make Q topmost)
used when
– a previously raised question has dropped off QUD, but is answered using a short answer
– a previously unraised question is answered using a short answer [needs PLAN -> ISSUES accommodation]
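A sketch of the ISSUES -> QUD accommodation rule as a state update; about is a domain stub, and the list-based stacks are an assumption of the toy encoding.

```python
def about(a, q):
    return q[0] == "wh"      # toy stub: any short answer is about any wh-question

def accommodate_question(state, latest_move):
    kind, a = latest_move
    if kind != "answer":
        return
    if any(about(a, q) for q in state["qud"]):
        return                                   # a QUD question already matches
    for q in state["issues"]:
        if about(a, q):
            state["qud"].insert(0, q)            # push Q on QUD
            state["issues"].remove(q)            # raise Q in ISSUES
            state["issues"].insert(0, q)         # (make Q topmost)
            return

state = {"qud": [], "issues": [("wh", "dest-city")]}
accommodate_question(state, ("answer", "paris"))
assert state["qud"] == [("wh", "dest-city")]
```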
44 dependent issue accommodation (DOMAIN -> ISSUES (+PLAN))
If
– LM=answer(A)
– no Q in ISSUES s.t. about(A,Q)
– no findout(Q) in PLAN s.t. about(A,Q)
then
– find Plan for some Q' in DOMAIN s.t. findout(Q) or raise(Q) in Plan and about(A, Q)
– push Q' on ISSUES
– set PLAN to Plan
used when a previously unraised question, unavailable in PLAN, is answered using a full or short answer
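A sketch of dependent issue accommodation; DOMAIN_PLANS is a hypothetical stand-in for the domain resource's plan library, and about is a toy stub, so this only illustrates the rule's shape.

```python
DOMAIN_PLANS = {   # hypothetical domain resource: goal question -> plan
    ("wh", "price"): [("findout", ("wh", "dest-city")),
                      ("findout", ("wh", "dept-month")),
                      ("consultDB", ("wh", "price")),
                      ("respond", ("wh", "price"))],
}

def about(a, q):   # toy stub: "paris"/"london" answer the destination question
    return q == ("wh", "dest-city") and a in {"paris", "london"}

def accommodate_dependent_issue(state, latest_move):
    kind, a = latest_move
    if kind != "answer":
        return
    if any(about(a, q) for q in state["issues"]):
        return
    if any(item[0] == "findout" and about(a, item[1]) for item in state["plan"]):
        return
    for goal_q, plan in DOMAIN_PLANS.items():
        if any(item[0] in ("findout", "raise") and about(a, item[1]) for item in plan):
            state["issues"].insert(0, goal_q)    # push Q' on ISSUES
            state["plan"] = list(plan)           # set PLAN to Plan
            return

state = {"issues": [], "plan": []}
accommodate_dependent_issue(state, ("answer", "paris"))
assert state["issues"] == [("wh", "price")]
```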
45 Question reraising (selection rule, ISSUES -> QUD)
if
– Q on ISSUES
– Q not on QUD
– PLAN empty
– system has no plan for dealing with Q
then
– set next move to ask(Q)
– indicate reraising: "so…" [sequencing ICM]
assumption: questions on QUD have more attention than those merely on ISSUES
need to deal with the state where Q is topmost on ISSUES but not on QUD; this indicates Q needs to be reraised
NOTE: this is a selection rule
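A sketch of this selection rule as a function from infostate to next moves; the move tuples and the domain_plans argument are assumptions of the toy encoding.

```python
def select_reraise(state, domain_plans):
    """If some Q is open on ISSUES but not on QUD, the plan is empty and the
    domain has no plan for Q, select a reraising ICM move plus ask(Q)."""
    if state["plan"]:
        return None
    for q in state["issues"]:
        if q not in state["qud"] and q not in domain_plans:
            return [("icm:reraise", q), ("ask", q)]
    return None

state = {"plan": [], "qud": [], "issues": [("wh", "dept-month")]}
print(select_reraise(state, domain_plans={}))
# [('icm:reraise', ('wh', 'dept-month')), ('ask', ('wh', 'dept-month'))]
```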
46 Sample dialogue
S: hello
U: London please
– transitive issue accommodation: push ?x.price(x) on ISSUES; load plan for ?x.price(x)
– issue accommodation: push ?x.dest-city(x) on ISSUES
– question accommodation: push ?x.dest-city(x) on QUD
– integrate answer
S: Alright. Let's see. Okay. What month do you want to travel?
ISSUES =
U: Right… Do I need a Visa? [raise ?visa]
ISSUES =
S: Let's see. Where are you travelling from?
U: Gothenburg
S: Okay. No, you don't need a Visa.
ISSUES =
PLAN empty; QUD empty
47 Sample dialogue, cont'd
S: No, you don't need a Visa.
ISSUES =
– PLAN empty, so reload plan for dealing with ?x.price(x)
– QUD empty
– reraise ?x.dept-month(x)
S: Returning to the issue of price. Let's see. So, what month do you want to travel?
U: April
ISSUES =
– throw out all questions which have already been resolved; raise the first unresolved question on plan
S: Okay. April. What day do you want to leave?
48 Issue reraising (SHARED.COM -> ISSUES)
If
– LM=ask(Q)
– there is a P in SHARED.COM s.t. about(P,Q)
then
– push Q on ISSUES
– remove P from SHARED.COM
used when a previously resolved question is asked again
reraising should be indicated: "so, …"; reformulation may be needed
49 Issue reaccommodation (SHARED.COM -> ISSUES)
If
– LM=answer(A)
– no Q in ISSUES s.t. about(A,Q)
– P in SHARED.COM s.t. there is a Q s.t. about(A,Q) and about(P,Q)
then
– push Q on ISSUES
– remove P from SHARED.COM
used when a previously resolved question is answered again
50 a problem
What if the user, after the system has given a price, changes his mind?
I.e., a previously resolved question is answered again, and this question influences another resolved question
Q influences Q' if findout(Q) is in the plan for resolving Q'
example:
– ?x.dest-city(x) influences ?x.price(x)
51 Transitive issue reaccommodation (SHARED.COM -> ISSUES)
If
– LM=answer(A)
– no Q in ISSUES s.t. about(A,Q)
– there is a Q' s.t. Q influences Q'
– there is a P in SHARED.COM s.t. about(A,Q) and about(P,Q)
– P' in SHARED.COM is about Q'
then
– push Q' on ISSUES
– remove P' from SHARED.COM
– push Q on ISSUES
– remove P from SHARED.COM
After this, A can be integrated into SHARED.COM, the plan for Q' will be loaded, and since all questions are already answered, the only result will be a new database search
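An executable sketch of transitive issue reaccommodation on a dept-month/price style example; INFLUENCES is an assumed stand-in for the "Q influences Q'" relation derived from the plans, and about is a toy stub.

```python
INFLUENCES = {("wh", "dept-month"): [("wh", "price")]}   # Q -> questions Q influences

def about(p, q):                     # toy aboutness: predicate of p matches q
    return q[0] == "wh" and q[1] == p[0]

def reaccommodate_transitively(state, latest_move):
    kind, a = latest_move            # a is a full proposition, e.g. ("dept-month", "june")
    if kind != "answer" or any(about(a, q) for q in state["issues"]):
        return
    for q, influenced in INFLUENCES.items():
        if not about(a, q):
            continue
        p = next((p for p in state["com"] if about(p, q)), None)
        for q2 in influenced:
            p2 = next((p2 for p2 in state["com"] if about(p2, q2)), None)
            if p is not None and p2 is not None:
                state["issues"][:0] = [q, q2]        # push Q' then Q on ISSUES
                state["com"] -= {p, p2}              # retract the old resolving answers
                return

state = {"issues": [],
         "com": {("dept-month", "april"), ("price", 150)}}
reaccommodate_transitively(state, ("answer", ("dept-month", "june")))
assert state["issues"] == [("wh", "dept-month"), ("wh", "price")]
assert state["com"] == set()
```

After this, the new answer can be integrated and the price plan reloaded, so the only remaining work is a fresh database search, as the slide notes.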
52 sample dialogue
…
S: When do you want to travel?
U: In April
…
S: The price is £150
– ISSUES = <>
U: Hmm, what about June [answer(june)]
– transitive issue reaccommodation
– ISSUES =
– integrate answer(june)
– redo database search for ?x.price(x)
S: The price is £200
53 5: Issues and goals in Action-oriented dialogue
each goal action is associated with a dialogue plan
add SHARED.ACTIONS : OpenStack(Action)
new moves: request(Action), report(Action, Status)
ACTIONS has a similar role to ISSUES
– problem: how to coordinate ISSUES and GOALS?
– simple solution is to prioritize issues
– otherwise, connect each issue to one or more actions
adapt accommodation strategies to AOD
54 integrating requests
If
– LM is request(A)
– $DOMAIN: Plan(A, _Plan)
then
– push(SHARED.ACTIONS, A)
(the findPlan rule will load the plan)
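A sketch of the request-integration rule; DOMAIN_PLANS is a hypothetical plan library, and loading the plan itself is left to a separate findPlan step, as on the slide.

```python
DOMAIN_PLANS = {"add_program": [("findout", ("wh", "channel")),
                                ("findout", ("wh", "start-time"))]}

def integrate_request(state, latest_move):
    """If LM is request(A) and the domain has a plan for A, push A on SHARED.ACTIONS."""
    kind, action = latest_move
    if kind == "request" and action in DOMAIN_PLANS:
        state["actions"].insert(0, action)       # push(SHARED.ACTIONS, A)

state = {"actions": []}
integrate_request(state, ("request", "add_program"))
assert state["actions"] == ["add_program"]
```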
55 Requests vs. answers
a request addresses a general question
– "what shall I do next?", "what can I do for you?" or similar ("prompt")
– semantics in AOD: ?x.action(x)
dialogue
– "What can I do for you?" raises ?x.action(x)
– "Search the phonebook" is interpreted as request(search_phonebook)
Rule:
– If ?x.action(x) is topmost on ISSUES and LM is request(a), then pop ISSUES
56 Questions vs. answers
a question can be regarded as a special type of request
so in a specific sense, questions are also answers…
questions address a general question
– "what issue shall I address next?", "what can I do for you?" or similar ("prompt")
– semantics: ?x.issue(x)
– this question is not presupposed
semantics of "How can I help you?" in IOD is ask(?x.issue(x))
57 Pure AOD and mixed AOD+IOD
Pure AOD
– core moves: instruct, report
– fairly limited
AOD+IOD
– core moves: ask, answer, instruct, report
58 Action-oriented dialogue subtypes
"Command dialogue"
– user instructs system to perform actions
– device control, travel agency booking
"Instructional dialogue"
– system instructs user to perform actions
– "interactive manual": system can explain how to perform some action
joint planning
– user and system jointly agree on a plan
– also involves negotiation
59 Converting menus to plans
Menu -> Plan
– multi-choice list P1, P2, ..., Pn -> findout({?P1, ?P2, ..., ?Pn})
– tick-box +/-P -> findout(?P)
– dialogue window p=___ -> findout(?x.p(x))
– pop-up message M -> inform(M)
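A sketch of this menu-to-plan mapping, assuming a small tagged representation of menu widgets; the widget names and example menu are hypothetical.

```python
def menu_item_to_plan(item):
    """Translate one menu widget into a plan construct, per the table above."""
    kind = item[0]
    if kind == "multi_choice":                            # list P1, ..., Pn
        return ("findout", [("yn", p) for p in item[1]])  # findout({?P1,...,?Pn})
    if kind == "tick_box":                                # +/- P
        return ("findout", ("yn", item[1]))               # findout(?P)
    if kind == "dialogue_window":                         # p = ___
        return ("findout", ("wh", item[1]))               # findout(?x.p(x))
    if kind == "popup":                                   # message M
        return ("inform", item[1])
    raise ValueError(f"unknown menu item: {kind}")

menu = [("dialogue_window", "start-time"),
        ("tick_box", "repeat-weekly"),
        ("popup", "program_added")]
plan = [menu_item_to_plan(i) for i in menu]
print(plan)
```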
60 Menu-based AOD
S: Welcome to the telephone and VCR manager!
U: Channel one today
S: Right, I assume you want to add a program. Let's see. Okay. Channel one. Today. What time do you want to start recording?
U: What time is it?
S: It's one forty five
U: Set the clock to two forty five
S: Okay, set the clock. Two forty five, is that correct?
U: Yes.
S: Okay. The clock was set. Returning to timer recording. So, what time do you want to start recording?
U: eight thirty to nine thirty
S: Okay. Starting at eight thirty. Ending at nine thirty. The program has been added.
61 Issues under negotiation
Sidner: a formal account of negotiative dialogue
an alternative account based on Issues Under Negotiation
proposals and alternative-questions
acceptance, downshift, and negotiation
sample domain: travel agency, negotiating flights
62 Negotiation vs. acceptance
Dialogue action levels (Allwood, Clark):
– 1. A attends to B's utterance
– 2. A perceives B's utterance
– 3. A understands B's utterance
– 4. A reacts to (accepts or rejects) B's utterance
Sidner and others see negotiative dialogue as proposals and acceptances/rejections of proposals
– this means that all dialogue is negotiative
– all assertions (and questions, instructions etc.) are proposals
But some dialogues are negotiative in another sense, by explicitly containing discussions about different solutions to a problem, and finally deciding on one
– negotiation in this sense is not action level 4
63 Two senses of "negotiation"
Negotiation in the "negotiating utterances" sense
– A: flights to Oslo [propose proposition]
– B(1): okay [accept proposition]
– B(2): sorry, there are no flights to Oslo [reject prop.]
Negotiation in the "negotiating alternatives" sense
– U: flights to paris on april 13 please [answer]
– S: there is one flight at 07:45 and one at 12:00 [propose]
– U: what airline is the 12:00 one [ask]
– S: the 12:00 flight is an SAS flight [answer]
– U: I'll take the 12:00 flight please [accept]
64 Issues Under Negotiation in negotiative dialogue
IUN is a question, e.g. what flight to take
In an activity, some questions are marked as negotiable issues
– other questions are assumed to be non-negotiable, e.g. the user's name in a travel agency setting
Each IUN is associated with a set of proposed answers
– ISSUES : set(pair(question, set(answer)))
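A sketch of this ISSUES : set(pair(question, set(answer))) bookkeeping, with propose introducing alternatives and accept resolving the issue; the dict-based encoding and function names are assumptions for illustration.

```python
iun = {("wh", "sel_flight"): set()}                 # which flight to take

def propose(iun, question, alternative):
    """A proposal introduces a new alternative for an issue under negotiation."""
    iun.setdefault(question, set()).add(alternative)

def accept(iun, question, alternative):
    """Accepting a proposed alternative resolves the issue; return the decision."""
    if alternative in iun.get(question, set()):
        del iun[question]
        return (question, alternative)
    return None

propose(iun, ("wh", "sel_flight"), "f1")            # the 07:45 flight
propose(iun, ("wh", "sel_flight"), "f2")            # the 12:00 flight
assert accept(iun, ("wh", "sel_flight"), "f1") == (("wh", "sel_flight"), "f1")
```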
65 Alternatives in negotiation
Alternatives are possible answers to an IUN
a proposal has the effect of introducing a new alternative to the Issue Under Negotiation
An IUN is resolved when an alternative is decided on, i.e. when an answer to it is accepted
In some cases, the answer to an IUN may consist of a set of alternatives (e.g. when buying CDs)
cf. Traum:
– task annotated with negotiation objects
– we have issue "annotated with" alternatives (answers)
66 Example
IUN is ?x.sel_flight(x) ("which is the user's desired flight?")
A: flight to paris, december 13
– answer(dest(paris)) etc.
B: OK, there's one flight leaving at 07:45 and one at 12:00
– propose(f1), propose(f2),
– answer(dep_time(f1,07:45)), answer(dep_time(f2,12:00)), ...
A: I'll take the 07:45 one
– answer(sel_flight(X) & dep_time(X, 07:45))
– after contextual interpretation: answer(sel_flight(f1))
67 [infostate after "B: OK, there's one flight leaving at 07:45 and one at 12:00"]
PRIVATE =
  PLAN = < findout(?x.ccn(x)), book_ticket >
  AGENDA = { findout(?x.sel_flight(x)) }
  BEL = { flight(f1), dep_time(f1,0745), ... }
  TMP = (same structure as SHARED)
SHARED =
  COM = { dep_time(f1,0745), dep_time(f2,1200), dest(paris), ... }
  QUD = <>
  IUN =
  LM = { propose(f1), propose(f2), answer(dep_time(f1,07:45)), ... }
68 Example: non-exclusive alternatives
U: A pizza please
S: What fillings do you want? You can choose cheese, garlic, olives and more.
U: I want cheese
S: Okay, cheese
U: Is the garlic fresh?
S: yes
U: OK, garlic too then
S: Okay, garlic. Anything else?
U: no please
S: Okay, one pizza coming up
69 Proposals and alternative-questions
a wh-question + proposals has the same effect as asking an alt-question
– "Do you want to travel in economy class or business class?" [ask alt-q]
– "How do you want to travel? One option is economy class. The other option is business class." [ask wh-q; propose(economy), propose(business)]
alternate representation of alt-q's
– {?P(a1), …, ?P(aN)}
– ?x.P(x)-{a1, …, aN}
– partial functions for format conversion
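A sketch of the partial conversion between the two alt-question representations; it only succeeds when all alternatives share one predicate, which is what makes the functions partial. The tuple encoding is an assumption.

```python
def altq_to_restricted_wh(altq):
    """{?p(a1),...,?p(an)} -> ("wh", p, {a1,...,an}); partial: requires a
    shared predicate across all alternatives."""
    preds = {p for (_yn, (p, _a)) in altq}
    if len(preds) != 1:
        raise ValueError("not convertible: alternatives use different predicates")
    (pred,) = preds
    return ("wh", pred, {a for (_yn, (_p, a)) in altq})

def restricted_wh_to_altq(whq):
    """?x.P(x)-{a1,...,an} -> {?P(a1),...,?P(an)}."""
    _, pred, domain = whq
    return {("yn", (pred, a)) for a in domain}

altq = {("yn", ("class", "economy")), ("yn", ("class", "business"))}
assert restricted_wh_to_altq(altq_to_restricted_wh(altq)) == altq
```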
70 "Downshift"
original idea (Cohen 1978)
– "gears" in dialogue: "games per turn"; high gear = many latent subgames per turn
– originally used to describe shifting to e.g. referent identification subdialogues
– "latent" referent identification game becomes manifest; cf. finding answer to content-question
extension to acceptance
– if some content is rejected (on grounding level), negotiation may ensue
– latent acceptance game becomes manifest
– cf. finding answer to acceptance questions
– this is where an issue assumed to be non-negotiable becomes negotiable
71 sample dialogue
A: Where do you want to travel?
B: Paris [answer(paris)]
– B optimistically assumes dest-city(paris) in SHARED.COM
A: Sorry, there are no flights to Paris. How about Marseille or Lyons?
– question is reraised and is opened for negotiation; two proposals added
– ISSUES =
72 Argumentative (non-collaborative?) negotiative dialogue
argumentation for/against alternatives (not just exchanging information about them)
requires dialogue moves for argumentation (rhetorical moves)
mostly relevant when DPs must decide something jointly
need to represent DPs' stance towards alternatives + arguments for and against
73 Negotiation does not necessarily concern future actions
S: What's the name of Jim's sister?
U: Sue
S: Are you sure? I think it was Jane
issues =
U: Isn't Jane the name of Jack's sister?
S: Oh right, Sue then.
74 6. Conclusions
Classifying dialogue
Issues and dialogue structure
Summary
75 Classifying dialogue (tables)