Pragmatic Languages with Universal Grammars: An Equilibrium Approach
Amparo Urbano (with P. Hernandez and J. Vila), University of Valencia, ERI-CES


Motivation
 Economic agents communicate to reduce uncertainty and achieve coordination, in both complete and incomplete information frameworks.
 Language is a central tool in the process of making decisions.
 Most of the time, communication is noisy. Information transmission may involve different sources of misunderstanding: cultural differences, different mother tongues, different fields of specialization (marketing, finance, ...), non-verbal (unconscious) communication.
 However, the equilibrium approach to communication misunderstandings is not widespread.

Common dictionary or corpus
The speaker/sender and the hearer/receiver share a common dictionary of basic signals {A, B, C}.

Noiseless communication
The speaker/sender utters B and the hearer/receiver receives B: without noise, the heard signal is exactly the uttered one.

Noisy communication
The speaker/sender utters B, but the hearer/receiver may receive A, B, or C, with transition probabilities P(A | B), P(B | B), P(C | B).
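The transition probabilities on this slide can be sketched in code. This is a minimal illustration: the probability values below are made up, since the slide does not fix the channel parameters.

```python
import random

# Hypothetical transition matrix P(output | input) over basic signals A, B, C.
# The numbers are illustrative only; the slide leaves them unspecified.
CHANNEL = {
    "A": {"A": 0.8, "B": 0.1, "C": 0.1},
    "B": {"A": 0.1, "B": 0.8, "C": 0.1},
    "C": {"A": 0.1, "B": 0.1, "C": 0.8},
}

def transmit(signal, rng=random):
    """Draw one noisy output signal given the input signal."""
    r = rng.random()
    cum = 0.0
    for out, p in CHANNEL[signal].items():
        cum += p
        if r < cum:
            return out
    return out  # numerical safety net if probabilities sum to slightly < 1

def transmit_sequence(seq, rng=random):
    """The channel used n times on i.i.d. draws: one output per input signal."""
    return "".join(transmit(s, rng) for s in seq)
```

With this in place, `transmit_sequence("AABAAB")` produces a possibly garbled sequence of the same length, which is the receiver's raw input for inference.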

Inference of meaning
The speaker/sender utters a sequence of basic signals (e.g. AABAAB) and the hearer/receiver hears a noisy output sequence from the message space, each signal perturbed independently with transition probabilities P(A | B), P(B | B), P(C | B). From the heard sequence the receiver must infer the intended meaning.

Pragmatic inference of meaning
The partition of the message space {A,B,C}^3 depends not only on the transition probabilities but also on the context of the communication episode. Our context is a Sender-Receiver game, and the message space is partitioned by a BEST RESPONSE criterion.

Agenda
We construct pure strategies in a Sender-Receiver game with noisy information transmission, based on:
 Coding and inference-of-meaning rules (a pragmatic Language). The coding has a universal grammar, and the meaning-inference model is a partition of the message space.
 We characterize the hearer/receiver's best response in terms of vicinity bounds, in a pragmatic way.
 We measure how far the communicating agents depart from noiseless-information-transmission equilibrium payoffs.
 We calculate the minimum length of the communicative episode needed to guarantee any efficiency payoff approximation.

The basic model: the sender-receiver game Γ
Let Ω = {ω_1, ..., ω_K} be a set of states of nature. We have a game defined by:
 A set of two players: {S, R} (sender and receiver).
 A set of actions for player R: A = {a_1, ..., a_K}.
 A payoff function for both players, given by U(ω, a).
ASSUMPTION: Γ is an aligned-interest game. For each state ω_j we have an action a_j such that U(ω_j, a_j) > U(ω_j, a) for all a ≠ a_j.

Noisy channel
 Players communicate with noise. We follow a unifying approach and consider a discrete noisy channel, mapping input basic signals to output basic signals, to model the general misunderstandings that may appear in information transmission. Over an n-time communication episode, an input sequence x is transformed into an output sequence y.

The extended communication game (communication length n)
 Nature chooses a state ω_j with probability q_j.
 S is informed of the actual state.
 S utters an input sequence of length n to R, through the noisy channel.
 R hears an output sequence of length n and chooses an action accordingly (infers a meaning).
 Payoffs are realized.
Messages are i.i.d. variables.

Strategies of the extended game
SENDER: a map from states to input sequences, σ_S : Ω → X^n.
RECEIVER: a map from output sequences to actions, σ_R : Y^n → A.

Our construction: corpus and pragmatic variations
We construct pure strategies based on a pragmatic Language. This language consists of:
 A Corpus, or set of standard prototypes: sequences of basic signals that are one-to-one with the set of the sender's meanings (= actions). The specific structure of the prototypes is defined by a grammar.
 Pragmatic variations of each standard prototype: the output sequences from which the receiver will infer the meaning associated with the corresponding prototype. Each output sequence is assigned to a particular pragmatic variation in terms of its "vicinity" to the standard prototypes.

Block coding grammar: the corpus
The i-th block of the i-th standard prototype is formed with 0's.
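A sketch of this block-coding grammar, under the assumption (consistent with the n = 6 example below) that the n signals split into K equal blocks and that, in prototype i, block i is all 0's while every other block is all 1's. The all-1's filler is an assumption, not stated on the slide.

```python
def corpus(num_meanings, n):
    """Standard prototypes: n positions split into num_meanings equal blocks;
    block i is all '0', every other block all '1' (assumed filler)."""
    assert n % num_meanings == 0, "n must divide into equal blocks"
    block = n // num_meanings
    protos = []
    for i in range(num_meanings):
        bits = ["1"] * n
        bits[i * block:(i + 1) * block] = ["0"] * block
        protos.append("".join(bits))
    return protos
```

For example, `corpus(3, 6)` yields the three prototypes `001111`, `110011`, `111100`, one per action. Note the construction uses no payoff or prior information, which is the "universal" property claimed on the next slide.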

Why this specific corpus?
 It is universal: it does not depend on the parameters of game Γ (initial probabilities and payoffs), so it can be applied to any sender-receiver game.
 It enables an easy characterization of the receiver's pragmatic variations in terms of the Hamming distance, depending only on the game and noise parameters of any sender-receiver game.
(We have also characterized the pragmatic variations for any feasible corpus grammar, but that characterization depends on some features of the specific coding rule.)

EXAMPLE: the sender-receiver game

EXAMPLE: the noisy channel Communication length: n = 6

EXAMPLE: the corpus

Vicinity measure: Hamming distance
 To characterize the pragmatic variation sets, we need a measure of distance.
 Linguistics uses the Levenshtein distance as a measure of phonological distance between two corpora of phonetic data.
 Given two n-strings x = x_1 x_2 ... x_n and y = y_1 y_2 ... y_n, the Hamming distance between them is d(x, y) = |{i : x_i ≠ y_i}|, the number of positions at which they differ.
 Let h_b(x, y) be the Hamming distance between the b-th blocks of sequences x and y.
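The two distances on this slide translate directly into code:

```python
def hamming(x, y):
    """Hamming distance: number of positions where n-strings x and y differ."""
    assert len(x) == len(y)
    return sum(1 for a, b in zip(x, y) if a != b)

def block_hamming(x, y, b, num_blocks):
    """h_b(x, y): Hamming distance restricted to the b-th block (0-indexed),
    assuming both strings split into num_blocks equal blocks."""
    size = len(x) // num_blocks
    return hamming(x[b * size:(b + 1) * size], y[b * size:(b + 1) * size])
```

For instance, the prototypes `001111` and `110011` are at Hamming distance 4 overall, and their block distances (with 3 blocks) are 2, 2, and 0.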

The receiver's problem
The inferred action d(y) is the solution of the maximization problem: choose the action with the highest expected payoff given the heard sequence y, i.e. d(y) ∈ argmax_{a ∈ A} Σ_j q_j P(y | x(ω_j)) U(ω_j, a).
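This best response can be sketched as follows, assuming the standard Bayesian form in which the receiver weighs each state by its prior and by the likelihood of the heard sequence. The priors, payoffs, and likelihood function in the test below are hypothetical, not the paper's example.

```python
def best_response(y, states, priors, payoff, likelihood):
    """d(y): the action a maximizing sum_j q_j * P(y | omega_j) * U(omega_j, a).

    states     -- iterable of state indices 0..K-1
    priors     -- priors[j] = q_j
    payoff     -- payoff[j][a] = U(omega_j, a); actions indexed like states
    likelihood -- likelihood(y, j) = P(y | sequence uttered in state omega_j)
    """
    num_actions = len(priors)
    def value(a):
        return sum(priors[j] * likelihood(y, j) * payoff[j][a] for j in states)
    return max(range(num_actions), key=value)
```

Because interests are aligned, the sender is happy for the receiver to solve exactly this problem; the vicinity bounds on the next slides characterize its solution combinatorially, in terms of block Hamming distances.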

The Receiver: pragmatic variations. The vicinity bounds
The vicinity bound combines:
 the noise level,
 the relative expected payoff loss of playing action l instead of action k.
It is the largest number of errors permitted in blocks l and k to play action l instead of action k.

An interpretation of the vicinity bounds
The minimum is associated with the maximum relative expected payoff loss of playing action l instead of action k.

The Receiver’s best response.

EXAMPLE: vicinity bounds
Vicinity bounds increase with relative expected payoffs.

EXAMPLE: pragmatic variations (table on slide: the vicinity bounds and the resulting pragmatic variations)

EXAMPLE: pragmatic variations
 Utterances with meaning "action 1"
 Utterances with meaning "action 2"
 Utterances with meaning "action 3"
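Putting the pieces together, one can sketch how the whole message space is carved into sets of utterances with a common meaning. The version below assigns each output sequence to the prototype at minimum Hamming distance, which is a simplification of the slide's vicinity-bound criterion (the real rule also weights relative payoff losses); it is illustrative only.

```python
from itertools import product

def partition(prototypes, alphabet="01"):
    """Assign every output sequence to its nearest prototype
    (ties broken toward the lowest index). A simplified nearest-neighbour
    stand-in for the vicinity-bound assignment rule."""
    n = len(prototypes[0])
    def hamming(x, y):
        return sum(a != b for a, b in zip(x, y))
    cells = {i: [] for i in range(len(prototypes))}
    for seq in ("".join(t) for t in product(alphabet, repeat=n)):
        i = min(range(len(prototypes)), key=lambda k: hamming(seq, prototypes[k]))
        cells[i].append(seq)
    return cells
```

Each cell of the returned partition is one "pragmatic variation": the set of heard utterances from which the receiver infers the corresponding meaning.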

Main result
Given an aligned-interest sender-receiver game, a noisy channel and a finite communication length n, the strategies (σ_S, σ_R) constructed above are a pure-strategy Bayesian Nash equilibrium of the extended noisy communication game.

The sender's truth-telling problem
We must check that the sender has no incentive to send a message different from the prototype for state ω_j when she knows that the actual state of nature is ω_j.

Efficiency of meaning inference
Given a channel, a length n of the communication episode, and game Γ, then for every state the probability of a correct meaning inference is bounded below by 1 minus a polynomial in the channel parameters, and the bound improves as n grows. The vicinity bound depends on both n and the relative expected payoff loss.
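The claim that correct inference is highly probable can be checked empirically: simulate the channel many times and estimate how often nearest-prototype decoding recovers the intended meaning. The error rate, block-coding corpus, and decoding rule below are assumptions for illustration, not the slide's actual parameters.

```python
import random

def estimate_correct_inference(eps=0.1, n=6, K=3, trials=2000, seed=0):
    """Monte Carlo estimate of P(receiver infers the intended meaning),
    for a binary symmetric channel with flip probability eps, the assumed
    block-coding corpus, and minimum-Hamming-distance decoding."""
    rng = random.Random(seed)
    block = n // K
    protos = []
    for i in range(K):
        bits = ["1"] * n
        bits[i * block:(i + 1) * block] = ["0"] * block
        protos.append("".join(bits))
    def hamming(x, y):
        return sum(a != b for a, b in zip(x, y))
    hits = 0
    for _ in range(trials):
        i = rng.randrange(K)                  # nature draws a state uniformly
        sent = protos[i]                      # sender utters its prototype
        heard = "".join(c if rng.random() > eps else "10"[int(c)] for c in sent)
        guess = min(range(K), key=lambda k: hamming(heard, protos[k]))
        hits += (guess == i)
    return hits / trials
```

With these illustrative parameters the estimate comes out well above the probability of zero transmission errors, showing how the vicinity sets absorb misunderstandings rather than merely hoping for a clean channel.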

Ex-ante payoffs efficiency
For any length n of the communication episode, the ex-ante expected payoffs of the extended communication game approximate the ex-ante expected payoff without noise, and the approximation error can be made arbitrarily small by increasing n.

Conclusions
 We have constructed a pragmatic Language with a universal grammar for noisy information-transmission situations.
 We have shown that such a Language is an equilibrium language.
 We have also shown that such a Language is an efficient inference-of-"meaning" model: in spite of initial misunderstandings, the hearer is able to infer the speaker's meaning with high probability.
 Therefore: pragmatic languages with a small number of basic signals support coordination, even when misunderstandings may appear.
 Our analysis can be extended to explain the role of communication in specific situations such as communication in organizations, some types of advertising, market research and sub-cultural languages, among others.

Thank you