
Slide 1: Final HPSGs
Cleaning up and final aspects, semantics, and an overview of statistical NLP.
Instructor: Nick Cercone, 3050 CSEB, nick@cse.yorku.ca

Slide 2: HPSGs – An Overlooked Topic: Complements vs. Modifiers
• Intuitive idea: complements introduce essential participants in the situation denoted; modifiers refine the description.
• The distinction is generally accepted, but there are disputes over individual cases.
• Linguists rely on heuristics to decide how to analyze questionable cases (usually PPs).

Slide 3: HPSGs – Heuristics for Complements vs. Modifiers
• Obligatory PPs are usually complements.
• Temporal and locative PPs are usually modifiers.
• An entailment test: if "X Ved (NP) PP" does not entail "X did something PP", then the PP is a complement.
Examples:
– "Pat relied on Chris" does not entail "Pat did something on Chris" (on Chris is a complement)
– "Pat put nuts in a cup" does not entail "Pat did something in a cup" (in a cup is a complement)
– "Pat slept until noon" does entail "Pat did something until noon" (until noon is a modifier)
– "Pat ate lunch at Bytes" does entail "Pat did something at Bytes" (at Bytes is a modifier)

Slide 4: HPSGs – Agreement
• Two kinds so far (namely?)
• Both initially handled via stipulation in the Head-Specifier Rule.
• But if we want to use this rule for categories that don't have the AGR feature (such as PPs and APs in English), we can't build agreement into the rule.

Slide 5: HPSGs – The Specifier-Head Agreement Constraint (SHAC)
Verbs and nouns must be specified so that their own AGR value is identical to the AGR value of their specifier.

Slide 6: HPSGs – The Count/Mass Distinction
• Partially semantically motivated:
– mass terms tend to refer to undifferentiated substances (air, butter, courtesy, information)
– count nouns tend to refer to individuatable entities (bird, cookie, insult, fact)
• But there are exceptions:
– succotash (mass) denotes a mix of corn and lima beans, so it is not undifferentiated
– furniture, footwear, cutlery, etc. are mass terms that refer to individuatable artifacts
– cabbage can be either count or mass, but many speakers get lettuce only as mass

Slide 7: HPSGs – Semantics – The Linguist's Stance: Building a Precise Model
• Some statements are about how the model works: "[prep] and [AGR 3sing] cannot be combined because AGR is not a feature of the type prep."
• Some statements are about how (we think) English, or language in general, works: "The determiners a and many only occur with count nouns, the determiner much only occurs with mass nouns, and the determiner the occurs with either."
• Some statements are about how we code a particular linguistic fact within the model: "All count nouns are [SPR < [COUNT +] >]."

Slide 8: HPSGs – Semantics – The Linguist's Stance: A Vista on the Set of Possible English Sentences
• ... as a background against which linguistic elements (words, phrases) have a distribution
• ... as an arena in which linguistic elements "behave" in certain ways

Slide 9: HPSGs – Semantics
So far, our "grammar" has no semantic representations. We have, however, been relying on semantic intuitions in our argumentation, and discussing semantic contrasts where they line up (or don't) with syntactic ones. Examples?
• structural ambiguity
• S/NP parallelism
• count/mass distinction
• complements vs. modifiers

Slide 10: HPSGs – Semantics
Aspects of meaning we won't account for:
• Pragmatics
• Fine-grained lexical semantics

Slide 11: HPSGs – Semantics – Our Slice of a World of Meanings
"... the linguistic meaning of Chris saved Pat is a proposition that will be true just in case there is an actual situation that involves the saving of someone named Pat by someone named Chris."

Slide 12: HPSGs – Semantics – Our Slice of a World of Meanings
What we are accounting for is the compositionality of sentence meaning:
• How the pieces fit together: semantic arguments and indices.
• How the meanings of the parts add up to the meaning of the whole: appending RESTR lists up the tree.

Slide 13: HPSGs – Semantics in Constraint-Based Grammar
Constraints as generalized truth conditions:
• proposition: what must be the case for a proposition to be true
• directive: what must happen for a directive to be fulfilled
• question: the kind of situation the asker is asking about
• reference: the kind of entity the speaker is referring to
Syntax/semantics interface: constraints on how syntactic arguments are related to semantic ones, and on how semantic information is compiled from different parts of the sentence.

Slide 14: HPSGs – Semantics – Feature Geometry

Slide 15: HPSGs – Semantics – How the pieces fit together

Slide 16: HPSGs – Semantics – How the pieces fit together

Slide 17: HPSGs – Semantics – How the pieces fit together

Slide 18: HPSGs – Semantics (pieces together)

Slide 19: HPSGs – Semantics (more detailed view of same tree)

Slide 20: HPSGs – Semantics
To fill in the semantics of the S node, we need the two semantic principles.
The Semantic Inheritance Principle: in any headed phrase, the mother's MODE and INDEX are identical to those of the head daughter.
The Semantic Compositionality Principle: in any well-formed phrase structure, the mother's RESTR value is the sum of the RESTR values of the daughters.
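To make the two principles concrete, here is a minimal Python sketch using plain dicts as stand-in feature structures. The MODE/INDEX/RESTR layout follows the slides informally; the function and relation names are hypothetical, not the course's actual formalism.

```python
def apply_semantic_principles(head_daughter, other_daughters):
    """Build the mother node's semantics from its daughters."""
    mother = {}
    # Semantic Inheritance Principle: MODE and INDEX come from the head daughter.
    mother["MODE"] = head_daughter["MODE"]
    mother["INDEX"] = head_daughter["INDEX"]
    # Semantic Compositionality Principle: RESTR is the sum (concatenation)
    # of the RESTR lists of all daughters.
    mother["RESTR"] = list(head_daughter["RESTR"])
    for d in other_daughters:
        mother["RESTR"] += d["RESTR"]
    return mother

# "A cat slept": the VP is the head daughter, the NP is the specifier.
vp = {"MODE": "prop", "INDEX": "s1",
      "RESTR": [{"RELN": "sleep", "SIT": "s1", "SLEEPER": "i"}]}
np = {"MODE": "ref", "INDEX": "i",
      "RESTR": [{"RELN": "cat", "INST": "i"},
                {"RELN": "exist", "BV": "i"}]}

s = apply_semantic_principles(vp, [np])
# s["MODE"] == "prop", s["INDEX"] == "s1", and s["RESTR"] contains all
# three predications gathered from both daughters.
```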

Slide 21: HPSGs – Semantics – semantic inheritance illustrated

Slide 22: HPSGs – Semantics – semantic compositionality illustrated

Slide 23: HPSGs – Semantics – what identifies indices

Slide 24: HPSGs – Semantics – summary
Words:
• contribute predications
• 'expose' one index in those predications, for use by words or phrases
• relate syntactic arguments to semantic arguments

Slide 25: HPSGs – Semantics – summary
Grammar rules identify feature structures (including the INDEX value) across daughters:
• Head-Specifier Rule
• Head-Complement Rule
• Head-Modifier Rule

Slide 26: HPSGs – Semantics – summary
Grammar rules identify feature structures (including the INDEX value) across daughters, and license trees which are subject to the semantic principles:
• SIP: 'passes up' MODE and INDEX from the head daughter

Slide 27: HPSGs – Semantics – summary
Grammar rules identify feature structures (including the INDEX value) across daughters, and license trees which are subject to the semantic principles:
• SIP: 'passes up' MODE and INDEX from the head daughter
• SCP: 'gathers up' predications (the RESTR lists) from all daughters

Slide 28: HPSGs – other aspects of semantics
• Tense and quantification (only touched on here)
• Modification
• Coordination
• Structural ambiguity

Slide 29: HPSGs – what we are trying to do
Objectives:
• Develop a theory of knowledge of language.
• Represent linguistic information explicitly enough to distinguish well-formed from ill-formed expressions.
• Be parsimonious, capturing linguistically significant generalizations.
Why formalize?
• To formulate testable predictions.
• To check for consistency.
• To make it possible to get a computer to do it for us.

Slide 30: HPSGs – how we construct sentences
The components of our grammar:
• Grammar rules
• Lexical entries
• Principles
• Type hierarchy (very preliminary, so far)
• Initial symbol (S, for now)
We combine constraints from these components.
• Question: what says we have to combine them? (see the sketch below)
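One way to picture how constraints from the rules, the lexicon, and the principles combine is unification: a structure is licensed only if all the applicable descriptions can be satisfied at once. Below is a rough Python sketch of unification over nested dicts; it is an illustration only, and it omits the reentrancy tags that identify values across daughters. The "nonempty" value and the feature names are hypothetical stand-ins.

```python
def unify(fs1, fs2):
    """Unify two feature structures represented as nested dicts.
    Returns the combined structure, or None on a conflict."""
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        # Atomic values must match exactly.
        return fs1 if fs1 == fs2 else None
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:          # conflicting constraints: no structure
                return None
            result[feat] = sub
        else:                        # constraint only one side imposes
            result[feat] = val
    return result

# A rule's constraint on its head daughter unifying with a lexical entry;
# an AGR clash anywhere would make unify return None instead.
rule_constraint = {"HEAD": {"POS": "noun"}, "SPR": "nonempty"}
lex_cat = {"HEAD": {"POS": "noun", "AGR": "3sing"}, "SPR": "nonempty"}
print(unify(rule_constraint, lex_cat))
# {'HEAD': {'POS': 'noun', 'AGR': '3sing'}, 'SPR': 'nonempty'}
```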

Slide 31: HPSGs – an example
A cat slept.
• Can we build this with our tools?
• Given the constraints our grammar puts on well-formed sentences, is this one?

Slide 32: HPSGs – lexical entry for "a"
• Is this a fully specified description?
• What features are unspecified?
• How many word structures can this entry license?

Slide 33: HPSGs – lexical entry for "cat"
• Which feature paths are abbreviated, and is this fully specified?
• What features are unspecified?
• How many word structures can this entry license?

Slide 34: HPSGs – Effect of Principles: the SHAC

Slide 35: HPSGs – Description of Word Structures for cat

Slide 36: HPSGs – Description of Word Structures for a

Slide 37: HPSGs – Building a Phrase

Slide 38: HPSGs – Constraints Contributed by Daughter Subtrees

Slide 39: HPSGs – Constraints Contributed by the Grammar Rule

Slide 40: HPSGs – A Constraint Involving the SHAC

Slide 41: HPSGs – Effects of the Valence Principle

Slide 42: HPSGs – Effects of the Head Feature Principle

Slide 43: HPSGs – Effects of the Semantic Inheritance Principle

Slide 44: HPSGs – Effects of the Semantic Compositionality Principle

Slide 45: HPSGs – Is the Mother Node Now Completely Specified?

Slide 46: HPSGs – Lexical Entry for slept

Slide 47: HPSGs – Another Head-Specifier Phrase

Slide 48: HPSGs – Is this description fully specified?

Slide 49: HPSGs – Does the top node satisfy the initial symbol?

Slide 50: HPSGs – RESTR of the S node

Slide 51: HPSGs – Another example

Slide 52: HPSGs – Head Features from Lexical Entries

Slide 53: HPSGs – Head Features from Lexical Entries, plus HFP

Slide 54: HPSGs – Valence Features: Lexicon, Rules, and the Valence Principle

Slide 55: HPSGs – Required Identities: Grammar Rules

Slide 56: HPSGs – Two Semantic Features: the Lexicon & SIP

Slide 57: HPSGs – RESTR Values and the SCP

Slide 58: HPSGs – An Ungrammatical Example
• What's wrong with this sentence? The Valence Principle and the Head-Specifier Rule.

Slide 59: HPSGs – Overview
• Information movement in trees
• Exercise in critical thinking
• SPR and COMPS
• Technical details (lexical entries, trees)
• Analogies to other systems you might know, e.g., how is the type hierarchy like an ontology?

Slide 60: Statistical NLP – Introduction
NLP as we have examined it thus far can be contrasted with statistical NLP. For example, statistical parsing researchers assume that there is a continuum and that the only distinction to be drawn is between the correct parse and all the rest. The "parse" given by the parse tree on the right would support this continuum view. For statistical NLP researchers, there is no difference between parsing and syntactic disambiguation: it's parsing all the way!

Slide 61: Statistical NLP
Statistical NLP is normally taught in two parts. Part I lays out the mathematical and linguistic foundations that the rest builds on: concepts and techniques referred to throughout the course. Part II covers word-centered work in statistical NLP. There is a natural progression from simple to complex linguistic phenomena in collocations, n-gram models, word sense disambiguation, and lexical acquisition. This work is followed by techniques that build on each other: Markov models, tagging, probabilistic context-free grammars, and probabilistic parsing. Finally, other applications and techniques are introduced: statistical alignment and machine translation, clustering, information retrieval, and text categorization.

Slide 62: Statistical NLP – What we will discuss
1. Information Retrieval and the Vector Space Model
Typical IR system architecture, steps in document and query processing in IR, the vector space model, tf-idf (term frequency–inverse document frequency) weights, the term weighting formula, the cosine similarity measure, the term-by-document matrix, reducing the number of dimensions, Latent Semantic Analysis, IR evaluation.
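As a concrete illustration of tf-idf weighting and the cosine similarity measure, here is a short Python sketch. It uses one common tf-idf variant (raw term frequency times log inverse document frequency); the exact weighting formula discussed in the course may differ.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse tf-idf vectors for a list of tokenized documents."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

docs = ["the cat sat".split(), "the cat slept".split(), "dogs bark".split()]
vecs = tfidf_vectors(docs)
print(cosine(vecs[0], vecs[1]))  # > 0, driven by the shared term "cat";
# "the" occurs in all documents, so its idf (and weight) is zero.
```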

Slide 63: Statistical NLP – What we will discuss
2. Text Classification
Text classification and text clustering, types of text classification, evaluation measures in text classification, F-measure; evaluation methods for classification: general issues (overfitting and underfitting) and methods: 1. training error, 2. train and test, 3. n-fold cross-validation.
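A sketch of method 3, n-fold cross-validation, in Python. The train_fn interface is hypothetical; any trainer that returns a classifier callable would fit.

```python
import random

def n_fold_cross_validation(data, labels, train_fn, n=10, seed=0):
    """Estimate accuracy by n-fold cross-validation.
    train_fn(train_pairs) must return a classifier: item -> label."""
    pairs = list(zip(data, labels))
    random.Random(seed).shuffle(pairs)          # avoid ordered-data bias
    folds = [pairs[i::n] for i in range(n)]     # n roughly equal folds
    correct = total = 0
    for i in range(n):
        # Hold out fold i for testing; train on everything else.
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        classifier = train_fn(train)
        correct += sum(1 for x, y in test if classifier(x) == y)
        total += len(test)
    return correct / total
```

Unlike a single train/test split, every example is used for testing exactly once, which gives a less noisy estimate at the cost of training n classifiers.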

Slide 64: Statistical NLP – What we will discuss
3. Parser Evaluation, Text Clustering and CNG Classification
Parser evaluation: PARSEVAL measures, labeled and unlabeled precision and recall, F-measure. Text clustering: task definition, the simple k-means method, hierarchical clustering, divisive and agglomerative clustering. Evaluation of clustering: inter-cluster similarity, cluster purity, use of entropy or information gain. CNG: the Common N-Grams classification method.
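A compact Python sketch of the CNG (Common N-Grams) method as usually described (Kešelj et al. 2003): build a profile of the most frequent character n-grams per class, and assign a text to the class with the least dissimilar profile. The profile length and n below are illustrative parameters, and the dissimilarity formula is the relative-difference measure commonly attributed to that work.

```python
from collections import Counter

def profile(text, n=3, length=500):
    """Normalized frequencies of the `length` most common character n-grams."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values()) or 1            # guard for very short text
    return {g: c / total for g, c in grams.most_common(length)}

def cng_dissimilarity(p1, p2):
    """Sum over the union of n-grams of squared relative frequency differences."""
    d = 0.0
    for g in set(p1) | set(p2):
        f1, f2 = p1.get(g, 0.0), p2.get(g, 0.0)
        d += ((f1 - f2) / ((f1 + f2) / 2)) ** 2
    return d

def classify(text, class_profiles):
    """Assign the class whose profile is nearest to the text's profile."""
    p = profile(text)
    return min(class_profiles,
               key=lambda c: cng_dissimilarity(p, class_profiles[c]))
```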

Slide 65: Statistical NLP – What we will discuss
4. Probabilistic Modeling and the Joint Distribution Model
Elements of probability theory, generative models, Bayesian inference; probabilistic modeling: random variables, random configurations, computational tasks in probabilistic modeling, a spam detection example, the joint distribution model, drawbacks of the joint distribution model.
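A toy joint distribution model for the spam example, in Python. The probability table is invented purely for illustration; the point is that inference reduces to summing table entries, and that the table grows exponentially (2^n entries for n binary variables), which is the joint distribution model's main drawback.

```python
# Joint distribution over three binary variables:
# Spam, ContainsFree, ContainsMeeting. Values are made up but sum to 1.
P = {
    (0, 0, 0): 0.25, (0, 0, 1): 0.05, (0, 1, 0): 0.08, (0, 1, 1): 0.12,
    (1, 0, 0): 0.10, (1, 0, 1): 0.02, (1, 1, 0): 0.30, (1, 1, 1): 0.08,
}

def prob(**fixed):
    """Marginal probability: sum the table over all unfixed variables."""
    names = ("spam", "free", "meeting")
    return sum(p for assign, p in P.items()
               if all(assign[names.index(k)] == v for k, v in fixed.items()))

# Bayesian inference by definition of conditional probability:
# P(spam=1 | free=1) = P(spam=1, free=1) / P(free=1)
posterior = prob(spam=1, free=1) / prob(free=1)
print(posterior)  # 0.38 / 0.58, about 0.655
```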

Slide 66: Statistical NLP – What we will discuss
5. Fully Independent Model and Naive Bayes Model
The fully independent model: example, computational tasks, sum-product formula. The Naive Bayes model: motivation, assumption, computational tasks, example, number of parameters, pros and cons. The n-gram model and language modeling in speech recognition.
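A minimal multinomial Naive Bayes sketch in Python with add-one smoothing. This is a common textbook formulation; the toy training data is invented.

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(docs, labels):
    """Multinomial Naive Bayes over bags of words, add-one smoothing."""
    label_counts = Counter(labels)
    word_counts = defaultdict(Counter)          # label -> word -> count
    vocab = set()
    for doc, y in zip(docs, labels):
        word_counts[y].update(doc)
        vocab.update(doc)
    def classify(doc):
        best, best_lp = None, -math.inf
        for y in label_counts:
            # log prior + sum of log likelihoods; the Naive Bayes
            # assumption: words are conditionally independent given the class.
            lp = math.log(label_counts[y] / len(labels))
            denom = sum(word_counts[y].values()) + len(vocab)
            for w in doc:
                lp += math.log((word_counts[y][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = y, lp
        return best
    return classify

clf = train_naive_bayes(
    [["free", "money", "now"], ["meeting", "at", "noon"]],
    ["spam", "ham"])
print(clf(["free", "money"]))  # spam
```

The number of parameters is linear in the vocabulary size per class, instead of exponential as in the full joint model; that is the trade the independence assumption buys.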

Slide 67: Statistical NLP – What we will discuss
6. N-gram Model
The n-gram model: the n-gram model assumption, graphical representation, use of log probabilities. Markov chains: stochastic processes, Markov processes, Markov chains. Perplexity and evaluation of n-gram models; text classification using language models.
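A bigram language model sketch in Python, using add-one smoothing and log probabilities, with a perplexity function. The `<s>`/`</s>` padding symbols are a standard convention assumed here.

```python
import math
from collections import Counter

def train_bigram(sentences):
    """Bigram model with add-one smoothing over <s>-padded sentences."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        vocab.update(toks)
        unigrams.update(toks[:-1])              # context counts
        bigrams.update(zip(toks, toks[1:]))
    V = len(vocab)
    def logprob(s):
        toks = ["<s>"] + s + ["</s>"]
        # Work in log space so long texts don't underflow to zero.
        return sum(math.log((bigrams[(a, b)] + 1) / (unigrams[a] + V))
                   for a, b in zip(toks, toks[1:]))
    return logprob

def perplexity(logprob, sentences):
    """exp of minus the average log probability per token; lower is better."""
    lp = sum(logprob(s) for s in sentences)
    n = sum(len(s) + 1 for s in sentences)      # +1 for each </s>
    return math.exp(-lp / n)

lm = train_bigram([["the", "cat", "slept"], ["the", "dog", "slept"]])
print(perplexity(lm, [["the", "cat", "slept"]]))
```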

Slide 68: Statistical NLP – What we will discuss
7. Hidden Markov Model
Smoothing: add-one (Laplace) smoothing, Witten-Bell smoothing. Hidden Markov Models: graphical representations, the HMM assumption, an HMM POS-tagging example, and the Viterbi algorithm (use of dynamic programming in HMMs).
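A sketch of the Viterbi algorithm in Python, showing the dynamic programming table and back-pointers. The parameter layout (dicts of log probabilities) is hypothetical; a real POS tagger would estimate these from a tagged corpus and smooth unseen transitions and emissions.

```python
def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely state sequence for `obs` under an HMM.
    log_start[s], log_trans[r][s], log_emit[s][word] are log probabilities."""
    # V[t][s]: log prob of the best path over obs[:t+1] ending in state s.
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for word in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            # Best predecessor for state s at this position.
            prev = max(states, key=lambda r: V[-1][r] + log_trans[r][s])
            row[s] = V[-1][prev] + log_trans[prev][s] + log_emit[s][word]
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    # Follow back-pointers from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

Dynamic programming keeps only the best path into each state at each position, so the cost is linear in sentence length rather than exponential in it.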

Slide 69: Statistical NLP – What we will discuss
8. Bayesian Networks
Bayesian networks: definition, example; computational tasks in Bayesian networks: evaluation, sampling, inference by brute force; general inference in Bayesian networks is NP-hard; efficient inference in Bayesian networks.
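A brute-force inference sketch for a tiny Bayesian network in Python; the network and its numbers are invented. Enumerating every configuration works at this size but is exponential in the number of variables, which is why general inference is NP-hard and efficient algorithms exploit the network structure instead.

```python
from itertools import product

# Toy network: Spam -> ContainsFree, Spam -> ContainsMeeting.
# Each local factor is a conditional probability table.
p_spam = {1: 0.4, 0: 0.6}
p_free_given = {1: {1: 0.6, 0: 0.4}, 0: {1: 0.1, 0: 0.9}}   # P(free | spam)
p_meet_given = {1: {1: 0.1, 0: 0.9}, 0: {1: 0.3, 0: 0.7}}   # P(meet | spam)

def joint(spam, free, meet):
    """Chain rule along the network: product of local CPT entries."""
    return p_spam[spam] * p_free_given[spam][free] * p_meet_given[spam][meet]

def query(var_index, evidence):
    """Brute-force inference: sum the joint over every configuration
    consistent with the evidence, then normalize."""
    dist = {0: 0.0, 1: 0.0}
    for cfg in product([0, 1], repeat=3):
        if all(cfg[i] == v for i, v in evidence.items()):
            dist[cfg[var_index]] += joint(*cfg)
    z = dist[0] + dist[1]
    return {v: p / z for v, p in dist.items()}

# P(Spam | ContainsFree = 1); variables indexed 0=spam, 1=free, 2=meet.
print(query(0, {1: 1}))  # {0: 0.2, 1: 0.8}
```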

Slide 70: Other Concluding Remarks
ATOMYRIADES
Nature, it seems, is the popular name
for milliards and milliards and milliards
of particles playing their infinite game
of billiards and billiards and billiards.

