
1 Probabilistic Context Free Grammars Grant Schindler 8803-MDM April 27, 2006

2 Problem PCFGs can model a more powerful class of languages than HMMs. Can we take advantage of this property?
[Slide diagram: the Chomsky hierarchy. Regular Language (modeled by Hidden Markov Models (HMMs)) ⊂ Context Free Language (modeled by Probabilistic Context Free Grammars (PCFGs)) ⊂ Context Sensitive Language ⊂ Unrestricted Language.]

3 PCFG Background
Example Grammar (Production Rule, with Probability):
S → N V (1.0)
N → Bob (0.3)
N → Jane (0.7)
V → V N (0.4)
V → loves (0.6)
Example Parse of "Jane loves Bob.":
S → N V → N (V N) → Jane (loves Bob)
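
A minimal sketch (not from the original slides) of how a parse probability falls out of this grammar: the probability of a parse tree is the product of the probabilities of all production rules used in its derivation.

```python
# Minimal sketch: probability of the example parse of "Jane loves Bob."
# under the slide's grammar. A parse tree's probability is the product
# of the probabilities of every production rule used in the derivation.

rule_probs = {
    ("S", ("N", "V")): 1.0,
    ("N", ("Bob",)): 0.3,
    ("N", ("Jane",)): 0.7,
    ("V", ("V", "N")): 0.4,
    ("V", ("loves",)): 0.6,
}

# Rules used in the parse S -> N V -> Jane (V N) -> Jane (loves Bob)
parse_rules = [
    ("S", ("N", "V")),
    ("N", ("Jane",)),
    ("V", ("V", "N")),
    ("V", ("loves",)),
    ("N", ("Bob",)),
]

p = 1.0
for rule in parse_rules:
    p *= rule_probs[rule]

print(p)  # 1.0 * 0.7 * 0.4 * 0.6 * 0.3 = 0.0504
```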

4 PCFG Applications
Natural Language Processing: parsing written sentences
BioInformatics: RNA sequences
Stock Markets: model rise/fall of the Dow Jones (?)
Computer Vision: parsing architectural scenes

5 PCFG Application: Architectural Facade Parsing

6 Goal: Inferring 3D Semantic Structure

7 Discrete vs. Continuous Observations
Discrete values: Natural Language Processing (parsing written sentences), BioInformatics (RNA sequences)
Continuous values: Stock Markets (model rise/fall of the Dow Jones (?))
How do we estimate the parameters of PCFGs with continuous observation densities (terminal nodes in the parse tree)?

8 PCFG Parameter Estimation In the discrete case, there exists an Expectation Maximization (EM) algorithm:
E-Step: Compute the expected number of times each rule (A → BC) is used in generating a given set of observation sequences (based on the previous parameter estimates).
M-Step: Update the parameters as the normalized counts computed in the E-Step.
Essentially: P*(N → Bob) = #Bobs / #Nouns
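
A minimal sketch of the discrete M-step described above, using made-up expected counts in place of real E-step output:

```python
# Minimal sketch of the discrete M-step: given expected rule counts
# from the E-step (here, made-up numbers), re-estimate each rule's
# probability by normalizing over all rules with the same left-hand side.
from collections import defaultdict

# Hypothetical expected counts from the E-step, keyed by (lhs, rhs).
expected_counts = {
    ("N", ("Bob",)): 12.0,
    ("N", ("Jane",)): 28.0,
    ("V", ("V", "N")): 16.0,
    ("V", ("loves",)): 24.0,
}

totals = defaultdict(float)
for (lhs, _), count in expected_counts.items():
    totals[lhs] += count

new_probs = {
    rule: count / totals[rule[0]] for rule, count in expected_counts.items()
}

print(new_probs[("N", ("Bob",))])  # 12 / (12 + 28) = 0.3, i.e. #Bobs / #Nouns
```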

9 Gaussian Parameter Update Equations (NEW!) [Slide shows the update equations; the key quantity is the probability that rule A was applied to generate the observed value at location i, computed from the Inside-Outside Algorithm via the CYK Algorithm.]
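
The equations themselves are images on the slide and do not survive in the transcript. For a rule A with a Gaussian observation density, the standard EM updates take the following form (a hedged reconstruction, where gamma_A(i) is the posterior named above, computed by the Inside-Outside Algorithm):

```latex
% Hedged reconstruction of the standard EM updates for a rule A with a
% Gaussian observation density; \gamma_A(i) is the posterior probability
% that rule A generated observation x_i, from Inside-Outside/CYK.
\[
\mu_A = \frac{\sum_i \gamma_A(i)\, x_i}{\sum_i \gamma_A(i)},
\qquad
\sigma_A^2 = \frac{\sum_i \gamma_A(i)\,(x_i - \mu_A)^2}{\sum_i \gamma_A(i)}
\]
```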

10 Significance We can now begin applying probabilistic context-free grammars to problems with continuous data (e.g. stock market) rather than restricting ourselves to discrete outputs (e.g. natural language, RNA). We hope to find problems for which PCFGs offer a better model than HMMs.

11 Questions

13 Open Problems How do we estimate the parameters of PCFGs with:
A. continuous observation densities (terminal nodes in the parse tree)?
B. continuous values for both non-terminal and terminal nodes?

14 CYK Algorithm / Inside-Outside Probabilities
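
A minimal sketch of how the inside probabilities referenced here can be computed with a CYK-style dynamic program over a grammar in Chomsky Normal Form, reusing the example grammar from slide 3 (an illustration, not the author's implementation):

```python
# Minimal sketch of CYK-style inside probabilities for a PCFG in
# Chomsky Normal Form: beta[(A, i, j)] = P(A derives words i..j).
from collections import defaultdict

binary_rules = {("S", ("N", "V")): 1.0, ("V", ("V", "N")): 0.4}
unary_rules = {("N", "Bob"): 0.3, ("N", "Jane"): 0.7, ("V", "loves"): 0.6}

def inside_probs(words):
    n = len(words)
    beta = defaultdict(float)
    # Base case: spans of length 1, covered by lexical (unary) rules.
    for i, w in enumerate(words):
        for (lhs, rhs), p in unary_rules.items():
            if rhs == w:
                beta[(lhs, i, i)] += p
    # Recursive case: combine adjacent sub-spans with binary rules.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # split point between sub-spans
                for (lhs, (b, c)), p in binary_rules.items():
                    beta[(lhs, i, j)] += p * beta[(b, i, k)] * beta[(c, k + 1, j)]
    return beta

beta = inside_probs(["Jane", "loves", "Bob"])
print(beta[("S", 0, 2)])  # 0.0504: the sentence probability under the grammar
```

Outside probabilities are computed by a complementary top-down pass; together with the inside probabilities they yield the rule posteriors used in the E-step.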

