cs3102: Theory of Computation, Class 7: Context-Free Languages. Spring 2010, University of Virginia. David Evans.

Menu: Nondeterministic PDAs; Reviewing Machine Models of Computing; Linguistic Models of Computing

DPDA Recap: DFA/ε + Stack. States q0 through q3, with transitions written as (input, pop → push): q0 to q1 on (ε, ε → $); q1 to q1 on (a, ε → +); q1 to q2 on (b, + → ε); q2 to q2 on (b, + → ε); q2 to q3 on (ε, $ → ε). Processing aabb, shown as (state, remaining input, stack): (q0, aabb, ε) ⇒ (q1, aabb, $) ⇒ (q1, abb, +$) ⇒ (q1, bb, ++$) ⇒ (q2, b, +$) ⇒ (q2, ε, $) ⇒ (q3, ε, ε).
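The trace above can be reproduced mechanically. Below is a minimal Python sketch of a DPDA simulator for this machine; the transition encoding and names are mine, not the slides'.

```python
# A minimal sketch of a DPDA simulator for the machine above. Encoding (mine, not the slides'):
#   (state, input symbol or None, stack top to pop or None) -> (next state, string to push)
# None for the input symbol means an epsilon-move; None for the pop means the move ignores the stack.

DELTA = {
    ("q0", None, None): ("q1", "$"),   # push the bottom-of-stack marker
    ("q1", "a",  None): ("q1", "+"),   # count each 'a'
    ("q1", "b",  "+"):  ("q2", ""),    # first 'b': start matching
    ("q2", "b",  "+"):  ("q2", ""),    # match remaining 'b's against '+'s
    ("q2", None, "$"):  ("q3", ""),    # marker exposed again: move to the accepting state
}
ACCEPTING = {"q3"}

def step(state, remaining, stack):
    """Apply the unique applicable transition, or return None if the machine is stuck."""
    for (s, sym, pop), (t, push) in DELTA.items():
        if s != state:
            continue
        if sym is not None and (not remaining or remaining[0] != sym):
            continue
        if pop is not None and (not stack or stack[0] != pop):
            continue
        new_remaining = remaining[1:] if sym is not None else remaining
        new_stack = push + (stack[1:] if pop is not None else stack)
        return t, new_remaining, new_stack
    return None

def run_dpda(w):
    config = ("q0", w, "")            # stack kept as a string, top at index 0
    trace = [config]
    while (nxt := step(*config)) is not None:
        config = nxt
        trace.append(config)
    state, remaining, _ = config
    return state in ACCEPTING and remaining == "", trace

accepted, trace = run_dpda("aabb")
for state, remaining, stack in trace:
    print(state, remaining or "ε", stack or "ε")
print("accepted" if accepted else "rejected")
```

Running it on aabb reproduces the configuration trace on the slide and reports accepted.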

Adding Nondeterminism: DFA vs. NFA. Both recognize exactly the Regular Languages. DFA configuration: one state. NFA configuration: a set of states. What does it mean to add nondeterminism to a DPDA?

Adding Nondeterminism to DPDA. DPDA transition: a, h_p → h_t (read a, pop stack top h_p, push h_t); configuration: one state + one stack; languages recognized: ? NPDA transition: a, h_p → h_t, but now several transitions may apply to the same (state, input, stack top); configuration: a set of (state, stack) pairs; languages recognized: ? + ?

Example NPDA. States q0 through q3; transitions: q0 to q1 on (ε, ε → $); q1 to q1 on (a, ε → A) and on (b, ε → B); q1 to q2 on (ε, ε → ε); q2 to q2 on (a, A → ε) and on (b, B → ε); q2 to q3 on (ε, $ → ε). Now the ε-transition is optional: there can be multiple possible edges for a state on a given (a, h) input.
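To make the "set of (state, stack) pairs" view concrete, here is a hedged Python sketch that simulates the reconstructed machine above by breadth-first search over configurations; the dictionary encoding and function names are mine, not the slides'. As reconstructed, the machine appears to accept strings of the form w·wᴿ: it pushes A/B while guessing the midpoint, then matches in reverse.

```python
from collections import deque

# NPDA transitions, encoding as in the earlier DPDA sketch but with sets of targets:
#   (state, input symbol or None, stack top to pop or None) -> set of (next state, push string)
DELTA = {
    ("q0", None, None): {("q1", "$")},
    ("q1", "a",  None): {("q1", "A")},   # push A for each 'a' read before the guessed midpoint
    ("q1", "b",  None): {("q1", "B")},   # push B for each 'b'
    ("q1", None, None): {("q2", "")},    # nondeterministically guess the midpoint
    ("q2", "a",  "A"):  {("q2", "")},    # match the second half against the stack, in reverse
    ("q2", "b",  "B"):  {("q2", "")},
    ("q2", None, "$"):  {("q3", "")},
}
ACCEPTING = {"q3"}

def reachable_configs(w):
    """Breadth-first search over configurations (state, input position, stack)."""
    start = ("q0", 0, "")
    seen, frontier = {start}, deque([start])
    while frontier:
        state, i, stack = frontier.popleft()
        sym = w[i] if i < len(w) else None
        top = stack[0] if stack else None
        for (s, a, pop), targets in DELTA.items():
            if s != state or (a is not None and a != sym) or (pop is not None and pop != top):
                continue
            for (t, push) in targets:
                rest = stack[1:] if pop is not None else stack
                nxt = (t, i + (1 if a is not None else 0), push + rest)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

def accepts(w):
    # Accepting-state model: some sequence of choices consumes all of w and ends in an accepting state.
    return any(s in ACCEPTING and i == len(w) for s, i, _ in reachable_configs(w))

print(accepts("abba"), accepts("abab"))   # True False
```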

Acceptance. Accepting State Model: an NPDA accepts w when some sequence of choices consumes all of w and ends in an accepting state. Empty Stack Model: an NPDA accepts w when some sequence of choices consumes all of w and ends with an empty stack. Is the set of languages accepted by NPDAs the same under each model?
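Reusing reachable_configs and ACCEPTING from the sketch above, the two models differ only in the final test; the helper names below are mine.

```python
def accepts_by_final_state(w):
    # Accepting State Model: all input consumed, control in an accepting state.
    return any(s in ACCEPTING and i == len(w) for s, i, _ in reachable_configs(w))

def accepts_by_empty_stack(w):
    # Empty Stack Model: all input consumed, stack empty; accepting states are irrelevant.
    return any(i == len(w) and stack == "" for _, i, stack in reachable_configs(w))
```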

L(NPDA/Empty Stack) ⊆ L(NPDA/Accepting)

[Slide diagram, partially recoverable: states q3, q8, ..., qx of the original NPDA, with ε, ε → ε edges into a new state labeled "q stack cleaner".]

L(NPDA/Accepting) ⊆ L(NPDA/Empty Stack)

[Slide diagram, partially recoverable: original states q0, qj, qk, ..., qz; a new start state q+ with an ε, ε → $ edge; and a new state qA reached by ε, $ → ε edges.]
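The diagram is only partly recoverable from the transcript, but the standard construction behind this containment can be sketched over the same transition encoding used above. This is the textbook idea, not necessarily the slide's exact figure; the state names q_plus and q_clean and the "#" marker are mine, and "#" is assumed not to be a stack symbol of the original machine.

```python
def to_empty_stack_machine(delta, accepting, start, marker="#"):
    """Given an NPDA accepting by final state, build one accepting by empty stack (a sketch).
    Encoding as above: (state, symbol or None, pop or None) -> set of (state, push string)."""
    new_start, cleaner = "q_plus", "q_clean"
    new_delta = {k: set(v) for k, v in delta.items()}

    # New start state: push a fresh bottom marker, then hand control to the original start state.
    # The marker stops the new machine from accepting just because the original stack happens to empty.
    new_delta[(new_start, None, None)] = {(start, marker)}

    # From every accepting state, an epsilon-move (ignoring the stack) into a "stack cleaner" state.
    for f in accepting:
        new_delta.setdefault((f, None, None), set()).add((cleaner, ""))

    # The cleaner pops every possible stack symbol, including the marker.
    stack_symbols = {marker} | {c for targets in delta.values() for _, push in targets for c in push}
    for x in stack_symbols:
        new_delta.setdefault((cleaner, None, x), set()).add((cleaner, ""))

    return new_delta, new_start
```

Plugged into a configuration-set simulator like the one above (with the start state and acceptance test swapped in), the new machine accepts by empty stack exactly the strings the original accepts by final state.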

Open (for us) Questions. L(DPDA/Accepting) =? L(DPDA/Empty Stack): why don’t the proofs for NPDAs work for DPDAs? Are NPDAs more powerful than DPDAs? (will answer next week) What languages cannot be recognized by an NPDA? (will answer next week) Instead of answering these now, we’ll introduce a different model and show it is equivalent to NPDAs.

Machine Models: an input string (ababbaabbaba) is fed to a machine, the "Kleene Machine", which answers Yes! [Image credit: flickr cc: markhillary]

Stephen Kleene (1909-1994). "Kleeneliness is next to Gödeliness"

Modeling Human Computers: the Turing Machine (Alan Turing, 1936). Modeling Human Intellect: Finite Automata. McCulloch and Pitts, A logical calculus of the ideas immanent in nervous activity, 1943; S. C. Kleene, Representation of Events in Nerve Nets and Finite Automata, 1956; Claude Shannon and John McCarthy, Automata Studies, 1956.

Our theoretical objective is not dependent on the assumptions fitting exactly. It is a familiar strategem of science, when faced with a body of data too complex to be mastered as a whole, to select some limited domain of experiences, some simple situations, and to undertake to construct a model to fit these at least approximately. Having set up such a model, the next step is to seek a thorough understanding of the model itself. S. C. Kleene, Representation of Events in Nerve Nets and Finite Automata, 1956

Noam Chomsky

"I don’t know anybody who’s ever read a Chomsky book. He does not write page turners, he writes page stoppers. There are a lot of bent pages in Noam Chomsky’s books, and they are usually at about page 16." (Alan Dershowitz)

“I must admit to taking a copy of Noam Chomsky’s Syntactic Structures along with me on my honeymoon in 1961. During odd moments, while crossing the Atlantic in an ocean liner and while camping in Europe, I read that book rather thoroughly and tried to answer some basic theoretical questions. Here was a marvelous thing: a mathematical theory of language in which I could use a computer programmer’s intuition! The mathematical, linguistic, and algorithmic parts of my life had previously been totally separate. During the ensuing years those three aspects became steadily more intertwined; and by the end of the 1960s I found myself a Professor of Computer Science at Stanford University, primarily because of work that I had done with respect to languages for computer programming.” (Donald Knuth)

Modeling Language: a Generative Grammar is a set of rules of the form match → replacement.
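In code, one generative-grammar step is exactly this match-and-replace on a sentential form. The sketch below is mine and is restricted to the context-free case where the match is a single nonterminal; it yields every form reachable in one step.

```python
def rewrite_once(form, productions):
    """Yield each sentential form obtainable by applying one production.
    `form` is a list of symbols; `productions` maps a nonterminal to its replacements,
    each replacement being a list of terminal/nonterminal symbols."""
    for i, symbol in enumerate(form):
        for replacement in productions.get(symbol, []):
            yield form[:i] + replacement + form[i + 1:]

print(list(rewrite_once(["S"], {"S": [["NP", "VP"]]})))   # [['NP', 'VP']]
```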

Modeling Language: S → NP VP; NP → N’; N’ → AdjP N’; AdjP → Adj’; Adj’ → Adj; N’ → N; VP → V’; V’ → V’ AdvP; V’ → V; AdvP → Adv’; Adv’ → Adv; N → ideas; V → sleep; Adj → Colorless; Adj → green; Adv → furiously

Generative Grammars.
Grammar 1: S → NP VP; NP → Adj N; VP → V Adv; Adj → Colorless; Adj → green; N → ideas; V → sleep; Adv → furiously. How many sentences can S produce?
Grammar 2: S’ → NP VP; NP → Adj NP; VP → V Adv; Adj → Colorless; Adj → green; N → ideas; V → sleep; Adv → furiously. How many sentences can S’ produce?
Grammar 3: S’’ → NP VP; NP → Adj NP; NP → N; VP → V Adv; Adj → Colorless; Adj → green; N → ideas; V → sleep; Adv → furiously. How many sentences can S’’ produce?
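To check the counting questions, here is a small Python sketch; the encoding is mine, and the grammar dictionaries paraphrase the three slides with S’ and S’’ both written as S for simplicity. It enumerates the terminal sentences each grammar can produce up to a length bound.

```python
from collections import deque

def sentences(productions, start="S", max_len=8):
    """Enumerate terminal sentences of at most max_len symbols, by leftmost expansion.
    Nonterminals are exactly the keys of `productions`; everything else is a terminal word.
    Pruning by length is sound here because no production has an empty right-hand side."""
    results, seen, frontier = set(), set(), deque([(start,)])
    while frontier:
        form = frontier.popleft()
        nonterminal_positions = [i for i, s in enumerate(form) if s in productions]
        if not nonterminal_positions:
            results.add(" ".join(form))
            continue
        i = nonterminal_positions[0]                     # expand the leftmost nonterminal
        for replacement in productions[form[i]]:
            new = form[:i] + tuple(replacement) + form[i + 1:]
            if len(new) <= max_len and new not in seen:
                seen.add(new)
                frontier.append(new)
    return results

BASE = {"S": [["NP", "VP"]], "VP": [["V", "Adv"]],
        "Adj": [["Colorless"], ["green"]], "N": [["ideas"]], "V": [["sleep"]], "Adv": [["furiously"]]}
G1 = dict(BASE, NP=[["Adj", "N"]])             # Grammar 1: NP -> Adj N
G2 = dict(BASE, NP=[["Adj", "NP"]])            # Grammar 2: NP -> Adj NP, no base case
G3 = dict(BASE, NP=[["Adj", "NP"], ["N"]])     # Grammar 3: NP -> Adj NP | N

print(len(sentences(G1)))   # 2
print(len(sentences(G2)))   # 0: the NP recursion never bottoms out
print(len(sentences(G3)))   # keeps growing as max_len grows: infinitely many sentences
```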

Recursion → Human? "We hypothesize that the faculty of language in the narrow sense (FLN) only includes recursion and is the only uniquely human component of the faculty of language. We further argue that FLN may have evolved for reasons other than language, hence comparative studies might look for evidence of such computations outside of the domain of communication (for example, number, navigation, and social relations)." Marc Hauser, Noam Chomsky, Tecumseh Fitch, The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?, Science, Nov 2002. Steven Pinker and Ray Jackendoff (2004): it’s not just recursion...

Kanzi and Sue Savage-Rumbaugh

Kanzi’s Language

Languages Modeling Computing. Machines: a string is in the language if the machine accepts that input string; the power of a machine type is defined by the set of languages it can recognize. Generative grammars: a string is in the language if the grammar can produce that string; the power of a grammar type is defined by the set of languages it can generate.

All Languages ⊇ NPDA Languages ⊇ DPDA Languages ⊇ Regular Languages (can be recognized by some DFA) ⊇ Finite Languages. Can we define types of grammars that correspond to each class?

Finite Languages. Machine: a simple lookup table, or a DFA with no cycles. Grammar: a grammar with no cycles, with rules of the form A → terminals.

Regular Languages (containing the Finite Languages). Machine: DFA. Grammar: Regular Grammar, with rules of the form A → aB or A → a. Hint: PS2, Problem 9.

L(DFA) ⊆ L(Regular Grammar)
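One direction of that containment is a mechanical construction: make each DFA state a nonterminal, add A → aB for each transition δ(A, a) = B, and A → a whenever the move lands in an accepting state. A minimal sketch in Python (my own encoding, not the PS2 solution; the example DFA at the bottom is an assumed one for strings ending in b):

```python
def dfa_to_regular_grammar(states, alphabet, delta, start, accepting):
    """Build a right-linear grammar from a DFA.
    delta: dict mapping (state, symbol) -> state. Each state doubles as a nonterminal.
    Productions are returned as (lhs, rhs) pairs with rhs a tuple of symbols."""
    productions = []
    for q in states:
        for a in alphabet:
            p = delta[(q, a)]
            productions.append((q, (a, p)))        # A -> aB for every transition
            if p in accepting:
                productions.append((q, (a,)))      # A -> a when the move reaches an accepting state
    if start in accepting:
        productions.append((start, ()))            # S -> epsilon, only needed if the DFA accepts ε
    return productions

# Assumed example: a DFA over {a, b} accepting strings that end in 'b'.
dfa = {("q0", "a"): "q0", ("q0", "b"): "q1", ("q1", "a"): "q0", ("q1", "b"): "q1"}
for lhs, rhs in dfa_to_regular_grammar({"q0", "q1"}, {"a", "b"}, dfa, "q0", {"q1"}):
    print(lhs, "->", " ".join(rhs) if rhs else "ε")
```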

Context-Free Grammars: rules of the form A → BCD. Match: one nonterminal. Replacement: any sequence of terminals and nonterminals. Can a CFG generate the language a^n b^n?
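Yes. A standard two-rule grammar (a textbook example, not taken from the slides) does it, shown here with a sample derivation of aabb:

```latex
S \rightarrow a\,S\,b \mid \varepsilon
\qquad\qquad
S \Rightarrow aSb \Rightarrow aaSbb \Rightarrow aabb
```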

{ w | w contains more a’s than b’s }
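A grammar for this one is trickier. One standard answer (a sketch worth checking against the intended solution, not from the slides) uses a helper variable T that generates strings with at least as many a’s as b’s:

```latex
S \rightarrow T\,a\,T
\qquad\qquad
T \rightarrow T\,T \mid a\,T\,b \mid b\,T\,a \mid a \mid \varepsilon
```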

Context-Free Languages (= NPDA Languages; containing the Regular Languages and Finite Languages). Machine: NPDA. Grammar: Context-Free Grammar, A → BCD; left side: one nonterminal, replacement: any sequence of terminals and nonterminals.

L(NPDA) = L(CFG). Detailed proof: Sipser, Section 2.2. First direction: L(NPDA) ⊆ L(CFG).

L(NPDA) = L(CFG). Detailed proof: Sipser, Section 2.2. Second direction: L(CFG) ⊆ L(NPDA).
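The L(CFG) ⊆ L(NPDA) direction has a short constructive core: the NPDA keeps the unread part of a sentential form on its stack, expanding variables and matching terminals. A hedged sketch in the same transition encoding as the earlier simulators; the state names, the "$" marker, and the restriction to single-character grammar symbols are mine, following the Sipser-style idea rather than the slide's exact presentation.

```python
def cfg_to_npda(productions, terminals, start_variable="S"):
    """Build NPDA transitions from a CFG whose symbols are single characters.
    `productions` maps a variable to a list of right-hand-side strings (possibly "").
    Assumes "$" is not a symbol of the grammar."""
    loop = "q_loop"
    delta = {
        ("q_start", None, None): {(loop, start_variable + "$")},  # push start variable over a marker
        (loop, None, "$"): {("q_accept", "")},                    # sentential form exhausted: accept
    }
    for variable, rhss in productions.items():
        # Expand: replace the variable on top of the stack by some right-hand side.
        delta.setdefault((loop, None, variable), set()).update((loop, rhs) for rhs in rhss)
    for t in terminals:
        # Match: a terminal on top of the stack must agree with the next input symbol.
        delta.setdefault((loop, t, t), set()).add((loop, ""))
    return delta, "q_start", {"q_accept"}

# Example: the grammar S -> aSb | ε for a^n b^n.
delta, start, accepting = cfg_to_npda({"S": ["aSb", ""]}, {"a", "b"})
```

For this example grammar, feeding the returned transitions to a configuration-set simulator like the one above (with the start state and accepting set swapped in) should accept exactly the strings a^n b^n.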

More Powerful Grammars. Context-Free Grammar (Context-Free Languages): A → BCD, A → a. Context-Sensitive Grammar: XAY → XBCDY. Unrestricted Grammar: XAY → BCD.
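As a concrete example of the extra power, here is a classic grammar for {a^n b^n c^n : n ≥ 1}, written in the equivalent monotone (noncontracting) formulation rather than the strict XAY → XBCDY form above; this is a standard textbook example, not taken from the slides:

```latex
S \rightarrow aSBC \mid aBC \qquad
CB \rightarrow BC \qquad
aB \rightarrow ab \qquad
bB \rightarrow bb \qquad
bC \rightarrow bc \qquad
cC \rightarrow cc
```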

Recap Questions: How can you prove a grammar is context-free? How can you prove a language is context-free? How can you prove a language is not context-free? How can you prove a grammar is regular? How can you prove a language is regular? How can you prove a language is not regular?

Charge: PS2 is due Tuesday. Human Languages: are they finite, regular, context-free, context-sensitive? (Is the human brain a DFA, PDA, etc., or something entirely different?) Next week: Non-Context-Free Languages; Parsing; Applications of Grammars.