Design patterns (?) for control abstraction
What do parsers, λ-calculus reducers, and Prolog interpreters have in common?



What's it all about?
If you've been anywhere near functional programmers during the last decade, you'll have heard a lot about parser combinators, monads, monadic parser combinators, domain-specific embedded languages (DSELs), and so on. There are many more details to these, but the common theme is libraries of control abstractions, built up from higher-order functions. We'll look at a few examples and useful ideas that are not quite as well known as they could be.

Control structures
Language designers and program verifiers are used to thinking in terms of program calculi, focussing on essential structure, e.g., basic operations and their units:
– sequential composition of actions / no action
– alternative composition of choices / no choice
– [parallel composition of processes / no process] [not for today]
Concrete languages come with their own complex, built-in control structures (historical designs):
– they can be mapped to and understood as combinations of basic operations, but have grown into fixed forms which may not be a good match for the problem at hand

User-defined control structures
Languages in which control structures are first-class objects (higher-order functions/procedures) make it easy to "roll your own" control structures:
– OOP: modelling of real-world objects
– FP: modelling of real-world control structures?
Design freedom needs guidance – try to identify:
– domain-specific control structures
– general-purpose control structures (sequence, alternative, parallel, recursion, ..)
Reversed mapping (purpose-built designs):
– build libraries of complex, domain-specific structures from basic, general-purpose control structures

"The" example: parser combinators
An old idea (e.g., Wadler 1985):
– assume an operation for each BNF construct (literals, sequential/alternative composition, ..)
– define what each construct does in terms of parsing
– translate your grammar into a program using these constructs (an almost literal translation)
⇒ you've got a parser for the grammar!
Philip Wadler, "How to Replace Failure by a List of Successes", FPCA '85, Springer LNCS 201

"Old-style" parser combinators

    type Parser v = String -> [(v,String)]

    lit x (x':xs) | x==x' = [(x,xs)]
    lit x _               = []

    empty v xs = [(v,xs)]
    fail xs    = []

    alt p q xs   = (p xs) ++ (q xs)
    seq f p q xs = [ (f v1 v2, xs2) | (v1,xs1) <- p xs, (v2,xs2) <- q xs1 ]

    rep p  = alt (seq (:) p (rep p)) (empty [])
    rep1 p = seq (:) p (rep p)

    alts ps = foldr alt fail ps
    seqs ps = foldr (seq (:)) (empty []) ps
    lits xs = seqs [ lit x | x <- xs ]
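A runnable version of these combinators (a sketch: `seq` is renamed to `pseq` and `empty` to `succeed` to avoid clashes with today's Prelude, and `fail` is simply inlined as `const []`):

```haskell
-- Minimal list-of-successes combinators, runnable as-is.
-- pseq/succeed stand in for the slide's seq/empty (Prelude clashes).
type Parser v = String -> [(v, String)]

lit :: Char -> Parser Char
lit x (x':xs) | x == x' = [(x, xs)]
lit _ _                 = []

succeed :: v -> Parser v
succeed v xs = [(v, xs)]

alt :: Parser v -> Parser v -> Parser v
alt p q xs = p xs ++ q xs

pseq :: (a -> b -> c) -> Parser a -> Parser b -> Parser c
pseq f p q xs = [ (f v1 v2, xs2) | (v1, xs1) <- p xs, (v2, xs2) <- q xs1 ]

rep :: Parser a -> Parser [a]           -- zero or more
rep p = alt (pseq (:) p (rep p)) (succeed [])

rep1 :: Parser a -> Parser [a]          -- one or more
rep1 p = pseq (:) p (rep p)

digit :: Parser Char
digit = foldr (alt . lit) (const []) ['0'..'9']

number :: Parser String
number = rep1 digit
```

Each parser returns *all* ways to split the input, longest match first: `number "12ab"` yields `[("12","ab"),("1","2ab")]`.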

"The" example, continued
A grammar/parser for arithmetic expressions:

    expr   = alts [number, seqs [lits "(", expr, op, expr, lits ")"]]
    number = rep1 digit
    op     = alts [lits "+", lits "-", lits "*", lits "/"]
    digit  = alts [lits (show n) | n <- [0..9]]

Useful observations:
– only the literals really "do" any parsing – the combinators form a coordination layer on top of that, organising the application of literal parsers to the input
– the parsing is domain-specific, the coordination is not
Modern variants tend to use monads for the coordination layer (e.g., Hutton and Meijer 1996)

From Hugs' ParseLib.hs

    newtype Parser a = P { papply :: String -> [(a,String)] }

    instance Monad Parser where
      -- return :: a -> Parser a
      return v = P (\inp -> [(v,inp)])
      -- (>>=) :: Parser a -> (a -> Parser b) -> Parser b
      (P p) >>= f = P (\inp -> concat [ papply (f v) out | (v,out) <- p inp ])

    instance MonadPlus Parser where
      -- mzero :: Parser a
      mzero = P (\inp -> [])
      -- mplus :: Parser a -> Parser a -> Parser a
      (P p) `mplus` (P q) = P (\inp -> p inp ++ q inp)

From Hugs' ParseLib.hs

    item :: Parser Char
    item = P (\inp -> case inp of
                        []     -> []
                        (x:xs) -> [(x,xs)])

    sat :: (Char -> Bool) -> Parser Char
    sat p = do { x <- item; if p x then return x else mzero }

    bracket :: Parser a -> Parser b -> Parser c -> Parser b
    bracket open p close = do { open; x <- p; close; return x }

    … char, digit, letter, .., many, many1, sepby, sepby1, …

Item needs to inspect inp!
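On a current GHC the same construction needs `Functor` and `Applicative` superclass instances spelled out (and `Alternative` plays the role of `MonadPlus`); a self-contained sketch of the same parser type:

```haskell
import Control.Applicative
import Data.Char (isDigit)

newtype Parser a = P { papply :: String -> [(a, String)] }

instance Functor Parser where
  fmap f (P p) = P $ \inp -> [ (f v, out) | (v, out) <- p inp ]

instance Applicative Parser where
  pure v = P $ \inp -> [(v, inp)]
  P pf <*> P pa =
    P $ \inp -> [ (f a, out') | (f, out) <- pf inp, (a, out') <- pa out ]

instance Monad Parser where
  P p >>= f = P $ \inp -> concat [ papply (f v) out | (v, out) <- p inp ]

instance Alternative Parser where   -- mzero/mplus in modern clothes
  empty = P $ \_ -> []
  P p <|> P q = P $ \inp -> p inp ++ q inp

item :: Parser Char
item = P $ \inp -> case inp of
                     []     -> []
                     (x:xs) -> [(x, xs)]

sat :: (Char -> Bool) -> Parser Char
sat p = do { x <- item; if p x then return x else empty }

char :: Char -> Parser Char
char = sat . (==)

bracket :: Parser a -> Parser b -> Parser c -> Parser b
bracket open p close = do { _ <- open; x <- p; _ <- close; return x }
```

`many` and `many1` (now `some`) come for free from `Alternative`: `papply (bracket (char '(') (many (sat isDigit)) (char ')')) "(12)x"` gives `[("12","x")]`.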

Grammar combinators?
We can write the coordination layer to be independent of the particular task (parsing)
– then we can plug in different basic actions instead of literal parsers, to get different grammar-like programs
Instead of just parser combinators, we get a general form of control combinators, applicable to all tasks with grammar-like specifications
– obvious examples: generating language strings, unparsing (from AST to text), pretty-printing, …
– less obvious: syntax-directed editing, typing (?), reduction strategies (think contexts and context-sensitive rules), automated reasoning strategies, …

That monad thing.. (I)
Parsers transform Strings to produce ASTs, unparsers transform ASTs to produce Strings, editors and reducers transform ASTs, ..
– generalise to state transformers
Combinators for sequence, alternative, etc. are so common that we will use them often
– make them so general that one set of definitions works for all applications? ⇒ one size fits all?
– overload one set of combinators with application-specific definitions? ⇒ do the variants still have anything in common?
– a mixture of both: capture the commonalities, enable specialisation (a framework). Monad, MonadPlus

That monad thing.. (II)
Type constructors:
– data [a] = [] | (a : [a])
– data Maybe a = Nothing | Just a
– newtype ST m s a = ST (s -> m (a,s))
Type constructor classes: for a type constructor m,
– an instance of Monad m defines sequential composition ( >>= ) and its unit ( return )
– an instance of MonadPlus m defines alternative composition ( mplus ) and its unit ( mzero )
(over things of type m a)

Monad

    class Monad m where
      return :: a -> m a
      (>>=)  :: m a -> (a -> m b) -> m b
      (>>)   :: m a -> m b -> m b
      fail   :: String -> m a
      -- Minimal complete definition: (>>=), return
      p >> q = p >>= \_ -> q
      fail s = error s

    -- some instances of Monad
    instance Monad Maybe where
      Just x  >>= k = k x
      Nothing >>= k = Nothing
      return        = Just
      fail s        = Nothing

    instance Monad [] where
      (x:xs) >>= f = f x ++ (xs >>= f)
      []     >>= f = []
      return x     = [x]
      fail s       = []
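For concreteness, the two instances sequence computations quite differently: Maybe threads possible failure (the first Nothing aborts the whole chain), while the list instance threads multiple results. A small sketch (`safeDiv`, `quotient`, and `pairs` are hypothetical names for illustration):

```haskell
-- Maybe: sequencing computations that may fail.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- (a / b) / c, aborting as soon as any division fails.
quotient :: Int -> Int -> Int -> Maybe Int
quotient a b c = safeDiv a b >>= \q -> safeDiv q c

-- []: sequencing computations with many results (all combinations).
pairs :: [(Int, Int)]
pairs = [1,2] >>= \x -> [10,20] >>= \y -> return (x, y)
```

Here `quotient 100 5 2` is `Just 10`, `quotient 1 0 2` is `Nothing`, and `pairs` is `[(1,10),(1,20),(2,10),(2,20)]`.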

MonadPlus

    class Monad m => MonadPlus m where
      mzero :: m a
      mplus :: m a -> m a -> m a

    -- some instances of MonadPlus
    instance MonadPlus Maybe where
      mzero              = Nothing
      Nothing `mplus` ys = ys
      xs      `mplus` ys = xs

    instance MonadPlus [] where
      mzero = []
      mplus = (++)
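These instances give a uniform notion of choice: "first alternative that works" for Maybe versus "all alternatives" for lists. The same generic fold works for both (sketched here as `orElse`, a hypothetical name; the standard library calls it `msum`):

```haskell
import Control.Monad (MonadPlus, mplus, mzero)

-- Try alternatives left to right; the MonadPlus instance decides what
-- that means: first success for Maybe, concatenation for lists.
orElse :: MonadPlus m => [m a] -> m a
orElse = foldr mplus mzero
```

So `orElse [Nothing, Just 2, Just 3]` is `Just 2`, while `orElse [[1],[2,3]]` is `[1,2,3]`.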

State transformer monad

    newtype ST m s a = ST { unST :: s -> m (a,s) }

    instance Monad m => Monad (ST m s) where
      -- return :: a -> ST m s a
      return v = ST (\inp -> return (v,inp))
      -- (>>=) :: ST m s a -> (a -> ST m s b) -> ST m s b
      (ST p) >>= f = ST (\inp -> do { (v,out) <- p inp
                                    ; unST (f v) out })

    instance MonadPlus m => MonadPlus (ST m s) where
      -- mzero :: ST m s a
      mzero = ST (\inp -> mzero)
      -- mplus :: ST m s a -> ST m s a -> ST m s a
      (ST p) `mplus` (ST q) = ST (\inp -> p inp `mplus` q inp)
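Under today's class hierarchy the same construction needs Functor/Applicative (and Alternative) instances written out. A runnable sketch, with a hypothetical `item'` parser to show that with `m = []` and `s = String` this really is the parser monad:

```haskell
import Control.Applicative

newtype ST m s a = ST { unST :: s -> m (a, s) }

instance Monad m => Functor (ST m s) where
  fmap f (ST p) = ST $ \s -> do { (v, s') <- p s; return (f v, s') }

instance Monad m => Applicative (ST m s) where
  pure v = ST $ \s -> return (v, s)
  ST pf <*> ST pa =
    ST $ \s -> do { (f, s1) <- pf s; (v, s2) <- pa s1; return (f v, s2) }

instance Monad m => Monad (ST m s) where
  ST p >>= f = ST $ \s -> do { (v, s') <- p s; unST (f v) s' }

instance (Monad m, Alternative m) => Alternative (ST m s) where
  empty = ST $ \_ -> empty                  -- mzero
  ST p <|> ST q = ST $ \s -> p s <|> q s    -- mplus

-- With m = [] and s = String, this is exactly the parser monad:
item' :: ST [] String Char
item' = ST $ \s -> case s of
                     []     -> []
                     (c:cs) -> [(c, cs)]
```

For example, `unST (item' >>= \a -> item' >>= \b -> return [a,b]) "abc"` gives `[("ab","c")]`.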

Parsing, again

    -- newtype ST m s a = ST { unST :: s -> m (a,s) }
    type Parser a = ST [] String a   -- combinators for free

    -- basic parsers still needed
    litP :: (Char -> Bool) -> Parser Char
    litP p = ST (\inp -> case dropWhile isSpace inp of
                           (x:xs) | p x -> [(x,xs)]
                           _            -> [])

    lit :: Char -> Parser Char
    lit c = litP (==c)

    -- as well as auxiliary combinations…

AST/Parser/Grammar for the λ-calculus

    data Exp = Var String | App Exp Exp | Lam String Exp
      deriving Show

    exp = var `mplus` app `mplus` abs

    var = do { v <- litP isAlpha
             ; return $ Var [v] }

    app = do { lit '('
             ; e1 <- exp
             ; e2 <- exp
             ; lit ')'
             ; return $ App e1 e2 }

    abs = do { lit '\\'
             ; Var v <- var
             ; lit '.'
             ; e <- exp
             ; return $ Lam v e }

What about semantics/reduction?
λ-calculus reduction semantics:
– (λv.M) N  →β  M[v := N]
  {context-free reduction; meant to be valid in all contexts}
refined by a reduction strategy (limited contexts):
– Cnor[ (λv.M) N ]  →β,nor  Cnor[ M[v := N] ]
  {context-sensitive, normal-order reduction}
– Cnor[] ::= [] | (Cnor[] e)
  {reduction contexts; expressions with a hole []}

Translation to Haskell
β-reduction in Haskell:

    beta (App (Lam v m) n) = return $ substitute v n m
    beta _                 = fail "not a redex"

Normal-order reduction strategy in Haskell:

    norStep e@(App m n) = beta e `mplus`
                          (norStep m >>= \m' -> return (App m' n))
    norStep _           = fail "not an application"

    nor e = (norStep e >>= nor) `mplus` return e
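Filling in the pieces the slide leaves open makes this runnable. A sketch with Maybe as the backtracking monad and a *naive* `substitute` (it ignores variable capture, so it is only safe on examples where no capture can occur):

```haskell
import Control.Monad (mplus)

data Exp = Var String | App Exp Exp | Lam String Exp
  deriving (Eq, Show)

-- Naive substitution: does NOT avoid variable capture; a real
-- implementation must rename binders.
substitute :: String -> Exp -> Exp -> Exp
substitute v n (Var w)   | w == v    = n
                         | otherwise = Var w
substitute v n (App a b) = App (substitute v n a) (substitute v n b)
substitute v n (Lam w b) | w == v    = Lam w b
                         | otherwise = Lam w (substitute v n b)

-- One beta step, Nothing if the expression is not a redex.
beta :: Exp -> Maybe Exp
beta (App (Lam v m) n) = return (substitute v n m)
beta _                 = Nothing

-- One normal-order step: reduce here, or in the function position.
norStep :: Exp -> Maybe Exp
norStep e@(App m n) = beta e `mplus` fmap (\m' -> App m' n) (norStep m)
norStep _           = Nothing

-- Iterate to a normal form (never reduces under a lambda).
nor :: Exp -> Maybe Exp
nor e = (norStep e >>= nor) `mplus` return e
```

For instance, `nor (App (App (Lam "x" (Lam "y" (Var "x"))) (Var "a")) (Var "b"))` evaluates `(λx.λy.x) a b` to `Just (Var "a")`.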

And now, for something completely... different? Embedding Prolog in Haskell

Prolog, by example
Programming in predicate logic:
– define predicates via facts and rules
– find solutions to queries about your "knowledge base":

    app([],Y,Y).
    app([X|XS],Y,[X|ZS]) :- app(XS,Y,ZS).

    ?- app(X,Y,[1,2]).
    X=[], Y=[1,2] ;
    X=[1], Y=[2] ;
    X=[1,2], Y=[] ;
    no

Prolog, by example
Where's the logic?
assume
    (∀Y: app([],Y,Y) ⇐ true)
  ∧ (∀X,XS,Y,ZS: app([X|XS],Y,[X|ZS]) ⇐ app(XS,Y,ZS))
then prove
    ∃X,Y: app(X,Y,[1,2])
with solutions
    X=[] ∧ Y=[1,2]
    X=[1] ∧ Y=[2]
    X=[1,2] ∧ Y=[]

Prolog, de-sugared
Closed-world assumption and de-sugaring:

    ∀A,B,C: app(A,B,C) ⇔ (A=[] ∧ B=C)
                        ∨ ∃X,XS,ZS: (A=[X|XS] ∧ C=[X|ZS] ∧ app(XS,B,ZS))

We need equivalence rather than implication, as well as explicit unification and existential quantification, but now we're ready to go.

Prolog, embedded in Haskell
The embedding takes little more than a page of code (mostly, unification). The rest is our good old friends, the state transformer monads:
– ∧ : sequential composition; true : return ()
– ∨ : alternative composition; false : mzero
– ⇐ : = (function definition)
– predicates: substitution transformers
– unification: explicit code
– ∃v : fresh variables

Prolog, embedded in Haskell

    app a b c = (do { a === Nil ; b === c })
            +++ (exists "" $ \x ->
                 exists "" $ \xs ->
                 exists "" $ \zs ->
                 do { a === (x:::xs)
                    ; c === (x:::zs)
                    ; app xs b zs })

    x2 = exists "x" $ \x ->
         exists "y" $ \y ->
         app x y (Atom "1" ::: Atom "2" ::: Nil)

    Prolog> solve x2
    y_1=1:::2:::[] x_0=[]
    y_1=2:::[] x_0=1:::[]
    y_1=[] x_0=1:::2:::[]

    -- imports
    data Term  = Var String | Atom String | Nil | Term ::: Term
    type Subst = [(String,Term)]
    .. showSubst s = .. simplify s = ..

    data State = State { subst :: Subst, free :: Integer }
    type Predicate = ST [] State

    fresh :: String -> Predicate Term
    fresh n = ST $ \s -> return ( Var (n ++ "_" ++ show (free s))
                                , s { free = free s + 1 } )

    -- unification: substitution transformer
    (===) :: Term -> Term -> Predicate ()

    true, false :: Predicate ()
    true  = return ()
    false = mzero

    exists :: String -> (Term -> Predicate a) -> Predicate a
    exists n p = do { v <- fresh n ; p v }

    solve x = mapM_ (putStrLn . showSubst)
                    [ subst $ simplify s | (_,s) <- unST x (State [] 0) ]
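The elided unification can be sketched quite compactly. The following self-contained miniature (all names hypothetical; no occurs check; fresh variables threaded as a counter in the state, and bare functions instead of the ST newtype) behaves like the embedding above on the `app` example:

```haskell
-- Predicates transform (substitution, fresh-name counter) pairs,
-- with list-based backtracking for multiple solutions.
data Term = Var String | Atom String | Nil | Term ::: Term
  deriving (Eq, Show)
infixr 5 :::

type Subst = [(String, Term)]
type Pred  = (Subst, Int) -> [(Subst, Int)]

infix  4 ===
infixr 3 &&&
infixr 2 |||

-- Follow variable bindings down to a non-variable (or unbound var).
walk :: Subst -> Term -> Term
walk s (Var v) = maybe (Var v) (walk s) (lookup v s)
walk _ t       = t

-- Unification as a substitution transformer (no occurs check).
(===) :: Term -> Term -> Pred
(a === b) (s, n) = case (walk s a, walk s b) of
  (Var v, Var w) | v == w  -> [(s, n)]
  (Var v, t)               -> [((v, t) : s, n)]
  (t, Var v)               -> [((v, t) : s, n)]
  (Atom x, Atom y) | x == y -> [(s, n)]
  (Nil, Nil)               -> [(s, n)]
  (x ::: xs, y ::: ys)     -> (x === y) (s, n) >>= (xs === ys)
  _                        -> []

(&&&), (|||) :: Pred -> Pred -> Pred   -- conjunction, disjunction
(p &&& q) st = p st >>= q
(p ||| q) st = p st ++ q st

-- Existential quantification: allocate a fresh variable.
exists :: (Term -> Pred) -> Pred
exists f (s, n) = f (Var ("_" ++ show n)) (s, n + 1)

-- app(A,B,C), exactly as de-sugared on the earlier slide.
app :: Term -> Term -> Term -> Pred
app a b c = (a === Nil &&& b === c)
        ||| exists (\x -> exists (\xs -> exists (\zs ->
              a === (x ::: xs) &&& c === (x ::: zs) &&& app xs b zs)))
```

Running the query `exists (\x -> exists (\y -> app x y (Atom "1" ::: Atom "2" ::: Nil))) ([], 0)` yields three solution states, matching the three ways to split `[1,2]`.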

    -- examples
    app a b c = (do { a === Nil ; b === c })
            +++ (exists "" $ \x ->
                 exists "" $ \xs ->
                 exists "" $ \ys ->
                 do { a === (x:::xs)
                    ; c === (x:::ys)
                    ; app xs b ys })

    x0 = exists "x" $ \x ->
         app (Atom "1" ::: x) Nil (Atom "1" ::: Atom "2" ::: Nil)

    x1 = exists "z" $ \z ->
         app (Atom "1" ::: Atom "2" ::: Nil) Nil z

    x2 = exists "x" $ \x -> exists "y" $ \y ->
         app x y (Atom "1" ::: Atom "2" ::: Nil)

Summary
Parser combinators are only one of many examples of a programming pattern with a grammar-like coordination language (Wadler '85 already suggested tacticals as another example; there has been some recent work on rewriting strategy combinators, e.g., for compiler optimisations and other program transformations).
Monad, MonadPlus, state transformers, and do-notation facilitate reuse in this pattern.
Both transformer/container combinations and plain containers (Maybe, [], trees, ..) fit the pattern.
Coordination and computation can be defined separately to enhance modularity.