Languages and Compilers (SProg og Oversættere)

Languages and Compilers (SProg og Oversættere) Bent Thomsen, Department of Computer Science, Aalborg University. With acknowledgement to Norm Hutchinson, on whose slides this lecture is based.

Curricula (Studieordning)

The purpose of the course is for the student to gain knowledge of important principles in programming languages and an understanding of techniques for describing and compiling programming languages.

What was this course about?

- Programming language design
  - Concepts and paradigms
  - Ideas and philosophy
  - Syntax and semantics
- Compiler construction
  - Tools and techniques
  - Implementations
  - The nuts and bolts

The principal paradigms

- Imperative programming (C)
- Object-oriented programming (C++)
- Logic/declarative programming (Prolog)
- Functional/applicative programming (Lisp)

New paradigms?
- Agent-oriented programming
- Business-process-oriented (web computing)
- Grid-oriented

Criteria in a good language design

- Writability: the quality of a language that enables a programmer to express a computation clearly, correctly, concisely, and quickly.
- Readability: the quality of a language that enables a programmer to understand and comprehend the nature of a computation easily and accurately.
- Orthogonality: the quality of a language whose features carry as few restrictions as possible and can be combined in any meaningful way.
- Reliability: the quality of a language that assures a program will not behave in unexpected or disastrous ways during execution.
- Maintainability: the quality of a language that makes it easy to find and correct errors and to add new features.

Criteria (Continued)

- Generality: the quality of a language that avoids special cases in the availability or use of constructs and combines closely related constructs into a single, more general one.
- Uniformity: the quality of a language in which similar features look similar and behave similarly.
- Extensibility: the quality of a language that provides a general mechanism for the user to add new constructs.
- Standardability: the quality of a language that allows programs to be transported from one computer to another without significant change.
- Implementability: the quality of a language for which a translator or interpreter can be written; this is a matter of the complexity of the language definition.

Important!

- Syntax is the visible part of a programming language. Programming language designers can waste a lot of time discussing unimportant details of syntax.
- The language paradigm is the next most visible part. The choice of paradigm, and therefore language, depends on how humans best think about the problem. There are no right models of computation, just different models, some more suited to certain classes of problems than others.
- The most invisible part is the language semantics. Clear semantics usually leads to simple and efficient implementations.

Levels of Programming Languages

High-level program:

    class Triangle {
      ...
      float surface() { return b*h/2; }
    }

Low-level program:

    LOAD  r1,b
    LOAD  r2,h
    MUL   r1,r2
    DIV   r1,#2
    RET

Executable machine code:

    0001001001000101001001001110110010101101001...

Terminology

Q: Which programming languages play a role in this picture?

    input --> Translator --> output

- The source program (the input) is expressed in the source language.
- The object program (the output) is expressed in the target language.
- The translator itself is expressed in the implementation language.

A: All of them!

Tombstone Diagrams

What are they?
- Diagrams consisting of a set of "puzzle pieces" we can use to reason about language processors and programs
- Different kinds of pieces
- Combination rules (not all diagrams are "well formed")

The basic pieces:
- a program P implemented in language L
- a translator from S to T, implemented in L
- a machine M, implemented in hardware
- an interpreter for language L, implemented in M

Syntax Specification

Syntax is specified using context-free grammars (CFGs):
- a finite set of terminal symbols
- a finite set of non-terminal symbols
- a start symbol
- a finite set of production rules

Usually CFGs are written in Backus Naur Form (BNF) notation. A production rule in BNF notation is written as

    N ::= a

where N is a non-terminal and a is a sequence of terminals and non-terminals.

    N ::= a | b | ...

is an abbreviation for several rules with N as the left-hand side.
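For example, a few productions in this notation (a hypothetical fragment in the style of the course's Mini Triangle grammar, not the full grammar) might read:

    single-Command ::= V-name := Expression
                    |  if Expression then single-Command
                         else single-Command
                    |  while Expression do single-Command

    Expression ::= primary-Expression
                |  Expression Operator primary-Expression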

Concrete and Abstract Syntax

The previous grammar specified the concrete syntax of Mini Triangle. The concrete syntax is important for the programmer, who needs to know exactly how to write syntactically well-formed programs. The abstract syntax omits irrelevant syntactic details and only specifies the essential structure of programs.

Example: different concrete syntaxes for an assignment:

    v := e
    (set! v e)
    e -> v
    v = e

Abstract Syntax Trees

Abstract syntax tree for d := d+10*n:

    AssignmentCmd
    ├─ SimpleVName (Ident: d)
    └─ BinaryExpression (Op: +)
       ├─ VNameExp
       │    SimpleVName (Ident: d)
       └─ BinaryExpression (Op: *)
          ├─ IntegerExp (Int-Lit: 10)
          └─ VNameExp
               SimpleVName (Ident: n)

Contextual Constraints

Syntax rules alone are not enough to specify the format of well-formed programs.

Example 1 (scope rules):

    let const m~2
    in m + x

x is undefined!

Example 2 (type rules):

    let const m~2;
        var n: Boolean
    in begin
      n := m<4;
      n := n+1
    end

Type error: n+1 adds an integer to a Boolean!

Semantics

Specification of semantics is concerned with specifying the "meaning" of well-formed programs.

Terminology:
- Expressions are evaluated and yield values (and may or may not perform side effects)
- Commands are executed and perform side effects
- Declarations are elaborated to produce bindings

Side effects:
- change the values of variables
- perform input/output

Phases of a Compiler

A compiler's phases are steps in transforming source code into object code. The different phases correspond roughly to the different parts of the language specification:

- Syntax analysis <-> syntax
- Contextual analysis <-> contextual constraints
- Code generation <-> semantics

The “Phases” of a Compiler

    Source Program
         |
    Syntax Analysis -----------> Error Reports
         |
    Abstract Syntax Tree
         |
    Contextual Analysis -------> Error Reports
         |
    Decorated Abstract Syntax Tree
         |
    Code Generation
         |
    Object Code

Compiler Passes

A pass is a complete traversal of the source program, or a complete traversal of some internal representation of the source program. A pass can correspond to a "phase", but it does not have to! Sometimes a single "pass" corresponds to several phases that are interleaved in time. What and how many passes a compiler does over the source program is an important design decision.

Single Pass Compiler

A single pass compiler makes a single pass over the source text, parsing, analyzing and generating code all at once. Dependency diagram of a typical single pass compiler:

    Compiler Driver
          |
        calls
          |
    Syntactic Analyzer
       /          \
     calls       calls
     /              \
    Contextual      Code
    Analyzer        Generator

Multi Pass Compiler

A multi pass compiler makes several passes over the program. The output of a preceding phase is stored in a data structure and used by subsequent phases. Dependency diagram of a typical multi pass compiler, where the Compiler Driver calls, in turn:

    1. Syntactic Analyzer:  input Source Text,   output AST
    2. Contextual Analyzer: input AST,           output Decorated AST
    3. Code Generator:      input Decorated AST, output Object Code

Syntax Analysis

Dataflow chart:

    Source Program (stream of characters)
         |
      Scanner ----------> Error Reports
         |
    Stream of "Tokens"
         |
      Parser -----------> Error Reports
         |
    Abstract Syntax Tree

Regular Expressions

REs are a notation for expressing a set of strings of terminal symbols. The different kinds of RE:

    e      The empty string
    t      Generates only the string t
    X Y    Generates any string xy such that x is generated by X
           and y is generated by Y
    X | Y  Generates any string generated either by X or by Y
    X*     The concatenation of zero or more strings generated by X
    (X)    For grouping

FAs and the implementation of Scanners

- Regular expressions, NDFA-e (NFAs with epsilon transitions), NDFAs and DFAs are all equivalent formalisms in terms of what languages can be defined with them.
- Regular expressions are a convenient notation for describing the "tokens" of programming languages.
- Regular expressions can be converted into FAs (the algorithm for conversion into an NDFA-e is straightforward).
- DFAs can easily be implemented as computer programs (see the sketch below).
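As an illustration of that last point, here is a minimal hand-coded scanner sketch in Java, with each token class recognized by running a small DFA character by character. The token classes (identifiers, integer literals) and the class name are invented for the sketch; this is not the course's actual scanner:

    public class SimpleScanner {
        private final String input;
        private int pos = 0;

        public SimpleScanner(String input) { this.input = input; }

        private char peek() {
            return pos < input.length() ? input.charAt(pos) : '\0';
        }

        // Returns the spelling of the next token, or null at end of input.
        // Identifiers match Letter (Letter | Digit)*, integer literals
        // match Digit Digit*; anything else is a single-character token.
        public String nextToken() {
            while (Character.isWhitespace(peek())) pos++;   // skip separators
            StringBuilder sb = new StringBuilder();
            if (Character.isLetter(peek())) {               // identifier DFA
                while (Character.isLetterOrDigit(peek())) sb.append(input.charAt(pos++));
            } else if (Character.isDigit(peek())) {         // integer-literal DFA
                while (Character.isDigit(peek())) sb.append(input.charAt(pos++));
            } else if (peek() != '\0') {                    // operators etc.
                sb.append(input.charAt(pos++));
            }
            return sb.length() == 0 ? null : sb.toString();
        }
    }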

Parsing

Parsing == recognition + determining phrase structure (for example by generating an AST)

Different types of parsing strategies:
- bottom up
- top down

Recursive descent parsing:
- what it is
- how to implement one given an EBNF specification

Bottom up parsing algorithms

Top-down parsing

[Figure: top-down parse of the sentence "The cat sees a rat."; starting from the start symbol, Sentence is expanded to Subject Verb Object ., Subject to The Noun, and so on, until the terminal symbols of the sentence are derived.]

Bottom up parsing

[Figure: bottom-up parse of the same sentence "The cat sees a rat."; the terminal symbols are reduced step by step to Noun, Subject, Verb and Object, and finally to Sentence.]

Top-Down vs Bottom-Up parsing

- LL analysis (top-down): look-ahead + derivation
- LR analysis (bottom-up): look-ahead + reduction

Development of a Recursive Descent Parser

(1) Express the grammar in EBNF
(2) Grammar transformations: left factorization and left recursion elimination
(3) Create a parser class with
    - a private variable currentToken
    - methods to call the scanner: accept and acceptIt
(4) Implement private parsing methods:
    - a private parseN method for each non-terminal N
    - a public parse method that gets the first token from the scanner and calls parseS (S is the start symbol of the grammar)

A sketch of steps (3) and (4) follows below.
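A minimal sketch in Java, for the toy production single-Command ::= Identifier := Expression. The Token and Lexer types are invented here for the sketch; the course's actual classes differ:

    class Token {
        static final int IDENTIFIER = 0, BECOMES = 1, INTLIT = 2, EOF = 3;
        final int kind;
        final String spelling;
        Token(int kind, String spelling) { this.kind = kind; this.spelling = spelling; }
    }

    interface Lexer { Token scan(); }

    public class Parser {
        private Token currentToken;            // step (3): private variable
        private final Lexer lexer;

        public Parser(Lexer lexer) { this.lexer = lexer; }

        // accept: check the expected kind, then advance to the next token.
        private void accept(int expectedKind) {
            if (currentToken.kind != expectedKind)
                throw new RuntimeException("syntax error at " + currentToken.spelling);
            currentToken = lexer.scan();
        }

        // acceptIt: advance unconditionally (kind already checked by caller).
        private void acceptIt() { currentToken = lexer.scan(); }

        // Step (4): one parseN method per non-terminal N.
        // single-Command ::= Identifier := Expression
        private void parseSingleCommand() {
            accept(Token.IDENTIFIER);
            accept(Token.BECOMES);
            parseExpression();
        }

        // Expression ::= IntLit | Identifier   (deliberately tiny)
        private void parseExpression() {
            if (currentToken.kind == Token.INTLIT || currentToken.kind == Token.IDENTIFIER)
                acceptIt();
            else
                throw new RuntimeException("syntax error in expression");
        }

        // Public parse method: fetch the first token, parse the start symbol.
        public void parse() {
            currentToken = lexer.scan();
            parseSingleCommand();
            accept(Token.EOF);
        }
    }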

LL(1) Grammars

The presented algorithm to convert EBNF into a parser does not work for all possible grammars. It only works for so-called "LL(1)" grammars. Basically, an LL(1) grammar is a grammar which can be parsed with a top-down parser with a lookahead (in the input stream of tokens) of one token.

What grammars are LL(1)? How can we recognize that a grammar is (or is not) LL(1)?
- We can deduce the necessary conditions from the parser generation algorithm.
- We can use a formal definition.
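For instance, the classic left-recursive expression grammar is not LL(1): with one token of lookahead a top-down parser cannot decide how many times to expand the recursion. Left recursion elimination (step 2 above) rewrites it into an equivalent EBNF form that a recursive descent parser handles directly (a standard textbook example, not taken from the slides):

    Expression ::= Expression + Term | Term      (left recursive: not LL(1))

    Expression ::= Term (+ Term)*                (after left recursion elimination)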

Converting EBNF into RD parsers

The conversion of an EBNF specification into a Java implementation for a recursive descent parser is so "mechanical" that it can easily be automated! => JavaCC, the "Java Compiler Compiler".

JavaCC and JJTree

LR parsing

The algorithm makes use of a stack.
- The first item on the stack is the initial state of a DFA.
- A state of the automaton is a set of LR0/LR1 items.
- The initial state is constructed from productions of the form S ::= • a [, $] (where S is the start symbol of the CFG).
- The stack contains, in alternating order:
  - a DFA state
  - a terminal symbol or part (subtree) of the parse tree being constructed
- The items on the stack are related by transitions of the DFA.

There are two basic actions in the algorithm:
- shift: get the next input token
- reduce: build a new node (remove its children from the stack)

Bottom Up Parsers: Overview of Algorithms

- LR0: the simplest algorithm; theoretically important but rather weak (not practical)
- SLR: an improved version of LR0; more practical but still rather weak
- LR1: the LR0 algorithm with an extra lookahead token; a very powerful algorithm, but not often used because of its large memory requirements (very big parsing tables)
- LALR: a "watered down" version of LR1; still very powerful, but with much smaller parsing tables; the most commonly used algorithm today

JavaCUP: A LALR generator for Java

    Definition of tokens               Grammar
    (regular expressions)              (BNF-like specification)
            |                                |
          JFlex                          JavaCUP
            |                                |
    Java file: Scanner class       Java file: Parser class
    (recognizes tokens)            (uses the Scanner to get tokens,
                                    parses the stream of tokens)

Together the two generated classes form the syntactic analyzer.

Steps to build a compiler with SableCC

1. Create a SableCC specification file
2. Call SableCC
3. Create one or more working classes, possibly inherited from classes generated by SableCC
4. Create a Main class activating lexer, parser and working classes
5. Compile with javac

Contextual Analysis Phase

Purposes:
- Finish syntax analysis by deriving context-sensitive information
- Associate semantic routines with individual productions of the context free grammar or subtrees of the AST
- Start to interpret the meaning of the program based on its syntactic structure
- Prepare for the final stage of compilation: code generation

Contextual Analysis -> Decorated AST

[Figure: the AST of the example program decorated with annotations. Applied identifier occurrences carry the result of identification (a link to their declaration), and expression nodes carry the result of type checking, e.g. :int and :char annotations on the VarDecls for n: Integer and c: Char and on the expressions in c := '&' and n := n+1.]

Nested Block Structure

A language exhibits nested block structure if blocks may be nested one within another (typically with no upper bound on the level of nesting that is allowed). There can be any number of scope levels, depending on the level of nesting of blocks.

Typical scope rules:
- no identifier may be declared more than once within the same block (at the same level)
- for any applied occurrence there must be a corresponding declaration, either within the same block or in a block in which it is nested

A sketch of an identification table implementing these rules follows below.
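These rules are commonly implemented with an identification table organized as a stack of scopes. A minimal generic sketch in Java (the class name and API are invented for illustration, not the course's actual identification table):

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;

    // openScope/closeScope track block nesting; enter refuses a second
    // declaration at the same level; retrieve searches from the innermost
    // scope outwards, implementing the two scope rules above.
    public class IdTable<A> {
        private final Deque<Map<String, A>> scopes = new ArrayDeque<>();

        public void openScope()  { scopes.push(new HashMap<>()); }
        public void closeScope() { scopes.pop(); }

        // Returns false if id is already declared in the current block.
        public boolean enter(String id, A attribute) {
            return scopes.peek().putIfAbsent(id, attribute) == null;
        }

        // Returns the attribute of the closest enclosing declaration,
        // or null if there is no corresponding declaration at all.
        public A retrieve(String id) {
            for (Map<String, A> scope : scopes) {   // innermost scope first
                A attribute = scope.get(id);
                if (attribute != null) return attribute;
            }
            return null;
        }
    }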

Type Checking

For most statically typed programming languages, type checking is a bottom-up algorithm over the AST:
- Types of expression AST leaves are known immediately:
  - literals => obvious
  - variables => from the ID table
  - named constants => from the ID table
- Types of internal nodes are inferred from the types of the children and the type rule for that kind of expression (see the sketch below)
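A minimal bottom-up checker for a toy expression AST might look as follows in Java (hypothetical classes and type rules, for illustration only):

    enum Type { INT, BOOL }

    abstract class Expr { abstract Type check(); }

    class IntLit extends Expr {
        final int value;
        IntLit(int value) { this.value = value; }
        Type check() { return Type.INT; }     // leaf: type known immediately
    }

    class BinaryExpr extends Expr {
        final Expr left, right;
        final String op;
        BinaryExpr(Expr left, String op, Expr right) {
            this.left = left; this.op = op; this.right = right;
        }
        Type check() {
            Type lt = left.check();           // children first (bottom-up)
            Type rt = right.check();
            switch (op) {
                case "+": case "*":           // rule: int x int -> int
                    if (lt == Type.INT && rt == Type.INT) return Type.INT;
                    break;
                case "<":                     // rule: int x int -> bool
                    if (lt == Type.INT && rt == Type.INT) return Type.BOOL;
                    break;
            }
            throw new RuntimeException("type error at operator " + op);
        }
    }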

Contextual Analysis

Identification and type checking are combined into a depth-first traversal of the abstract syntax tree.

[Figure: the same example AST (Program, LetCommand, SequentialDeclaration, SequentialCommand, two AssignCommands, BinaryExpression) being decorated during a depth-first traversal.]

Visitor

Solution using Visitor:
- Visitor is an abstract class that has a different method for each type of object on which it operates
- Each operation is a subclass of Visitor and overrides the type-specific methods
- Objects that are operated on accept a Visitor and call back their type-specific method, passing themselves as operands
- Object types are independent of the operations that apply to them
- New operations can be added without modifying the object types

Visitor Solution

- Nodes accept visitors and call the appropriate method of the visitor
- Visitors implement the operations and have one method for each type of node they visit

    Node
      Accept(NodeVisitor v)
    VariableRefNode
      Accept(NodeVisitor v) { v->VisitVariableRef(this) }
    AssignmentNode
      Accept(NodeVisitor v) { v->VisitAssignment(this) }

    NodeVisitor
      VisitAssignment(AssignmentNode)
      VisitVariableRef(VariableRefNode)
    Concrete visitors: TypeCheckingVisitor, CodeGeneratingVisitor
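In Java (the figure above uses C++-style arrows) the same structure looks roughly like this; the node and visitor names mirror the figure, while the method bodies are placeholders:

    interface NodeVisitor {
        void visitAssignment(AssignmentNode n);
        void visitVariableRef(VariableRefNode n);
    }

    abstract class Node {
        abstract void accept(NodeVisitor v);   // double-dispatch entry point
    }

    class AssignmentNode extends Node {
        void accept(NodeVisitor v) { v.visitAssignment(this); }
    }

    class VariableRefNode extends Node {
        void accept(NodeVisitor v) { v.visitVariableRef(this); }
    }

    // One concrete visitor per operation:
    class TypeCheckingVisitor implements NodeVisitor {
        public void visitAssignment(AssignmentNode n)   { /* check LHS/RHS types */ }
        public void visitVariableRef(VariableRefNode n) { /* look up declared type */ }
    }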

Runtime organization

- Data representation: how to represent values of the source language on the target machine. (Primitives, arrays, structures, unions, pointers)
- Expression evaluation: how to organize computing the values of expressions, taking care of intermediate results. (Register vs. stack machine)
- Storage allocation: how to organize storage for variables, considering the different lifetimes of global, local and heap variables. (Activation records, static links)
- Routines: how to implement procedures and functions, and how to pass their parameters and return values. (Value vs. reference, closures, recursion)
- Object orientation: runtime organization for OO languages. (Method tables)

RECAP: TAM Frame Layout Summary

        | arguments             |  <- arguments for the current procedure;
    LB: +-----------------------+     they were put here by the caller
        | dynamic link          |
        | static link           |  <- link data
        | return address        |
        +-----------------------+
        | local variables and   |  <- local data; grows and shrinks
        | intermediate results  |     during execution
    ST: +-----------------------+

Garbage Collection: Conclusions

- Relieves the burden of explicit memory allocation and deallocation.
- Software module coupling related to memory management issues is eliminated.
- An extremely dangerous class of bugs is eliminated.
- The compiler generates code for allocating objects.
- The compiler must also generate code to support GC:
  - the GC must be able to recognize root pointers from the stack
  - the GC must know about data layout and object descriptors

Code Generation

Source program:

    let var n: integer;
        var c: char
    in begin
      c := '&';
      n := n+1
    end

Target program:

    PUSH 2
    LOADL 38
    STORE 1[SB]
    LOAD 0
    LOADL 1
    CALL add
    STORE 0[SB]
    POP 2
    HALT

Source and target program must be "semantically equivalent". The semantic specification of the source language is structured in terms of phrases in the SL: expressions, commands, etc. => Code generation follows the same "inductive" structure.

Specifying Code Generation with Code Templates

The code generation functions for Mini Triangle:

    Phrase class  Function      Effect of the generated code
    -----------   -----------   --------------------------------------------
    Program       run P         Run program P then halt, starting and
                                finishing with an empty stack.
    Command       execute C     Execute command C. May update variables,
                                but does not shrink or grow the stack!
    Expression    evaluate E    Evaluate E; the net result is pushing the
                                value of E on the stack.
    V-name        fetch V       Push the value of the constant or variable
                                V on the stack.
                  assign V      Pop a value from the stack and store it in
                                variable V.
    Declaration   elaborate D   Elaborate the declaration, making space on
                                the stack for its constants and variables.

Code Generation with Code Templates

While command:

    execute [while E do C] =
            JUMP h
        g:  execute [C]
        h:  evaluate [E]
            JUMPIF(1) g

Developing a Code Generator "Visitor"

    execute [C1 ; C2] = execute [C1]
                        execute [C2]

    public Object visitSequentialCommand(SequentialCommand com, Object arg) {
        com.C1.visit(this, arg);
        com.C2.visit(this, arg);
        return null;
    }

LetCommand, IfCommand, WhileCommand => later.
- LetCommand is more complex: memory allocation and addresses
- IfCommand and WhileCommand: complications with jumps
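Following the while-command template shown two slides back, the corresponding visitor method might look like this. The emit/patch helpers, nextInstrAddr and the Machine opcode constants are assumed here, in the style of a TAM-targeting encoder; this is a sketch, not the course's exact code:

    public Object visitWhileCommand(WhileCommand com, Object arg) {
        int jumpAddr = nextInstrAddr;                  // address of the forward jump
        emit(Machine.JUMPop, 0, Machine.CBr, 0);       // JUMP h (target patched below)
        int gAddr = nextInstrAddr;                     // label g:
        com.C.visit(this, arg);                        // execute [C]
        int hAddr = nextInstrAddr;                     // label h:
        patch(jumpAddr, hAddr);                        // backpatch JUMP to point at h
        com.E.visit(this, arg);                        // evaluate [E]
        emit(Machine.JUMPIFop, 1, Machine.CBr, gAddr); // JUMPIF(1) g
        return null;
    }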

JVM

The JVM is an abstract machine in the true sense of the word. The JVM spec does not specify implementation details (these can be dependent on target OS/platform, performance requirements, etc.). The JVM spec defines a machine-independent "class file format" that all JVM implementations must support.

[Figure: the external representation (.class files, platform independent) is loaded into the JVM's internal representation (implementation dependent): classes, primitive types, integers, objects, arrays, methods.]

Inspecting JVM code

    % javac Factorial.java
    % javap -c -verbose Factorial
    Compiled from Factorial.java
    public class Factorial extends java.lang.Object {
        public Factorial();
            /* Stack=1, Locals=1, Args_size=1 */
        public int fac(int);
            /* Stack=2, Locals=4, Args_size=2 */
    }
    Method Factorial()
       0 aload_0
       1 invokespecial #1 <Method java.lang.Object()>
       4 return

Inspecting JVM Code ...

    Method int fac(int)
    // locals: 0=this, 1=n, 2=result, 3=i
     0 iconst_1
     1 istore_2
     2 iconst_2
     3 istore_3
     4 goto 14
     7 iload_2          // stack: result
     8 iload_3          // stack: result i
     9 imul
    10 istore_2
    11 iinc 3 1
    14 iload_3          // stack: i
    15 iload_1          // stack: i n
    16 if_icmple 7
    19 iload_2          // stack: result
    20 ireturn

Code improvement (optimization)

The code generated by our compiler is not efficient:
- It computes values at runtime that could be known at compile time
- It computes values more times than necessary

We can do better!
- Constant folding (sketched below)
- Common sub-expression elimination
- Code motion
- Dead code elimination
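As a flavour of the first of these, a constant folding sketch over a toy expression AST in Java (hypothetical classes, invented for illustration):

    class Expr {}
    class IntLit extends Expr {
        final int value;
        IntLit(int value) { this.value = value; }
    }
    class BinaryExpr extends Expr {
        final Expr left, right;
        final String op;
        BinaryExpr(Expr left, String op, Expr right) {
            this.left = left; this.op = op; this.right = right;
        }
    }

    class ConstantFolder {
        // Replaces operator applications whose operands are literals by
        // their value, computed at compile time; everything else is kept.
        static Expr fold(Expr e) {
            if (e instanceof BinaryExpr) {
                BinaryExpr b = (BinaryExpr) e;
                Expr l = fold(b.left), r = fold(b.right);   // fold children first
                if (l instanceof IntLit && r instanceof IntLit) {
                    int lv = ((IntLit) l).value, rv = ((IntLit) r).value;
                    switch (b.op) {
                        case "+": return new IntLit(lv + rv);
                        case "*": return new IntLit(lv * rv);
                    }
                }
                return new BinaryExpr(l, b.op, r);          // partially folded
            }
            return e;                                       // literals, variables
        }
    }

For example, fold applied to the tree for 2*3+n yields the tree for 6+n.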

Optimization implementation

- Is the optimization correct or safe?
- Is the optimization an improvement?
- What sort of analyses do we need to perform to get the required information?
  - Local
  - Global

Concurrency, distributed computing, the Internet

Traditional view: let the OS deal with this => it is not a programming language issue! End of lecture.

Wait a minute ... maybe "the traditional view" is getting out of date?

Concurrency and programming languages

The library view:
- (Almost) all concurrency constructs are packaged as library functions/methods
- FORTRAN, C, Lisp, CML, Java (except for a few keywords like synchronized)

The language view:
- Concurrency has a syntactic manifestation in the language
- Concurrent Pascal, Occam, ADA, Facile, Erlang, ...

What could languages provide?

- An abstract model of the system: abstract machine => abstract system
- Example high-level constructs:
  - processes as the values of expressions
  - passing processes to functions
  - creating processes as the result of a function call
- Communication abstractions:
  - synchronous communication
  - buffered asynchronous channels that preserve message order (see the sketch below)
- Mutual exclusion and atomicity primitives:
  - most concurrent languages provide some form of locking
  - atomicity is more complicated and less commonly provided
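Under the library view, for example, a buffered asynchronous channel that preserves message order can be expressed directly with java.util.concurrent (a self-contained sketch, not tied to any of the languages listed above):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class ChannelDemo {
        public static void main(String[] args) throws InterruptedException {
            // Unbounded FIFO queue used as a buffered asynchronous channel.
            BlockingQueue<String> channel = new LinkedBlockingQueue<>();

            Thread producer = new Thread(() -> {
                try {
                    channel.put("msg 1");   // asynchronous send; FIFO order kept
                    channel.put("msg 2");
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            producer.start();

            System.out.println(channel.take());  // blocks until a message arrives
            System.out.println(channel.take());
            producer.join();
        }
    }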

Programming Language Life Cycle

- The requirements for the new language are identified
- The language syntax and semantics is designed
  - BNF or EBNF, experiments with front-end tools
  - informal or formal semantics
- An informal or formal specification is developed
- Initial implementation
  - prototype via interpreter or interpretive compiler
- The language is tested by designers, implementers and a few friends
- Feedback on the design and possible reconsiderations
- Improved implementation

Programming Language Life Cycle

    Design -> Specification -> Prototype -> Compiler -> Manuals, Textbooks

Programming Language Life Cycle

- Lots of research papers
- Conference sessions dedicated to the new language
- Textbooks and manuals
- Used in large applications
- Huge international user community
- Dedicated conference
- International standardisation efforts
- Industry de facto standard
- Programs written in the language become legacy code
- The language enters the "hall of fame" and its features are taught in CS courses on programming language design and implementation

The Most Important Open Problem in Computing

Increasing programmer productivity:
- Write programs correctly
- Write programs quickly
- Write programs easily

Why?
- Decreases support cost
- Decreases development cost
- Decreases time to market
- Increases satisfaction

Why Programming Languages?

Three ways of increasing programmer productivity:
1. Process (software engineering): controlling programmers
2. Tools (verification, static analysis, program generation): important, but generally of narrow applicability
3. Language design: the center of the universe!
   - Core abstractions, mechanisms, services, guarantees
   - Affects how programmers approach a task (C vs. SML)
   - Multi-paradigm integration

Programming Languages and Compilers are at the core of Computing

- All software is written in a programming language
- Learning about compilers will teach you a lot about the programming languages you already know
- Compilers are big, so you need to apply all your knowledge of software engineering
- The compiler is the program from which all other programs arise

New Programming Language! Why Should I Care?

The problem is not designing a new language: it's easy! Thousands of languages have been developed. The problem is how to get wide adoption of the new language: it's hard! Challenges include:
- Competition
- Usefulness
- Interoperability
- Fear: "It's a good idea, but it's a new idea; therefore, I fear it and must reject it." (Homer Simpson)

The financial rewards are low, but ...

Famous Danish Computer Scientists

- Peter Naur: BNF and Algol
- Per Brinch Hansen: monitors and Concurrent Pascal
- Dines Bjørner: VDM and ADA
- Bjarne Stroustrup: C++
- Mads Tofte: SML
- Rasmus Lerdorf: PHP
- Anders Hejlsberg: Turbo Pascal and C#
- Jacob Nielsen

Finally

Keep in mind, the compiler is the program from which all other programs arise. If your compiler is under par, all programs created by the compiler will also be under par. No matter the purpose or use -- your own enlightenment about compilers or commercial applications -- you want to be patient and do a good job with this program; in other words, don't try to throw this together on a weekend.

Asking a computer programmer to tell you how to write a compiler is like saying to Picasso, "Teach me to paint like you." *Sigh* Well, Picasso tried.

What I promised you at the start of the course

Ideas, principles and techniques to help you:
- design your own programming language, or design your own extensions to an existing language
- tools and techniques to implement a compiler or an interpreter
- lots of knowledge about programming

I hope you feel you got what I promised.