Introduction to Compiler Construction

Course Goals
- To provide students with an understanding of the major phases of a compiler.
- To introduce students to the theory behind the various phases, including regular expressions, context-free grammars, and finite state automata.
- To provide students with an understanding of the design and implementation of a compiler.
- To have the students build a compiler, through type checking and intermediate code generation, for a small language.
- To provide students with an opportunity to work in a group on a large project.

Course Outcomes
- Students will have experience using current compiler generation tools.
- Students will be familiar with the different phases of compilation.
- Students will have experience defining and specifying the semantic rules of a programming language.

Prerequisites
- In-depth knowledge of at least one structured programming language.
- Strong background in algorithms, data structures, and abstract data types, including stacks, binary trees, and graphs.
- Understanding of grammar theories.
- Understanding of data types and control structures, their design and implementation.
- Understanding of the design and implementation of subprograms, parameter-passing mechanisms, and scope.

Major Topics Covered in the Course
- Overview & Lexical Analysis (Scanning)
- Grammars & Syntax Analysis: Top-Down Parsing
- Syntax Analysis: Bottom-Up Parsing
- Semantic Analysis
- Symbol Tables and Run-time Systems
- Code Generation
- Introduction to Optimization and Control Flow Analysis

Textbook
"Compilers: Principles, Techniques, and Tools" by Aho, Lam, Sethi, and Ullman, 2nd edition.

Grading
- Assignments & project: 40%
- Midterm exam: 20%
- Final exam: 40%

Compilers and Interpreters
"Compilation": translation of a program written in a source language into a semantically equivalent program written in a target language.
(Diagram: the source program goes into the compiler, which produces the target program and error messages; the target program then maps input to output.)

Compilers and Interpreters (cont'd)
"Interpretation": performing the operations implied by the source program.
(Diagram: the source program and the input both go into the interpreter, which produces the output and error messages.)

The Analysis-Synthesis Model of Compilation
There are two parts to compilation:
- Analysis determines the operations implied by the source program, which are recorded in a tree structure.
- Synthesis takes the tree structure and translates the operations therein into the target program.

Preprocessors, Compilers, Assemblers, and Linkers
Skeletal source program -> Preprocessor -> source program -> Compiler -> target assembly program -> Assembler -> relocatable object code -> Linker (together with libraries and relocatable object files) -> absolute machine code.
Try for example: gcc -v myprog.c

The Phases of a Compiler
- Programmer (source code producer): source string, e.g.  A = B + C;
- Scanner (performs lexical analysis): token string, e.g.  'A', '=', 'B', '+', 'C', ';'  plus a symbol table with the names
- Parser (performs syntax analysis based on the grammar of the programming language): parse tree or abstract syntax tree; here a ';' statement node whose child '=' has A on the left and the '+' node over B and C on the right
- Semantic analyzer (type checking, etc.): annotated parse tree or abstract syntax tree
- Intermediate code generator: three-address code, quads, or RTL, e.g.
      int2fp B t1
      +      t1 C t2
      :=     t2 A
- Optimizer: three-address code, quads, or RTL, e.g.
      int2fp B t1
      +      t1 #2.3 A
- Code generator: assembly code, e.g.
      MOVF  #2.3,r1
      ADDF2 r1,r2
      MOVF  r2,A
- Peephole optimizer: assembly code, e.g.
      ADDF2 #2.3,r2
      MOVF  r2,A

The Grouping of Phases
Compiler front and back ends:
- Front end: analysis (machine independent)
- Back end: synthesis (machine dependent)
Compiler passes:
- A collection of phases is done only once (single pass) or multiple times (multi-pass)
- Single pass: usually requires everything to be defined before being used in the source program
- Multi-pass: the compiler may have to keep the entire program representation in memory

Compiler-Construction Tools
Software development tools are available to implement one or more compiler phases:
- Scanner generators
- Parser generators
- Syntax-directed translation engines
- Automatic code generators
- Data-flow engines

What qualities do you want in a compiler that you buy?
1. Correct code
2. Output runs fast
3. Compiler runs fast
4. Compile time proportional to program size
5. Support for separate compilation
6. Good diagnostics for syntax errors
7. Works well with the debugger
8. Good diagnostics for flow anomalies
9. Good diagnostics for storage leaks
10. Consistent, predictable optimization

High-Level View of a Compiler
A compiler maps source code to machine code and reports errors.
Implications:
- Must recognize legal (and illegal) programs
- Must generate correct code
- Must manage storage of all variables (and code)
- Must agree with the OS & linker on the format for object code

Traditional Two-Pass Compiler
Source code -> Front End -> IR -> Back End -> machine code, with both ends reporting errors.
- Use an intermediate representation (IR)
- The front end maps legal source code into IR
- The back end maps IR into target machine code
- Admits multiple front ends & multiple passes (better code)

The Front End
Source code -> Scanner -> tokens -> Parser -> IR, with errors reported along the way.
Responsibilities:
- Recognize legal (& illegal) programs
- Report errors in a useful way
- Produce IR & a preliminary storage map
- Shape the code for the back end
Much of front-end construction can be automated.

The Front End: the Scanner
- Maps the character stream into words, the basic unit of syntax
- Produces words & their parts of speech; for example, x = x + y ; becomes a sequence of (part of speech, lexeme) pairs
- Word -> lexeme, part of speech -> token; in casual speech, we call the pair a token
- Typical tokens include number, identifier, +, -, while, if
- The scanner eliminates white space
- Speed is important -> use a specialized recognizer

The Front End: the Parser
- Recognizes context-free syntax & reports errors
- Guides context-sensitive analysis (type checking)
- Builds IR for the source program
Hand-coded parsers are fairly easy to build; most books advocate using automatic parser generators.

The Front End: abstract syntax trees
- Compilers often use an abstract syntax tree (AST); it is much more concise than a parse tree
- ASTs are one form of intermediate representation (IR)
- The AST summarizes grammatical structure, without including detail about the derivation
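To make "one form of IR" concrete, here is a minimal sketch of how an AST for the assignment x = x + y might be represented; the node kinds and field names are illustrative, not taken from any particular compiler.

```c
#include <stdio.h>
#include <stdlib.h>

typedef enum { N_ASSIGN, N_ADD, N_IDENT } NodeKind;

typedef struct Node {
    NodeKind kind;
    const char *name;          /* used by N_IDENT nodes             */
    struct Node *left, *right; /* used by N_ASSIGN and N_ADD nodes  */
} Node;

static Node *node(NodeKind k, const char *name, Node *l, Node *r)
{
    Node *n = malloc(sizeof *n);
    n->kind = k; n->name = name; n->left = l; n->right = r;
    return n;
}

int main(void)
{
    /* x = x + y : an assignment node over an identifier and an add node */
    Node *ast = node(N_ASSIGN, NULL,
                     node(N_IDENT, "x", NULL, NULL),
                     node(N_ADD, NULL,
                          node(N_IDENT, "x", NULL, NULL),
                          node(N_IDENT, "y", NULL, NULL)));

    /* note: nothing about the derivation is stored, only the structure */
    printf("root kind = %d\n", ast->kind);
    return 0;
}
```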

The Back End
IR flows through instruction selection, instruction scheduling, and register allocation to produce machine code; errors are reported along the way.
Responsibilities:
- Translate IR into target machine code
- Choose instructions to implement each IR operation
- Decide which values to keep in registers
- Ensure conformance with system interfaces
Automation has been much less successful in the back end.

The Back End: instruction selection
- Produce fast, compact code
- Take advantage of target features such as addressing modes
- Usually viewed as a pattern-matching problem: ad hoc methods, pattern matching, dynamic programming
- This was "the problem of the future" in 1978, spurred by the transition from the PDP-11 to the VAX-11
- The orthogonality of RISC simplified this problem

The Back End: instruction scheduling
- Avoid hardware stalls and interlocks
- Use all functional units productively
- Can increase the lifetime of variables (changing the allocation)
- Optimal scheduling is NP-complete in nearly all cases
- Good heuristic techniques are well understood

The Back End: register allocation
- Have each value in a register when it is used
- Manage a limited set of resources
- Can change instruction choices & insert LOADs & STOREs
- Optimal allocation is NP-complete (1 or k registers)
- Compilers approximate solutions to NP-complete problems

Traditional Three-Pass Compiler
Source code -> Front End -> IR -> Middle End -> IR -> Back End -> machine code, with errors reported along the way.
Code improvement (or optimization):
- Analyzes IR and rewrites (or transforms) IR
- Primary goal is to reduce the running time of the compiled code
- May also improve space, power consumption, …
- Must preserve the "meaning" of the code, measured by the values of named variables

The Optimizer (or Middle End)
Modern optimizers are structured as a series of passes (Opt1, Opt2, Opt3, …, Optn), each taking IR in and producing IR out.
Typical transformations:
- Discover & propagate some constant value
- Move a computation to a less frequently executed place
- Discover a redundant computation & remove it
- Remove useless or unreachable code
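As a rough illustration of these transformations, the sketch below shows a hypothetical C fragment and what it effectively becomes after constant propagation, loop-invariant code motion, and dead-code elimination; the function and variable names are made up for the example, and a real optimizer works on IR rather than on source text.

```c
#include <stdio.h>

/* Hypothetical input fragment. */
int f(int n)
{
    int limit = 10;                 /* known constant                        */
    int r = 0;
    for (int i = 0; i < n; i++) {
        int scale = limit * 4;      /* loop-invariant: same value each trip  */
        r += i * scale;
        int unused = r * 2;         /* result never used (dead code)         */
        (void)unused;
    }
    return r;
}

/* What the middle end effectively rewrites it into: the constant is
 * propagated and folded, the invariant computation is hoisted, and the
 * dead statement is removed.  The observable result is unchanged.      */
int f_opt(int n)
{
    int r = 0;
    int scale = 40;                 /* 10 * 4, computed once outside the loop */
    for (int i = 0; i < n; i++)
        r += i * scale;
    return r;
}

int main(void)
{
    printf("%d %d\n", f(5), f_opt(5));   /* both print 400 */
    return 0;
}
```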

The Big Picture
Why study lexical analysis? We want to avoid writing scanners by hand.
A scanner generator takes specifications and produces tables or code; the generated scanner turns source code into parts of speech.
Goals:
- To simplify specification & implementation of scanners
- To understand the underlying techniques and technologies

Lexical Analysis
The lexical analyzer reads the stream of characters making up the source program and groups the characters into meaningful sequences called lexemes. For each lexeme, the lexical analyzer produces as output a token of the form
    (token-name, attribute-value)
where the first component, token-name, is an abstract symbol that is used during syntax analysis, and the second component, attribute-value, points to an entry in the symbol table for this token.

Example
Suppose a source program contains the assignment statement:
    position = initial + rate * 60
The characters in this assignment could be grouped into the following lexemes and mapped into the following tokens passed on to the syntax analyzer:
1. position is a lexeme that would be mapped into the token (id, 1).
2. The assignment symbol = is a lexeme that is mapped into the token (=).
3. initial is a lexeme that is mapped into the token (id, 2).
4. + is a lexeme that is mapped into the token (+).
5. rate is a lexeme that is mapped into the token (id, 3).
6. * is a lexeme that is mapped into the token (*).
7. 60 is a lexeme that is mapped into the token (60).
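A minimal hand-written scanner that produces (token-name, attribute-value) pairs of roughly this shape is sketched below; the token names, the output format, and the toy symbol table are assumptions made for the example, not part of the slides.

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static char symtab[16][64];              /* toy symbol table of identifier lexemes */
static int  nsyms = 0;

static int lookup(const char *name)      /* install name if new, return its index  */
{
    for (int i = 0; i < nsyms; i++)
        if (strcmp(symtab[i], name) == 0)
            return i + 1;
    strcpy(symtab[nsyms++], name);
    return nsyms;
}

int main(void)
{
    const char *p = "position = initial + rate * 60";

    while (*p) {
        if (isspace((unsigned char)*p)) { p++; continue; }       /* skip blanks */
        if (isalpha((unsigned char)*p)) {                         /* identifier  */
            char buf[64]; int n = 0;
            while (isalnum((unsigned char)*p)) buf[n++] = *p++;
            buf[n] = '\0';
            printf("(id, %d)\tlexeme \"%s\"\n", lookup(buf), buf);
        } else if (isdigit((unsigned char)*p)) {                  /* number      */
            int val = 0;
            while (isdigit((unsigned char)*p)) val = 10 * val + (*p++ - '0');
            printf("(number, %d)\n", val);
        } else {                                                  /* operator    */
            printf("(%c)\n", *p++);
        }
    }
    return 0;
}
```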


Definition of Grammars
A context-free grammar has four components:
1. A set of terminal symbols, sometimes referred to as "tokens." The terminals are the elementary symbols of the language defined by the grammar.
2. A set of nonterminals, sometimes called "syntactic variables." Each nonterminal represents a set of strings of terminals, in a manner we shall describe.
3. A set of productions, where each production consists of a nonterminal, called the head or left side of the production, an arrow, and a sequence of terminals and/or nonterminals, called the body or right side of the production. The intuitive intent of a production is to specify one of the written forms of a construct; if the head nonterminal represents a construct, then the body represents a written form of the construct.
4. A designation of one of the nonterminals as the start symbol.

Example
The following grammar describes the syntax of expressions consisting of digits and plus and minus signs, e.g., strings such as 9-5+2, 3-1, or 7. The productions are:
    list  -> list + digit
    list  -> list - digit
    list  -> digit
    digit -> 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
The bodies of the three productions with nonterminal list as head can equivalently be grouped:
    list -> list + digit | list - digit | digit
According to our conventions, the terminals of the grammar are the symbols + - and the digits 0, 1, …, 9. The nonterminals are the italicized names list and digit, with list being the start symbol because its productions are given first.
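As a sketch of how this grammar might be recognized in code, the fragment below checks membership for strings like 9-5+2. Because the list productions are left-recursive, the sketch uses the equivalent iterative form list -> digit { (+|-) digit } that a predictive recognizer would use; the function names and test strings are illustrative.

```c
#include <ctype.h>
#include <stdio.h>

static const char *input;

static int digit(void)          /* digit -> 0 | 1 | ... | 9 */
{
    if (isdigit((unsigned char)*input)) { input++; return 1; }
    return 0;
}

static int list(void)           /* list -> digit { (+|-) digit } */
{
    if (!digit()) return 0;
    while (*input == '+' || *input == '-') {
        input++;
        if (!digit()) return 0;
    }
    return 1;
}

int main(void)
{
    const char *tests[] = { "9-5+2", "3-1", "7", "9-", "x+1" };
    for (int i = 0; i < 5; i++) {
        input = tests[i];
        int ok = list() && *input == '\0';   /* must consume the whole string */
        printf("%-6s %s\n", tests[i], ok ? "in the language" : "rejected");
    }
    return 0;
}
```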

Specifying Lexical Patterns (micro-syntax)
A scanner recognizes the language's parts of speech. Some parts are easy:
- White space: WhiteSpace -> blank | tab | WhiteSpace blank | WhiteSpace tab
- Keywords and operators: specified as literal patterns: if, then, else, while, =, +, …
- Comments: opening and (perhaps) closing delimiters, e.g. /* followed by */ in C, // in C++, % in LaTeX

Specifying Lexical Patterns (micro-syntax), continued
A scanner recognizes the language's parts of speech. Some parts are more complex:
- Identifiers: alphabetic followed by alphanumerics + _, &, $, …; may have limited length
- Numbers:
  - Integers: 0, or a digit from 1-9 followed by digits from 0-9
  - Decimals: integer . digits from 0-9, or . digits from 0-9
  - Reals: (integer or decimal) E (+ or -) digits from 0-9
  - Complex: ( real , real )
We need a notation for specifying these patterns, and we would like the notation to lead to an implementation.

Regular Expressions
These patterns form a regular language. (Any finite language is regular.) Regular expressions (REs) describe regular languages.
A regular expression over an alphabet Σ:
- ε is an RE denoting the set { ε }
- If a is in Σ, then a is an RE denoting { a }
- If x and y are REs denoting L(x) and L(y), then
  - (x) is an RE denoting L(x)
  - x | y is an RE denoting L(x) ∪ L(y)
  - xy is an RE denoting L(x)L(y)
  - x* is an RE denoting L(x)*
Precedence is closure, then concatenation, then alternation.
Ever type "rm *.o a.out"?

Set Operations (refresher)
You need to know these definitions.
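The table of definitions on the original slide did not survive extraction; for reference, these are the standard set operations that the regular-expression rules above rely on:

```latex
\begin{align*}
L \cup M &= \{\, s \mid s \in L \ \text{or}\ s \in M \,\} && \text{(union)} \\
LM       &= \{\, st \mid s \in L,\ t \in M \,\}           && \text{(concatenation)} \\
L^{*}    &= \bigcup_{i \ge 0} L^{i}                       && \text{(Kleene closure)} \\
L^{+}    &= \bigcup_{i \ge 1} L^{i}                       && \text{(positive closure)}
\end{align*}
```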

Examples of Regular Expressions
Identifiers:
    Letter     -> (a | b | c | … | z | A | B | C | … | Z)
    Digit      -> (0 | 1 | 2 | … | 9)
    Identifier -> Letter ( Letter | Digit )*
Numbers:
    Integer -> (+ | - | ε) (0 | (1 | 2 | 3 | … | 9)(Digit*))
    Decimal -> Integer . Digit*
    Real    -> ( Integer | Decimal ) E (+ | - | ε) Digit*
    Complex -> ( Real , Real )
Numbers can get much more complicated!
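As a quick way to experiment with these patterns, the sketch below checks the Identifier and Integer patterns using POSIX <regex.h>; the translation of the slide's RE notation into POSIX syntax, and the sample strings, are assumptions made for the example.

```c
#include <regex.h>
#include <stdio.h>

static int matches(const char *pattern, const char *text)
{
    regex_t re;
    if (regcomp(&re, pattern, REG_EXTENDED) != 0) return 0;
    int ok = (regexec(&re, text, 0, NULL, 0) == 0);
    regfree(&re);
    return ok;
}

int main(void)
{
    /* Identifier -> Letter (Letter | Digit)*      */
    const char *ident   = "^[A-Za-z][A-Za-z0-9]*$";
    /* Integer -> (+|-|ε) (0 | (1..9)(Digit*))     */
    const char *integer = "^[+-]?(0|[1-9][0-9]*)$";

    const char *samples[] = { "rate", "x27", "27", "-42", "007", "9abc" };
    for (int i = 0; i < 6; i++)
        printf("%-5s  identifier:%d  integer:%d\n",
               samples[i], matches(ident, samples[i]), matches(integer, samples[i]));
    return 0;
}
```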

Regular Expressions (the point)
To make scanning tractable, programming languages differentiate between parts of speech by controlling their spelling (as opposed to dictionary lookup).
The difference between an Identifier and a Keyword is entirely lexical:
- While is a Keyword
- Whilst is an Identifier
The lexical patterns used in programming languages are regular. Using results from automata theory, we can automatically build recognizers from regular expressions.
-> We study REs to automate scanner construction!

Example: recognizing register names
Consider the problem of recognizing register names:
    Register -> r (0 | 1 | 2 | … | 9) (0 | 1 | 2 | … | 9)*
- Allows registers of arbitrary number
- Requires at least one digit
The RE corresponds to a recognizer (or DFA): state S0 moves to S1 on r, S1 moves to S2 on a digit, and S2 loops on digits; S2 is the accepting state. Implicit transitions on all other inputs lead to an error state Se.

Example (continued): DFA operation
- Start in state S0 and take transitions on each input character
- The DFA accepts a word x iff x leaves it in a final state (S2)
So, for the register recognizer:
- r17 takes it through S0, S1, S2 and accepts
- r takes it through S0, S1 and fails
- a takes it straight to Se

Example (continued): a table-driven skeleton
The recognizer translates directly into code; to change DFAs, just change the tables.

    char  <- next character
    state <- s0
    call action(state, char)
    while (char != eof)
        state <- delta(state, char)
        call action(state, char)
        char  <- next character
    if Type(state) = final then
        report acceptance
    else
        report failure

    action(state, char):
        switch (Type(state))
            case start:  word <- char;        break
            case normal: word <- word + char; break
            case final:  word <- char;        break
            case error:  report error;        break
        end
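A concrete C version of this table-driven skeleton, specialized to the register recognizer r (0|1|2|…|9)(0|1|2|…|9)*, might look like the sketch below; the state and character-class encodings are illustrative, and the action routine is omitted to keep the recognizer itself in focus.

```c
#include <stdio.h>

enum { S0, S1, S2, SE };                 /* SE is the error state        */
enum { C_R, C_DIGIT, C_OTHER };          /* character classes            */

static int charclass(char c)
{
    if (c == 'r') return C_R;
    if (c >= '0' && c <= '9') return C_DIGIT;
    return C_OTHER;
}

static const int delta[4][3] = {         /* delta[state][class]          */
    /* S0 */ { S1, SE, SE },
    /* S1 */ { SE, S2, SE },
    /* S2 */ { SE, S2, SE },
    /* SE */ { SE, SE, SE },
};

static int accepts(const char *word)
{
    int state = S0;
    for (const char *p = word; *p; p++)
        state = delta[state][charclass(*p)];
    return state == S2;                  /* S2 is the only final state   */
}

int main(void)
{
    const char *tests[] = { "r17", "r", "a", "r00000" };
    for (int i = 0; i < 4; i++)
        printf("%-7s %s\n", tests[i], accepts(tests[i]) ? "accept" : "reject");
    return 0;
}
```

To handle a different RE, only the class function and the transition table change; the driver loop stays the same, which is exactly the point of the table-driven scheme.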

What if we need a tighter specification?
r Digit Digit* allows arbitrary numbers: it accepts r00000 and it accepts r99999. What if we want to limit it to r0 through r31?
Write a tighter regular expression:
    Register -> r ( (0 | 1 | 2) (Digit | ε) | (4 | 5 | 6 | 7 | 8 | 9) | (3 | 30 | 31) )
    Register -> r0 | r1 | r2 | … | r31 | r00 | r01 | r02 | … | r09
This produces a more complex DFA:
- Has more states
- Same cost per transition
- Same basic implementation

Tighter register specification (continued)
The DFA for Register -> r ( (0 | 1 | 2) (Digit | ε) | (4 | 5 | 6 | 7 | 8 | 9) | (3 | 30 | 31) ) accepts a more constrained set of registers, with the same set of actions but more states (the original slide shows a seven-state machine, S0 through S6).

Tighter register specification (continued)
To implement the recognizer:
- Use the same code skeleton
- Use transition and action tables for the new RE
- Bigger tables, more space, same asymptotic costs
- Better (micro-)syntax checking at the same cost