CS 426 Compiler Construction 1. Introduction. Prolog.


1 CS 426 Compiler Construction 1. Introduction

2 Prolog

3 ›This is a course about designing and building programming language analyzers and translators.
›A fascinating topic:
–Compilers as translators facilitate programming (increase productivity) by
›Presenting a high-level interface (a high-level language)
›Enabling target-system independence for portability
›Detecting errors and defects
›Applying optimizations. When these work, programmers struggle less to tune the program to the target machine. Acceptance of a programming language in some cases depends on compiler effectiveness.
›A bridge across areas:
–Languages
–Machines
–Theory, decidability, complexity

4 ›Program translation and analysis are among the oldest Computer Science subjects and have numerous applications.
–Implementation of compilers that translate high-level languages into machine language. A crucial part of computing since the early days.
–Implementation of efficient interpreters.
–Program analysis for
›Optimization
›Parallelization/vectorization
›Refactoring for readability
›Static error detection
›Security
–Binary translation across platforms to increase availability of software
–Hardware synthesis, to translate notations like Verilog and VHDL into RTL (register transfer language)
–Database query implementation and optimization

5 ›Performance today
–Crucial for some applications:
›Real-time systems
›Games
›Computational sciences
–Not so much for others:
›Highly interactive programs that mainly wait for the user some of the time and for I/O the rest of the time
›Computations for which interactivity is more important than speed (e.g., MATLAB, R, …)

6 The first commercial compiler

7 A little bit of compiler history
In the beginning there was FORTRAN
›The compiler (ca. 1956) was a momentous accomplishment
›The top compiler algorithm of the century
›The accomplishment by John Backus and his small team is even more impressive given how little was known then.
IEEE Computing in Science and Engineering, Jan 2000

8 Fortran I Subscript evaluation
›The address of array element A(I,J,c3*K+6) is base_A + (I-1) + (J-1)*DI + (c3*K+6-1)*DI*DJ
›There was no strength reduction, induction variable analysis, or data flow analysis.
›They used pattern matching so that every time “K is increased by n (under the control of a DO), the index quantity is increased by c3 DI DJ n, giving the correct value” (Backus, Western Joint Computer Conference, 1957)
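As a concrete rendering of that address computation (this C version and the 3 x 4 x 10 extents are our illustration, not from the slides):

#include <stdio.h>

/* Offset of A(i,j,k) in a column-major array with 1-based subscripts and
   extents DI x DJ x DK, following the Fortran I formula above.
   The extents are made up for the example. */
enum { DI = 3, DJ = 4, DK = 10 };

int offset(int i, int j, int k) {
    return (i - 1) + (j - 1) * DI + (k - 1) * DI * DJ;
}

int main(void) {
    int c3 = 2, K = 1;
    /* A(I, J, c3*K + 6) with I = 1, J = 2: (1-1) + (2-1)*3 + (8-1)*3*4 = 87 */
    printf("%d\n", offset(1, 2, c3 * K + 6));
    return 0;
}

When K increases by n, the offset increases by c3*DI*DJ*n, which is exactly the constant increment that the compiler's pattern matching exploited.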

9 Fortran I Induction variable analysis
›“… it was not practical to track down and identify linear changes in subscripts resulting from assignment statements. Thus, the sole criterion … for efficient handling of array references was to be that the subscripts involved were being controlled by DO statements”

10 Fortran I Operator precedence in Fortran I
›A big deal.
›“The lack of operator priority (often called precedence …) in the IT language was the most frequent single cause of errors by the users of that compiler” Donald Knuth
›The Fortran I algorithm:
–Replace + and – with ))+(( and ))-((, respectively
–Replace * and / with )*( and )/(, respectively
–Add (( at the beginning of each expression and after each left parenthesis in the original expression
–Add )) at the end and before each right parenthesis
›“The resulting formula is properly parenthesized, believe it or not” D. Knuth
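A minimal sketch of that rewriting in C (the function name and buffer handling are ours; the rules are the ones listed above):

#include <stdio.h>
#include <string.h>

/* Fully parenthesize an expression with the Fortran I rules: operators become
   parenthesized forms, "((" is added at the start and after each '(', and
   "))" is added at the end and before each ')'.
   'out' must be large enough (each input character expands to at most 5). */
void parenthesize(const char *in, char *out) {
    strcpy(out, "((");
    for (const char *p = in; *p; p++) {
        switch (*p) {
        case '+': strcat(out, "))+(("); break;
        case '-': strcat(out, "))-(("); break;
        case '*': strcat(out, ")*(");   break;
        case '/': strcat(out, ")/(");   break;
        case '(': strcat(out, "(((");   break;  /* '(' followed by "((" */
        case ')': strcat(out, ")))");   break;  /* "))" before ')' */
        default:  strncat(out, p, 1);   break;  /* copy operands unchanged */
        }
    }
    strcat(out, "))");
}

int main(void) {
    char buf[256];
    parenthesize("a+b*c", buf);
    printf("%s\n", buf);   /* prints ((a))+((b)*(c)): * now binds tighter than + */
    return 0;
}

Once the string is fully parenthesized, the parser needs no notion of precedence at all, which is why the trick worked.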

11 Fortran I Register allocation
›Extremely complex
›Used to manage the three index registers of the 704
›“… much of it was carried along into Fortran II and is still in use in the 705/9/90. In many programs it still contributes to the production of better code than can be achieved on the new Fortran IV compiler.” Saul Rosen

12 Fortran I A difficult chore
›“… didn’t really work when it was delivered.”
›“At first people thought it would never be done.”
›“Then when it was in field test, with many bugs…, many thought it would never work.”
›“Fortran is now almost taken for granted, as if it were built into the computer hardware.” Saul Rosen, 1967

13 Fortran I The challenge then
›“It was our belief that if FORTRAN, during its first months, were to translate any reasonable “scientific” source program into an object program only half as fast as its hand coded counterpart, then acceptance of our system would be in serious danger.” John Backus
›How close did they come to this goal? Hard to tell.
›But we know they succeeded, and this conference is a clear testimony of their success.

14 Language processors

15 ›Languages can be
–Translated
›Compiler
›Source-to-source
–Interpreted
–Processed by a combination of these two approaches
›Translation: source program → compiler → target program → linker → executable (the executable reads input and produces output)

16 Language processors (Cont.)
›Translation (Cont.): source program (language A) → source-to-source translator → source program (language B) → compiler → target program → linker → executable (the executable reads input and produces output)

17 Language processors (Cont.)
›Translation (Cont.): source program → translator → byte code → virtual machine (which reads the program input and produces its output); the virtual machine may also invoke a just-in-time compiler that turns byte code into executable machine code

18 The inside of a compiler
Character stream → Lexical analyzer → Token stream → Syntax analyzer → Abstract syntax tree → Semantic analyzer → Abstract syntax tree → High level optimizer → Abstract syntax tree → Intermediate code generator → Intermediate representation → Low level optimizer → Intermediate representation → Code generator → Target machine code → Machine-specific optimizer → Target machine code
(All phases share the Symbol Table.)
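As a concrete illustration (our example, not from the slides), a statement such as a = b + 1 flows through these phases roughly as follows: the lexical analyzer turns the character stream into the tokens id(a), =, id(b), +, num(1); the syntax and semantic analyzers build an abstract syntax tree with = at the root, a on the left, and a + node over b and 1; the intermediate code generator might lower this to three-address code such as t1 = b + 1; a = t1; and the low-level and machine-specific phases map that onto target instructions and registers.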

19 The inside of a compiler
A simpler variant without the AST-based, high-level phases: Character stream → Lexical analyzer → Token stream → Syntax analyzer → Semantic analyzer → Intermediate representation → Low level optimizer → Intermediate representation → Code generator → Target machine code → Machine-specific optimizer → Target machine code

20 The inside of a compiler
The pipeline of slide 18 used as a source-to-source optimizer: the phases through the high level optimizer emit high level language again, and the later phases (intermediate code generator through machine-specific optimizer) are cut off.

21 The inside of a compiler
The pipeline of slide 18 used as the translator for an interpreter: the compiler produces byte code rather than target machine code, and the machine-specific back-end phases are cut off.

22 The inside of a compiler
The pipeline of slide 18 again, with the early phases grouped and labeled as the front end.

23 The inside of a compiler
The simpler pipeline of slide 19 again, with the early phases grouped and labeled as the front end.

24 The front end

25 ›It accepts the input language, including comments, pragmas, and macros
›Translates the text into data that is more easily manipulable by the compiler:
–An abstract syntax tree, or
–An intermediate representation
›Detects and reports syntactic and semantic errors
›It is built from a description of the source language:
–Formal for the syntax
–Informal (typically) for the semantics (although much has been done to formalize semantics)

26 Backus-Naur Form (BNF)
›Introduced by John Backus to formally describe IAL [J. W. Backus, The syntax and semantics of the proposed international algebraic language of the Zürich ACM-GAMM conference. ICIP, Paris, June 1959]
›Adopted to represent ALGOL 60
›Widely, but not universally, used to describe syntax today (with some extensions)
›A formal description enables automatic (or semi-automatic) generation of lexers and parsers

27 BNF of simple syntactic objects


29 BNF of a part of C

30 Example 1 of modified BNF (Modula 2)

31 Example 2 of Modified BNF (Also Modula 2)

32 Example 3 of modified BNF (Fortran 95)

33 Example 4 of modified BNF (Java)

34 Parsing
›Parsing is the process used to
–Determine whether a string of characters belongs to the language described by the BNF
–Create the parse tree (not to be confused with the syntax tree of the textbook, which these slides call the abstract syntax tree)
›The parse tree is seldom explicitly computed; the syntax analyzer typically generates an abstract syntax tree or intermediate code directly.

35 Example of parse tree (for the identifier A_1 and the integer 123)

36 Example of parse tree 1 * k + x / 5

37 Formal notion of a grammar
›The BNF description of syntax involves four concepts:
–Nonterminals: syntactic categories from which elements of the language can be derived. These are all the symbols on the left-hand side of the rules (e.g., <expression>).
–Terminals: the actual elements of the language, which are not expanded further. They do not appear on the left-hand side of any rule (e.g., +, A, …).
›Note: for practical reasons, parsing is typically done in two phases. First, objects such as <identifier> and <integer> are recognized by the lexical scanning phase. Then the rest of the language is parsed, treating the objects recognized by lexical scanning as terminals.
–Productions: the rules of the language, relating nonterminals and terminals.
–The root: a distinguished nonterminal that is the root of all parse trees for elements of the language.
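As a small illustration (ours, not from the slides): in a grammar with the productions <expr> ::= <expr> + <term> | <term> and <term> ::= id, the nonterminals are <expr> and <term>, the terminals are + and id, the two rules are the productions, and <expr> is the root.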

38 Formal notion of a grammar


40 Classes of grammars

41 Formal notion of a grammar

42 Multiple grammars, single language
›Different grammars can be equivalent, i.e., they generate the same language.
›Grammars can be modified to remove “undesirable properties”.
›For example, it is better for a grammar not to be ambiguous, that is, not to allow multiple parse trees for a given element of the language.

43 Example of ambiguous grammar (1/3)

44 Example of ambiguous grammar (2/3): 1 + 2 + 3 and 4 * 5 + 6, each shown with two distinct parse trees
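To see why this matters, take the second expression as a worked example: under the parse in which * binds first, 4 * 5 + 6 means (4 * 5) + 6 = 26, while under the parse in which + binds first it means 4 * (5 + 6) = 44. An ambiguous grammar therefore leaves the meaning of the program undetermined.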

45 Example of ambiguous grammar (3/3)
›The examples above also show that we need the right grammar to represent
–Associativity (Sec. 2.2.5)
–Precedence of operators (Sec. 2.2.6)

46 Left recursive grammar

47 A very simple compiler

48 Our first compiler

49 ›Thus, for the assignment A = -A + 5 * B / (B-1), the compiler should generate:
LIT A
LOAD
NEG
LIT 5
LIT B
LOAD
MUL
LIT B
LOAD
LIT 1
NEG
ADD
DIV
ADD
STORE
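Reading the listing in order: LIT A, LOAD, NEG computes -A; LIT 5, LIT B, LOAD, MUL computes 5*B; LIT B, LOAD, LIT 1, NEG, ADD computes B-1 (as B + (-1)); DIV then forms 5*B/(B-1); the final ADD adds this to -A; and STORE writes the result into the assignment target A. In other words, the instruction sequence is a postfix (operands-before-operator) traversal of the expression.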

50 Grammar

51 Recursive descent compiler
›There is only one variable: token, which has the value of the next character in the input line.
›The main program is as follows:
char token;
token = nextchar();   // nextchar() skips spaces
assignment();

52 Recursive descent compiler
identifier(){
  print("LIT");        // emit the LIT opcode
  print(token);        // in this toy language an identifier is a single character
  token = nextchar();
}
integer(){
  print("LIT");
  print(token);        // and an integer is a single digit
  token = nextchar();
}

53 Recursive descent compiler
// process a sequence of +/- terms
while (token == '-' || token == '+'){
  char t = token;
  token = nextchar();
  term();
  if (t == '-') emit("NEG");
  emit("ADD");
}
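A minimal sketch of the remaining procedures (ours, assuming the usual expression grammar: expression → [+|-] term { (+|-) term }, term → factor { (*|/) factor }, factor → identifier | integer | ( expression )):

expression(){
  char sign = 0;
  if (token == '+' || token == '-'){     // optional leading sign, as in -A + ...
    sign = token;
    token = nextchar();
  }
  term();
  if (sign == '-') emit("NEG");
  while (token == '-' || token == '+'){  // the loop shown on the previous slide
    char t = token;
    token = nextchar();
    term();
    if (t == '-') emit("NEG");
    emit("ADD");
  }
}

term(){
  factor();
  while (token == '*' || token == '/'){
    char t = token;
    token = nextchar();
    factor();
    emit(t == '*' ? "MUL" : "DIV");
  }
}

factor(){
  if (token == '('){
    token = nextchar();                  // consume '('
    expression();
    token = nextchar();                  // consume ')'
  } else if (token >= '0' && token <= '9')
    integer();
  else
    identifier();
}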

54 Recursive descent compiler


