1 Compiler Construction Overview

2 Today's Goals
Summary of the subjects we've covered
Perspectives and final remarks

3 High-level View: Definitions
Compiler: consumes source code and produces target code; usually translates high-level language programs into machine code
Interpreter: consumes an executable program and produces results; acts as a virtual machine for the input code

4 Why Study Compilers?
Compilers are important
  Enabling technology for languages and software development
  Allow programmers to focus on problem solving by hiding hardware complexity
  Responsible for good system performance
Compilers are useful
  Language processing is broadly applicable
Compilers are fun
  Combine theory and practice
  Overlap with other CS subjects
  Hard problems, engineering, and trade-offs
  You got a taste of this in the labs!

5 Structure of Compilers

6 The Front-end

7 Lexical Analysis
Scanner: maps the character stream into tokens
Automating scanner construction:
  Define tokens using regular expressions (REs)
  Construct an NFA (nondeterministic finite automaton) to recognize the REs
  Convert the NFA to a DFA through subset construction
  Minimize the DFA (set splitting)
  Build the scanner from the DFA
Tools: ANTLR, lex
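To make the pipeline above concrete, here is a minimal scanner sketch driven by a table of regular expressions. The token names and the tiny language are invented for illustration, and the sketch leans on Python's re module instead of building the NFA/DFA by hand; the token definitions play the same role as the specification handed to a generator such as lex or ANTLR.

```python
import re

# Hypothetical token specification: one regular expression per token class.
# A generator like lex would compile these into a single DFA; Python's `re`
# engine stands in for that machinery here.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),          # whitespace: recognized but not emitted
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def scan(source):
    """Map the character stream into a list of (token_class, lexeme) pairs."""
    tokens, pos = [], 0
    while pos < len(source):
        m = MASTER_RE.match(source, pos)
        if not m:
            raise SyntaxError(f"illegal character {source[pos]!r} at {pos}")
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens

print(scan("x1 = (y + 42)"))
# [('IDENT', 'x1'), ('OP', '='), ('LPAREN', '('), ('IDENT', 'y'),
#  ('OP', '+'), ('NUMBER', '42'), ('RPAREN', ')')]
```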

8 Syntax Analysis
Parsing a language described by a CFG (context-free grammar)
CFG theory: derivations, parse trees, grammar ambiguity
Parsing:
  Top-down parsing: recursive descent, table-driven LL(1)
  Bottom-up parsing: LR(1) shift-reduce parsing, operator precedence parsing

9 Top-down Predictive Parsing
Basic idea: build the parse tree from the root; given A → α | β, use the look-ahead symbol to choose between α and β
Recursive descent
Table-driven LL(1)
Left-recursion elimination
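As a small illustration of the look-ahead idea (not the parser built in the labs), here is a recursive-descent sketch for an invented expression grammar with left recursion already eliminated: one procedure per nonterminal, with the look-ahead token selecting the alternative.

```python
# Recursive-descent sketch for the assumed grammar
#   E -> T { '+' T }
#   T -> NUMBER | '(' E ')'
class Parser:
    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, expected):
        if self.peek() != expected:
            raise SyntaxError(f"expected {expected!r}, found {self.peek()!r}")
        self.pos += 1
        return expected

    def parse_E(self):              # E -> T { '+' T }
        value = self.parse_T()
        while self.peek() == "+":
            self.eat("+")
            value = ("+", value, self.parse_T())
        return value

    def parse_T(self):              # T -> NUMBER | '(' E ')'
        tok = self.peek()
        if tok == "(":              # look-ahead chooses the parenthesized form
            self.eat("(")
            value = self.parse_E()
            self.eat(")")
            return value
        if tok is not None and tok.isdigit():
            self.eat(tok)
            return int(tok)
        raise SyntaxError(f"unexpected token {tok!r}")

print(Parser(["1", "+", "(", "2", "+", "3", ")"]).parse_E())
# ('+', 1, ('+', 2, 3))
```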

10 Bottom-up Shift-Reduce Parsing
Builds a rightmost derivation in reverse
The key is finding the handle (the right-hand side of a production)
All active handles include the top of stack (TOS)
Shift input symbols until TOS is the right end of a handle
The language of handles is regular
Build a handle-recognizing DFA
The ACTION and GOTO tables encode that DFA
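To show how the ACTION and GOTO tables drive parsing, here is a hedged sketch of the generic shift-reduce driver. The tables were written by hand for a tiny assumed grammar, E → E + n | n, and only illustrate the mechanism; a real generator builds such tables automatically from the grammar.

```python
# Hand-built SLR tables (assumed, for  E -> E '+' 'n' | 'n')
# and the generic driver that interprets them. 's' = shift, 'r' = reduce.
ACTION = {
    (0, "n"): ("s", 1),
    (1, "+"): ("r", 2), (1, "$"): ("r", 2),       # reduce E -> n
    (2, "+"): ("s", 3), (2, "$"): ("acc", None),
    (3, "n"): ("s", 4),
    (4, "+"): ("r", 1), (4, "$"): ("r", 1),       # reduce E -> E + n
}
GOTO = {(0, "E"): 2}
PRODUCTIONS = {1: ("E", 3), 2: ("E", 1)}          # production -> (lhs, |rhs|)

def lr_parse(tokens):
    stack, tokens, i = [0], tokens + ["$"], 0     # stack holds DFA states
    while True:
        act = ACTION.get((stack[-1], tokens[i]))
        if act is None:
            raise SyntaxError(f"unexpected {tokens[i]!r}")
        kind, arg = act
        if kind == "s":                           # shift: push state, advance input
            stack.append(arg)
            i += 1
        elif kind == "r":                         # reduce: pop rhs, follow GOTO on lhs
            lhs, length = PRODUCTIONS[arg]
            del stack[len(stack) - length:]
            stack.append(GOTO[(stack[-1], lhs)])
            print(f"reduce by production {arg}: {lhs}")
        else:
            return True                           # accept

lr_parse(["n", "+", "n"])
# reduce by production 2: E
# reduce by production 1: E
```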

11 Semantic Analysis
Analyze context and semantics: types and other semantic checks
Attribute grammars: associate evaluation rules with grammar productions
Ad hoc approaches: build and use a symbol table
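As a hedged sketch of the ad hoc approach, the code below models a symbol table as a stack of scopes and performs one simple semantic check; the structure and names are invented for illustration.

```python
# Minimal scoped symbol table: a stack of dictionaries, one per lexical scope.
# Lookup walks from the innermost scope outward.
class SymbolTable:
    def __init__(self):
        self.scopes = [{}]                  # global scope

    def enter_scope(self):
        self.scopes.append({})

    def exit_scope(self):
        self.scopes.pop()

    def declare(self, name, type_):
        if name in self.scopes[-1]:
            raise TypeError(f"redeclaration of {name!r}")
        self.scopes[-1][name] = type_

    def lookup(self, name):
        for scope in reversed(self.scopes):
            if name in scope:
                return scope[name]
        raise TypeError(f"undeclared identifier {name!r}")

# One semantic check for "lhs := rhs": both declared, types agree.
def check_assignment(table, lhs, rhs):
    if table.lookup(lhs) != table.lookup(rhs):
        raise TypeError(f"type mismatch in {lhs} := {rhs}")

table = SymbolTable()
table.declare("x", "int")
table.enter_scope()
table.declare("y", "int")
check_assignment(table, "x", "y")           # OK: y found in inner scope, both int
table.exit_scope()
```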

12 Intermediate Representation

13 Intermediate Representation
The front end translates the program into an IR for further analysis and optimization
The IR encodes the compiler's knowledge of the program
Largely machine-independent, but closer to a standard machine model
AST: a high-level, tree-shaped IR
Linear IR: low-level, e.g. ILOC three-address code
  Assembly-level operations
  Exposes control flow and memory addressing
  Unlimited virtual registers
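To contrast the two IR levels, here is a sketch that lowers a small expression tree into linear three-address code over unlimited virtual registers. The AST shape and the ILOC-flavoured opcode names are assumptions for illustration, not an exact ILOC subset.

```python
# Lower an AST (nested tuples) into linear three-address code.
counter = 0
def new_reg():
    global counter
    counter += 1
    return f"r{counter}"

def lower(node, code):
    """Post-order walk: lower the children, then emit an op that combines them."""
    if isinstance(node, int):                       # literal
        r = new_reg()
        code.append(f"loadI {node} => {r}")
        return r
    if isinstance(node, str):                       # variable
        r = new_reg()
        code.append(f"load {node} => {r}")
        return r
    op, left, right = node                          # e.g. ('+', lhs, rhs)
    r1, r2, r = lower(left, code), lower(right, code), new_reg()
    opcode = {"+": "add", "*": "mult"}[op]
    code.append(f"{opcode} {r1}, {r2} => {r}")
    return r

# High-level AST for a + b * 4; the linear IR exposes the evaluation order.
code = []
lower(("+", "a", ("*", "b", 4)), code)
print("\n".join(code))
# load a => r1
# load b => r2
# loadI 4 => r3
# mult r2, r3 => r4
# add r1, r4 => r5
```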

14 Procedure Abstraction
The procedure is the key language construct for building large systems
Name spaces
Caller-callee interface (linkage convention): control transfer, context protection, parameter passing and return values
Run-time support for nested scopes: activation records, access links, displays
Inheritance and dynamic dispatch for OO languages: multiple inheritance, virtual method tables
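For the OO part, the following is a loose sketch of dynamic dispatch through a virtual method table, modeled with plain dictionaries; the layout is invented for illustration and ignores multiple inheritance.

```python
# Minimal vtable model (hypothetical layout): each "class" owns a table from
# method name to function; each object carries a pointer to its class's table.
def area_circle(obj):
    return 3.14159 * obj["r"] ** 2

def area_square(obj):
    return obj["side"] ** 2

CIRCLE_VTABLE = {"area": area_circle}
SQUARE_VTABLE = {"area": area_square}

def dispatch(obj, method, *args):
    # Dynamic dispatch: follow the object's vtable pointer, then index by method.
    return obj["vtable"][method](obj, *args)

shapes = [{"vtable": CIRCLE_VTABLE, "r": 1.0},
          {"vtable": SQUARE_VTABLE, "side": 2.0}]
print([round(dispatch(s, "area"), 2) for s in shapes])   # [3.14, 4.0]
```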

15 The Back-end

16 The Back-end
Instruction selection
  Mapping IR into assembly code
  Assumes a fixed storage mapping & code shape
  Combining operations, using address modes
Instruction scheduling
  Reordering operations to hide latencies
  Assumes a fixed program (set of operations)
  Changes demand for registers
Register allocation
  Deciding which values will reside in registers
  Changes the storage mapping, may add false sharing
  Concerns about placement of data & memory operations

17 Code Generation
Expressions: recursive tree walk on the AST, or direct integration with the parser
Assignment
Array reference
Boolean & relational values
If-then-else
Case
Loop
Procedure call
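As a hedged example of control-flow code shape, this sketch generates branching code for if-then-else with fresh labels; the instruction names loosely imitate an ILOC style and are assumptions, not the course's exact code shape.

```python
# Generate code for: if (cond_reg) then_code else else_code
label_counter = 0
def new_label():
    global label_counter
    label_counter += 1
    return f"L{label_counter}"

def gen_if(cond_reg, then_code, else_code):
    l_then, l_else, l_end = new_label(), new_label(), new_label()
    code = [f"cbr {cond_reg} -> {l_then}, {l_else}", f"{l_then}:"]
    code += then_code                      # fall through the then part ...
    code += [f"jmp -> {l_end}", f"{l_else}:"]
    code += else_code                      # ... or branch to the else part
    code += [f"{l_end}:"]
    return code

# Usage: if (r1) r2 = 1 else r2 = 2
print("\n".join(gen_if("r1", ["loadI 1 => r2"], ["loadI 2 => r2"])))
```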

18 Instruction Selection
Hand-coded tree-walk code generator
Automatic instruction selection via pattern matching:
  Peephole matching
  Tree-pattern matching through tiling
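Peephole matching can be illustrated with a tiny pass that slides a window over the instruction stream; the single pattern below (a load that immediately follows a store to the same address) and the tuple encoding are my own assumptions.

```python
# Peephole sketch over (op, src, dst) triples.
def peephole(code):
    out, i = [], 0
    while i < len(code):
        if (i + 1 < len(code)
                and code[i][0] == "store" and code[i + 1][0] == "load"
                and code[i][2] == code[i + 1][1]):        # same memory address
            store, load = code[i], code[i + 1]
            out.append(store)                             # keep the store
            out.append(("copy", store[1], load[2]))       # turn the load into a copy
            i += 2
        else:
            out.append(code[i])
            i += 1
    return out

# store r1 -> @a ; load @a -> r2
code = [("store", "r1", "@a"), ("load", "@a", "r2")]
print(peephole(code))   # [('store', 'r1', '@a'), ('copy', 'r1', 'r2')]
```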

19 Instruction Scheduling
The problem: given a code fragment for some target machine and the latency of each individual operation, reorder the operations to minimize execution time
Build the precedence graph
List scheduling
  Scheduling is NP-complete, but heuristics work well for basic blocks
  Forward list scheduling
  Backward list scheduling
Scheduling for larger regions: EBBs and cloning, trace scheduling
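The following is a minimal forward list-scheduling sketch under stated assumptions: a single-issue machine, invented latencies, and priority equal to the latency-weighted path length to the end of the block.

```python
import heapq

def list_schedule(succs, latency):
    nodes = list(latency)
    preds = {n: 0 for n in nodes}                 # in-degree in the precedence graph
    for n in nodes:
        for s in succs.get(n, []):
            preds[s] += 1

    # Priority: latency-weighted path length from the node to the end of the block.
    prio = {}
    def path(n):
        if n not in prio:
            prio[n] = latency[n] + max((path(s) for s in succs.get(n, [])), default=0)
        return prio[n]
    for n in nodes:
        path(n)

    ready = [(-prio[n], n) for n in nodes if preds[n] == 0]
    heapq.heapify(ready)
    pending, schedule, cycle = [], [], 0          # pending: (finish_cycle, node)
    while ready or pending:
        cycle += 1
        if ready:
            _, n = heapq.heappop(ready)           # issue the highest-priority ready op
            schedule.append((cycle, n))
            heapq.heappush(pending, (cycle + latency[n], n))
        # Retire ops whose results are available next cycle; release their successors.
        while pending and pending[0][0] <= cycle + 1:
            _, done = heapq.heappop(pending)
            for s in succs.get(done, []):
                preds[s] -= 1
                if preds[s] == 0:
                    heapq.heappush(ready, (-prio[s], s))
    return schedule

# a and b both feed c; a has a 3-cycle latency, so b issues while a completes.
succs = {"a": ["c"], "b": ["c"], "c": []}
latency = {"a": 3, "b": 1, "c": 1}
print(list_schedule(succs, latency))   # [(1, 'a'), (2, 'b'), (4, 'c')]
```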

20 Register Allocation
Local register allocation: top-down, bottom-up
Global register allocation:
  Find live ranges
  Build the interference graph G_I
  Construct a k-coloring of the interference graph
  Map colors onto physical registers

21 Web-based Live Ranges
Connect common defs and uses into webs
Found by solving the reaching-definitions data-flow problem

22 Interference Graph
The interference graph G_I:
  Nodes in G_I represent live ranges
  Edges in G_I represent individual interferences: for x, y ∈ G_I, the edge (x, y) is in G_I iff x and y interfere
A k-coloring of G_I can be mapped into an allocation to k registers

23 Key Observation on Coloring
Any vertex n that has fewer than k neighbors in the interference graph (n° < k) can always be colored!
Removing such nodes from G_I gives G_I'; a coloring of G_I' extends to a coloring of G_I

24 Chaitin's Algorithm
1. While there are vertices with fewer than k neighbors in G_I:
   Pick any vertex n such that n° < k and put it on the stack
   Remove that vertex and all edges incident to it from G_I (this lowers the degree of n's neighbors)
2. If G_I is non-empty (all vertices have k or more neighbors), then:
   Pick a vertex n (using some heuristic) and spill the live range associated with n
   Remove vertex n from G_I, along with all edges incident to it, and put it on the stack
   If this causes some vertex in G_I to have fewer than k neighbors, go to step 1; otherwise, repeat step 2
3. If nothing was spilled, successively pop vertices off the stack and color each in the lowest color not used by a neighbor; otherwise, insert spill code, recompute G_I, and start again from step 1

25 Briggs' Improvement
A node can still be colored even with k or more neighbors if some of its neighbors share a color
1. While there are vertices with fewer than k neighbors in G_I:
   Pick any vertex n such that n° < k and put it on the stack
   Remove that vertex and all edges incident to it from G_I (this may create vertices with fewer than k neighbors)
2. If G_I is non-empty (all vertices have k or more neighbors), then:
   Pick a vertex n (using some heuristic), push n on the stack, and remove n from G_I, along with all edges incident to it
   If this causes some vertex in G_I to have fewer than k neighbors, go to step 1; otherwise, repeat step 2
3. Successively pop vertices off the stack and color each in the lowest color not used by a neighbor
4. If some vertex cannot be colored, pick an uncolored vertex to spill, spill it, and restart at step 1
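Below is a compact sketch of the simplify/select discipline both algorithms share, in the optimistic Briggs style: push nodes on a stack while removing them from the graph, then pop and color, reporting nodes that would have to be spilled. Spill-code insertion and rebuilding G_I are omitted, and the example graph is invented.

```python
# Briggs-style optimistic coloring sketch.
def color_graph(edges, nodes, k):
    adj = {n: set() for n in nodes}
    for x, y in edges:
        adj[x].add(y)
        adj[y].add(x)

    # Simplify: repeatedly remove the lowest-degree node and push it on the stack
    # (optimistically, even if its degree is >= k).
    work, stack = {n: set(adj[n]) for n in nodes}, []
    while work:
        n = min(work, key=lambda v: len(work[v]))
        stack.append(n)
        for m in work[n]:
            work[m].discard(n)
        del work[n]

    # Select: pop and color with the lowest color unused by already-colored neighbors.
    color, spilled = {}, []
    while stack:
        n = stack.pop()
        used = {color[m] for m in adj[n] if m in color}
        free = [c for c in range(k) if c not in used]
        if free:
            color[n] = free[0]
        else:
            spilled.append(n)        # would trigger spill code and a rebuild of G_I
    return color, spilled

# Live ranges a..d; a interferes with everything, and b, c, d form a triangle,
# so the graph needs 4 colors: with k = 3, exactly one node is spilled.
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("c", "d"), ("b", "d")]
print(color_graph(edges, ["a", "b", "c", "d"], k=3))
# ({'d': 0, 'c': 1, 'b': 2}, ['a'])
```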

26 The Middle-end: Optimizer

27 Principles of Compiler Optimization
Safety: does applying the transformation change the results of executing the code?
Profitability: is there a reasonable expectation that applying the transformation will improve the code?
Opportunity: can we efficiently and frequently find places to apply the transformation?
An optimizing compiler combines program analysis and program transformation

28 Program Analysis
Control-flow analysis
Data-flow analysis

29 Control Flow Analysis
Basic blocks
Control flow graph (CFG)
Dominator tree
Natural loops
Dominance frontiers: the join points where SSA construction inserts Φ-nodes
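Dominators can be computed with a simple iterative data-flow sketch, shown below on an invented diamond-shaped CFG: Dom(entry) = {entry}, and Dom(n) = {n} ∪ ⋂ Dom(p) over the predecessors p of n, iterated until the sets stop changing.

```python
def dominators(preds, entry):
    nodes = list(preds)
    dom = {n: set(nodes) for n in nodes}     # start with "dominated by everything"
    dom[entry] = {entry}
    changed = True
    while changed:
        changed = False
        for n in nodes:
            if n == entry:
                continue
            new = {n} | set.intersection(*(dom[p] for p in preds[n]))
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# Diamond CFG: A -> B, A -> C, B -> D, C -> D
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
print(dominators(preds, "A"))
# {'A': {'A'}, 'B': {'A', 'B'}, 'C': {'A', 'C'}, 'D': {'A', 'D'}}  (set order may vary)
```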

30 Data Flow Analysis
"Compile-time reasoning about the runtime flow of values"
Represent effects of each basic block
Propagate facts around the control flow graph

31 DFA: The Big Picture
Transfer functions
  Forward analysis: compute OUT(B) in terms of IN(B) (available expressions, reaching definitions)
  Backward analysis: compute IN(B) in terms of OUT(B) (variable liveness, very busy expressions)
Meet functions for join points
  Forward analysis: combine OUT(p) of the predecessors p to form IN(B)
  Backward analysis: combine IN(s) of the successors s to form OUT(B)
Set up a system of equations relating program properties at different program points in terms of the properties at "nearby" program points

32 Available Expressions
For a basic block b:
  IN(b): expressions available at b's entry
  OUT(b): expressions available at b's exit
Local sets:
  def(b): expressions defined in b and still available on exit
  killed(b): expressions killed in b (an expression is killed in b if one of its operands is assigned in b)
Transfer function: OUT(b) = def(b) ∪ (IN(b) – killed(b))
Meet function: IN(b) = ∩ OUT(p) over all predecessors p of b
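Here is a minimal iterative solver for the AVAIL equations above, run on an invented diamond CFG with two expressions; the initialization (entry empty, all other blocks optimistic) is the standard choice for a forward ∩ problem.

```python
# Iterative AVAIL:  OUT(b) = def(b) | (IN(b) - killed(b)),
#                   IN(b)  = intersection of OUT(p) over predecessors p.
def available_expressions(preds, DEF, KILLED, entry, universe):
    blocks = list(preds)
    IN = {b: set() if b == entry else set(universe) for b in blocks}
    OUT = {b: DEF[b] | (IN[b] - KILLED[b]) for b in blocks}
    changed = True
    while changed:
        changed = False
        for b in blocks:
            if b != entry and preds[b]:
                IN[b] = set.intersection(*(OUT[p] for p in preds[b]))
            new_out = DEF[b] | (IN[b] - KILLED[b])
            if new_out != OUT[b]:
                OUT[b], changed = new_out, True
    return IN, OUT

# Diamond CFG: entry A, then B and C, joining at D.
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
DEF = {"A": {"a+b"}, "B": {"c+d"}, "C": set(), "D": set()}
KILLED = {"A": set(), "B": set(), "C": {"a+b"}, "D": set()}
IN, OUT = available_expressions(preds, DEF, KILLED, "A", {"a+b", "c+d"})
print(IN["D"])   # set(): a+b is killed along the C path, so nothing reaches D
```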

33 More Data Flow Problems
The AVAIL equations generalize to other data-flow problems, such as reaching definitions and liveness:

  direction   meet = ∪               meet = ∩
  forward     reaching definitions   available expressions
  backward    variable liveness      very busy expressions

34 Compiler Optimization
Local optimization: DAG-based CSE, value numbering
Global optimization, enabled by DFA:
  Global CSE (AVAIL)
  Constant propagation (def-use chains)
  Dead code elimination (use-def chains)
Advanced topic: SSA
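Local value numbering can be sketched in a few lines: walk a basic block of three-address tuples, give each distinct value a number, and replace a recomputation of an already-seen (op, vn, vn) key with a copy. The tuple encoding is invented for illustration.

```python
def value_number(block):
    vn, table, out = {}, {}, []            # var -> VN; (op, vn1, vn2) -> (VN, var)
    next_vn = 0

    def number_of(x):
        nonlocal next_vn
        if x not in vn:
            vn[x] = next_vn
            next_vn += 1
        return vn[x]

    for dst, op, a, b in block:
        key = (op, number_of(a), number_of(b))
        if key in table:                   # redundant computation: reuse the value
            prev_vn, prev_var = table[key]
            vn[dst] = prev_vn
            out.append((dst, "copy", prev_var, None))
        else:
            vn[dst] = next_vn
            next_vn += 1
            table[key] = (vn[dst], dst)
            out.append((dst, op, a, b))
    return out

block = [("t1", "+", "a", "b"),
         ("t2", "+", "a", "b"),            # same value as t1
         ("t3", "*", "t2", "c")]
print(value_number(block))
# [('t1', '+', 'a', 'b'), ('t2', 'copy', 't1', None), ('t3', '*', 't2', 'c')]
```

A fuller implementation would also canonicalize commutative operands and propagate the copies into later uses; this sketch only detects the redundancy.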

35 Perspective
Front end: essentially a solved problem
Middle end: domain-specific languages
Back end: new architectures
Verifying compilers, reliability, security

36 Interesting Stuff We Skipped
Interprocedural analysis
Alias (pointer) analysis
Garbage collection
Check the literature references in EaC

37 How Will You Use the Knowledge?
As an informed programmer
As an informed small-language designer
As an informed hardware engineer
As a compiler writer

38 Informed Programmer
"Knowledge is power": the compiler is no longer a black box; you know how it works
Implications:
  Use of language features: avoid those that can cause problems; give the compiler hints
  Code optimization: don't optimize prematurely; don't write needlessly complicated code
  Debugging: understand the compiled code

39 Solving Problems the Compiler Way
Solve problems from a language/compiler perspective
Implement a simple language
Extend an existing language

40 Informed Hardware Engineer
Compiler support for programmable hardware: pervasive computing, new back-ends for new processors
Designing new architectures: what the compiler can and cannot do; how to expose hardware resources and use the compiler to manage them

41 Compiler Writer
Make a living by writing compilers!
Theory, algorithms, engineering
We have built: a scanner, a parser, an AST builder, a type checker, a register allocator, an instruction scheduler
We have used compiler generation tools: ANTLR, lex, yacc, etc.
You are on track to jump into compiler development!

42 Final Remarks
Compiler construction: theory and implementation
How to use what you learned in this course:
  As an informed programmer
  As an informed small-language designer
  As an informed hardware engineer
  As a compiler writer
… and live happily ever after

