Topic #10: Optimization
EE 456 – Compiling Techniques
Prof. Carl Sable
Fall 2003


Code Optimization and Phases

Optimization

- Code produced by standard algorithms can often be made to run faster, take less space, or both
- These improvements are achieved through transformations called optimizations
- Compilers that apply these transformations are called optimizing compilers
- It is especially important to optimize frequently executed parts of a program
  – The 90/10 rule
  – Profiling can be helpful, but compilers do not typically have sample input data
  – Inner loops tend to be good candidates for optimization

Criteria for Transformations

- A transformation must preserve the meaning of a program
  – It cannot change the output produced for any input
  – It cannot introduce an error
- Transformations should, on average, speed up programs
- Transformations should be worth the effort

Beyond Optimizing Compilers

Improvements can really be made at various phases:

- Source code:
  – Algorithmic transformations can produce spectacular improvements
  – Profiling can be helpful to focus a programmer's attention on important code
- Intermediate code:
  – The compiler can improve loops, procedure calls, and address calculations
  – Typically only optimizing compilers include this phase
- Target code:
  – Compilers can use registers efficiently
  – Peephole transformations can be applied

Peephole Optimizations

- A simple technique for locally improving target code (can also be applied to intermediate code)
- The peephole is a small, moving window on the target program
- Each improvement replaces the instructions of the peephole with a shorter or faster sequence
- Each improvement may create opportunities for additional improvements
- Repeated passes may be necessary

Redundant-Instruction Elimination

- Redundant loads and stores:
    (1) MOV R0, a
    (2) MOV a, R0
- Unreachable code:
    #define debug 0
    if (debug) { /* print debugging information */ }
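The store/load pair above can be removed by a single peephole pass. A minimal sketch, assuming a toy two-operand instruction encoding (the `struct Ins` type and its field names are invented for illustration; a real pass would also have to check that the second instruction is not a jump target):

```c
#include <assert.h>
#include <string.h>

/* Toy instruction: "MOV o1, o2" stored as two operand strings. */
struct Ins { char o1[8]; char o2[8]; };

/* If an instruction's operands are the swap of the instruction just
 * emitted (e.g. MOV R0,a followed by MOV a,R0), the second one merely
 * reloads a value that is already in place, so it is dropped.
 * Compacts the code in place and returns the new instruction count. */
int peephole_redundant_loads(struct Ins *code, int n)
{
    int out = 0;
    for (int i = 0; i < n; i++) {
        if (out > 0 &&
            strcmp(code[out - 1].o1, code[i].o2) == 0 &&
            strcmp(code[out - 1].o2, code[i].o1) == 0)
            continue;                  /* skip the redundant instruction */
        code[out++] = code[i];
    }
    return out;
}
```

Because the window only spans two adjacent instructions, the pass is linear in the code size, which is why peephole optimization is cheap enough to run repeatedly.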

Flow-of-Control Optimizations

- Jumps to jumps, jumps to conditional jumps, and conditional jumps to jumps are not necessary
- Jumps to jumps example — the following replacement is valid:
    goto L1
    ...
    L1: goto L2
  becomes:
    goto L2
    ...
    L1: goto L2
- If there are no other jumps to L1 and L1 is preceded by an unconditional jump, the statement at L1 can be eliminated
- Jumps to conditional jumps and conditional jumps to jumps lead to similar transformations
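One way to implement the jumps-to-jumps rewrite is to model each instruction only by its jump target and repeatedly retarget any jump that lands on another unconditional jump. A sketch under that hypothetical array encoding (index of the target instruction, or a sentinel for non-jumps):

```c
#include <assert.h>

#define NOT_A_JUMP (-1)

/* target[i] is the instruction index that instruction i jumps to,
 * or NOT_A_JUMP.  Any jump whose destination is itself a jump is
 * retargeted to the final destination.  The hop counter bounds the
 * walk so that a cycle of jumps (an infinite loop in the program)
 * cannot hang the optimizer. */
void collapse_jump_chains(int *target, int n)
{
    for (int i = 0; i < n; i++) {
        int hops = 0;
        while (target[i] != NOT_A_JUMP &&
               target[target[i]] != NOT_A_JUMP &&
               hops++ < n)
            target[i] = target[target[i]];
    }
}
```

After this pass, a label like L1 that is no longer the target of any jump can be deleted by a separate cleanup, as the slide notes.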

Other Peephole Optimizations

- A few algebraic identities that occur frequently (such as x := x + 0 or x := x * 1) can be eliminated
- Reduction in strength replaces expensive operations with cheaper ones
  – Calculating x * x is likely much cheaper than computing x^2 with an exponentiation routine
  – It may be cheaper to implement x * 5 as x * 4 + x
- Some machines may have hardware instructions to implement certain specific operations efficiently
  – For example, auto-increment may be cheaper than a straightforward x := x + 1
  – Auto-increment and auto-decrement are also useful when pushing onto or popping off of a stack
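The two strength-reduction identities above can be written out directly in C. A small sketch (whether the shift-and-add form actually wins depends on the target machine, as the slide says):

```c
#include <assert.h>

/* x*x in place of calling an exponentiation routine such as pow(x, 2). */
long square(long x)
{
    return x * x;
}

/* x*4 + x in place of x*5; the multiply by 4 becomes a left shift.
 * Profitable only where multiplication is slower than shift-and-add. */
long times5(long x)
{
    return (x << 2) + x;
}
```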

Optimizing Intermediate Code

- This phase is generally only included in optimizing compilers
- It offers the following advantages:
  – Operations needed to implement high-level constructs are made explicit (i.e. address calculations)
  – Intermediate code is independent of the target machine; the code generator can be replaced for a different machine
- We are assuming the intermediate code uses three-address instructions

Quicksort in C

    extern int a[];   /* global array being partitioned */

    void quicksort(int m, int n)
    {
        int i, j, v, x;
        if (n <= m) return;
        /* Start of partition code */
        i = m - 1; j = n; v = a[n];
        while (1) {
            do i = i + 1; while (a[i] < v);
            do j = j - 1; while (a[j] > v);
            if (i >= j) break;
            x = a[i]; a[i] = a[j]; a[j] = x;
        }
        x = a[i]; a[i] = a[n]; a[n] = x;
        /* End of partition code */
        quicksort(m, j);
        quicksort(i + 1, n);
    }

Partition in Three-Address Code

     (1) i := m-1
     (2) j := n
     (3) t1 := 4*n
     (4) v := a[t1]
     (5) i := i+1
     (6) t2 := 4*i
     (7) t3 := a[t2]
     (8) if t3 < v goto (5)
     (9) j := j-1
    (10) t4 := 4*j
    (11) t5 := a[t4]
    (12) if t5 > v goto (9)
    (13) if i >= j goto (23)
    (14) t6 := 4*i
    (15) x := a[t6]
    (16) t7 := 4*i
    (17) t8 := 4*j
    (18) t9 := a[t8]
    (19) a[t7] := t9
    (20) t10 := 4*j
    (21) a[t10] := x
    (22) goto (5)
    (23) t11 := 4*i
    (24) x := a[t11]
    (25) t12 := 4*i
    (26) t13 := 4*n
    (27) t14 := a[t13]
    (28) a[t12] := t14
    (29) t15 := 4*n
    (30) a[t15] := x

Partition Flow Graph

Local vs. Global Transformations

- Local transformations involve statements within a single basic block
- All other transformations are called global transformations
- Local transformations are generally performed first
- Many types of transformations can be performed either locally or globally

Common Subexpressions

- E is a common subexpression if:
  – E was previously computed
  – The variables in E have not changed since the previous computation
- Recomputing E can be avoided if the previously computed value is still available
- DAGs are useful for detecting common subexpressions
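Within one basic block, common subexpressions can be detected with a table keyed by (operator, operand, operand). A minimal sketch, assuming a hypothetical fixed-size encoding; note it omits the invalidation the slide requires (entries must be dropped when an operand is reassigned), so it is only the lookup half of the technique:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

#define MAXE 32

/* One available expression: its "op a b" key and the temp holding it. */
struct Expr { char key[32]; char temp[8]; };
static struct Expr table[MAXE];
static int nexpr = 0;

/* For a statement "temp := a op b": returns the temp that already
 * holds (a op b) if the expression is available, otherwise records
 * the new expression and returns NULL (meaning: compute it). */
const char *lookup_or_add(const char *op, const char *a,
                          const char *b, const char *temp)
{
    char key[32];
    snprintf(key, sizeof key, "%s %s %s", op, a, b);
    for (int i = 0; i < nexpr; i++)
        if (strcmp(table[i].key, key) == 0)
            return table[i].temp;      /* common subexpression found */
    snprintf(table[nexpr].key, sizeof table[nexpr].key, "%s", key);
    snprintf(table[nexpr].temp, sizeof table[nexpr].temp, "%s", temp);
    nexpr++;
    return NULL;
}
```

Running the statements of block B5 through such a table is what turns t7 := 4*i into a reuse of t6, as the next slide shows.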

Local Common Subexpressions

Before:
    t6 := 4*i
    x := a[t6]
    t7 := 4*i
    t8 := 4*j
    t9 := a[t8]
    a[t7] := t9
    t10 := 4*j
    a[t10] := x
    goto B2

After:
    t6 := 4*i
    x := a[t6]
    t8 := 4*j
    t9 := a[t8]
    a[t6] := t9
    a[t8] := x
    goto B2

Global Common Subexpressions

Copy Propagation

- Assignments of the form f := g are called copy statements (or copies)
- The idea behind copy propagation is to use g for f whenever possible after such a statement
- For example, applied to block B5 of the previous flow graph, we obtain:
    x := t3
    a[t2] := t5
    a[t4] := t3
    goto B2
- Copy propagation often turns the copy statement into "dead code"
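The substitution step can be sketched as a small copy table: after seeing f := g, later uses of f are answered with g. The encoding below is hypothetical and deliberately incomplete (a real pass must invalidate an entry when f or g is reassigned):

```c
#include <assert.h>
#include <string.h>

#define MAXC 16

/* Active copies: copy_lhs[i] := copy_rhs[i]. */
static char copy_lhs[MAXC][8], copy_rhs[MAXC][8];
static int ncopies = 0;

/* Record a copy statement f := g. */
void record_copy(const char *f, const char *g)
{
    strcpy(copy_lhs[ncopies], f);
    strcpy(copy_rhs[ncopies], g);
    ncopies++;
}

/* Return the name to use in place of a given operand: the copied-from
 * name if one is active, otherwise the operand itself. */
const char *propagate(const char *name)
{
    for (int i = ncopies - 1; i >= 0; i--)
        if (strcmp(copy_lhs[i], name) == 0)
            return copy_rhs[i];
    return name;
}
```

Applied to block B5, recording x := t3 and then propagating through the later statements yields exactly the rewritten block on the slide.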

Dead-Code Elimination

- Dead code includes code that can never be reached and code that computes a value that never gets used
- Consider: if (debug) print ...
  – It can sometimes be deduced at compile time that the value of an expression is constant
  – The constant can then be used in place of the expression (constant folding)
  – Suppose a previous statement assigns debug := false and the value never changes
  – Then the print statement becomes unreachable and can be eliminated
- Consider the example from the previous slide
  – The value of x computed by the copy statement never gets used after copy propagation
  – The copy statement is now dead code and can be eliminated
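The debug example can be made concrete in C. The two functions below are observably equivalent once the condition folds to the constant 0; typical optimizing compilers perform this rewrite automatically:

```c
#include <assert.h>

enum { debug = 0 };   /* known constant: the guard below always fails */

int with_dead_code(int x)
{
    if (debug)          /* condition folds to 0 ... */
        x = x + 1000;   /* ... so this assignment is unreachable */
    return x * 2;
}

/* What dead-code elimination leaves behind. */
int after_elimination(int x)
{
    return x * 2;
}
```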

Loop Optimizations (1)

- The running time of a program may be improved if we decrease the number of statements in an inner loop, even at the cost of increasing the number of statements outside the loop
- Code motion moves code outside of a loop
  – For example, consider:
      while (i <= limit-2) ...
  – The result of code motion would be:
      t = limit - 2
      while (i <= t) ...
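The slide's code-motion example, written out as two equivalent C functions (limit - 2 is loop-invariant, so it can be computed once before the loop):

```c
#include <assert.h>

/* Before code motion: limit - 2 is re-evaluated on every loop test. */
int count_before(int limit)
{
    int i = 0, n = 0;
    while (i <= limit - 2) {
        n++;
        i++;
    }
    return n;
}

/* After code motion: the invariant expression is hoisted into t. */
int count_after(int limit)
{
    int t = limit - 2;   /* computed once, outside the loop */
    int i = 0, n = 0;
    while (i <= t) {
        n++;
        i++;
    }
    return n;
}
```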

Loop Optimizations (2)

- Induction variables: variables that remain in "lock-step"
  – For example, in block B3 of the previous flow graph, j and t4 are induction variables
  – Induction-variable elimination can sometimes eliminate all but one of a set of induction variables
- Reduction in strength replaces a more expensive operation with a less expensive one
  – For example, in block B3, t4 decreases by four with every iteration
  – If initialized correctly, the multiplication can be replaced with a subtraction
  – Often, applying reduction in strength leads to induction-variable elimination
- Methods exist to recognize induction variables and apply the appropriate transformations automatically
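The t4 = 4*j relationship from block B3 can be sketched as two equivalent C functions: once t4 is initialized to 4*n, the per-iteration multiplication becomes a subtraction of 4 (the loop body here is a made-up stand-in that just accumulates the values):

```c
#include <assert.h>

/* Before: 4*j is recomputed on every iteration. */
int sum_with_multiply(int n)
{
    int s = 0;
    for (int j = n; j > 0; j--)
        s += 4 * j;
    return s;
}

/* After strength reduction: t4 stays in lock-step with j, so the
 * multiply is replaced by a cheaper subtraction each iteration. */
int sum_with_subtract(int n)
{
    int s = 0;
    int t4 = 4 * n;        /* initialized once to 4*j's first value */
    for (int j = n; j > 0; j--) {
        s += t4;
        t4 -= 4;           /* strength-reduced update */
    }
    return s;
}
```

If j itself is used only to control the loop, a later induction-variable elimination pass could rewrite the loop test in terms of t4 and remove j entirely, which is the interplay the slide describes.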

Loop Optimization Example

Implementing a Code Optimizer

- The organization consists of control-flow analysis, then data-flow analysis, and then transformations
- The code generator is applied to the transformed intermediate code
- Details of code optimization are beyond the scope of this course