Chapter 7 Evaluating the Instruction Set Architecture of H1: Part 1.


We will study the assembly code generated by a "dumb" C++ compiler. This will provide a sense of what constitutes a good architecture, and a better understanding of high-level languages such as C++ and Java.

A dumb compiler translates each statement in isolation from the statements that precede and follow it. It lacks all non-essential capabilities, except for a few that would minimally impact its complexity.

A dumb compiler translates

   x = 2;
   y = x;

to

   ldc  2
   st   x
   ld   x     ; this ld is unnecessary
   st   y

The compiler does not remember what it did for the first statement when it translates the second.

A smart compiler would translate

   x = 2;
   y = x;

to

   ldc  2
   st   x
   st   y

A smart compiler is able to generate more efficient code for the second statement by remembering what it did for the first.

Dumb compiler generates code left to right

The dumb compiler follows the order required by operator precedence and associativity. * has higher precedence than +; = is right associative.

   d = a + b*c;     // code for b*c first
   a = b = c = 5;   // code for c = 5 first

When the dumb compiler generates code for a binary operator, it accesses the operands in the order that yields better code.

The dumb compiler always follows the order imposed by parentheses:

   v = (w + x) + (y + z);   // (w + x) evaluated 1st, (y + z) 2nd, the outer + 3rd

Other orders might yield more efficient code.

The dumb compiler performs constant folding (compile-time evaluation of constant expressions). The slide contrasts run-time evaluation with compile-time evaluation.

Constant folding is performed only on constants at the beginning of an expression:

   y = 1 + 2 + x;   // constant folding
   y = 1 + x + 2;   // no constant folding
   y = x + 1 + 2;   // no constant folding
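
The run-time versus compile-time difference can be sketched as follows. This sketch is not taken from the slides: it assumes H1 has a direct add instruction (add L adds mem[L] to the ac), and the constant labels @c1 and @c2 are made-up compiler-generated names.

   ; y = x + 1 + 2;  -- no folding: the additions happen at run time
        ld   x
        add  @c1      ; assumed direct add instruction
        add  @c2
        st   y
   @c1: dw   1        ; compiler-generated constant (hypothetical label)
   @c2: dw   2

   ; y = 1 + 2 + x;  -- folding: 1 + 2 becomes 3 at compile time
        ldc  3
        add  x
        st   y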

Global variables in C++:
Are declared outside of a function definition.
Have scope from the point of declaration to the end of the file (excluding functions with an identically named local variable).
Default to an initial value of 0 if an initial value is not explicitly specified.
Are translated to a dw statement in assembler code.
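
The slide with the declarations of x and y is not reproduced in this transcript. Judging from the dw statements on the next slide, they were presumably declared like this:

   int x;       // no initializer, so x defaults to 0
   int y = 3;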

x and y in the preceding slide are translated to

   x: dw 0    ; 0 is the default value
   y: dw 3

Direct instructions are used to access global variables.

   x = 5;

is translated to

   ldc 5
   st  x     ; use direct instruction

Global variables are part of the machine language program. The slide pictures memory divided into the code, the globals, free memory (the heap), and the stack.

Compiler-generated names: the compiler generates its own labels for the program start, for constants, and for the string in the next slide.

A local variable can be referenced by name only in the function, or sub-block within the function, in which it is defined. There are two types of local variables: dynamic and static.
Static local variables are defined with a dw statement, are created at assembly time, and are accessed with direct instructions.
Dynamic local variables are created on the stack at run time and are accessed with relative instructions.

Output from preceding program

sla and slb (static locals) are translated to dw statements; the slide's listing is not reproduced in this transcript.

How do local static variables work?

   void foo()
   {
      static int x = 17;
      static int y;
      ...
   }

The slide sketches the assembled program (foo: ..., main: ... call foo ...), the dw directives that hold the static locals, and the stack. foo() can be called from anywhere inside main(). No code in foo() attempts to declare or initialize the static variable x; the initialization was done when the dw directive was assembled.
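
A minimal sketch of the resulting assembly, with hypothetical label names (the compiler would generate its own, as discussed on the next slide):

   foo:   ...          ; body of foo; x and y are accessed with direct instructions
          ret
   main:  ...
          call foo
          ...
          halt
   x_foo: dw 17        ; static int x = 17;  the 17 is placed here at assembly time
   y_foo: dw 0         ; static int y;       defaults to 0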

Why are the names of static local variables not carried over as is (as with global variables) to assembler code? See the next slide.

A dynamic local variable is allocated on the stack with a push or aloc instruction. push is used if the variable has to be initialized. aloc is used otherwise.

   void fc()
   {
      int dla;
      int dlb = 7;
      ...
      return;
   }
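
The slides do not show fc's translation at this point; a sketch consistent with the rules above (aloc for the uninitialized dla, push for the initialized dlb, dloc to destroy both before returning):

   fc:  aloc 1      ; create dla (uninitialized)
        ldc  7
        push        ; create dlb, initialized to 7
        ...
        dloc 2      ; destroy dlb and dla
        ret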

How Do You Return From a Function Call? In executing call foo, the first thing that happens is pc++ as part of fetch, so pc now points to the first instruction after the call instruction. Then sp-- occurs, opening space on the stack, and the new pc value (the return address) is written to mem[sp].

Microcode for call:

   ///////////////////////////////// CALL ///////////////////////////////////////
   sp = sp - 1;          / open space on the stack
   mar = sp; mdr = pc;   / set up stack for a write; data to be written is the
                         / address immediately following call
   pc = ir & xmask;      / prepare pc for jump to call address
   wr; goto fetch        / write to stack and go to next inst

The relative addresses of dla and dlb are 1 and 0, respectively. See the preceding slide.

The default initialization of global and static local variables (to 0) does not require additional time or space. Why? The 0 is simply placed in the dw directive at assembly time. But the default initialization of dynamic local variables would require extra time and space (an ld or ldc followed by a push, versus a single aloc).

x and y initially 0; z initially has garbage: why?

Default initializations are often not used, in which case the extra time and space required for the initialization of dynamic local variables would be wasted.

Relative addresses can change during the execution of a function. Changing relative addresses makes compiling a more difficult task—the compiler must keep track of the correct relative address. Changing relative addresses is also a frequent source of bugs in hand-written assembly code for the H1.

Relative address of x changes
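
The program the slide refers to is not shown; here is a hypothetical fragment illustrating the effect (x is a dynamic local created with push, and m and fg are made-up names):

        ldc  5
        push        ; create x; x is now at relative address 0
        ...
        ld   m
        push        ; push an argument for a call; x is now at relative address 1
        call fg
        dloc 1      ; remove the argument; x is back at relative address 0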

Call by value: the value (as opposed to the address) of the argument is passed to the function. An argument and its corresponding parameter are distinct variables, occupying different memory locations. Parameters are created by the calling function on function call and destroyed by the calling function on function return.

m is the argument; x is its parameter; y is a local variable. The slide pictures memory: the code, the global m containing 5, and fg's stack frame, which contains the parameter x (5), the return address, and the local y (7), with sp pointing to the top of the stack.
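
The C++ source for this walkthrough does not appear in the transcript; here is a version consistent with the slides that follow (m is a global, and fg is called from some other function):

   int m = 5;

   void fg(int x)
   {
      int y;
      y = 7;
      x = y + 2;
   }

   ...
   fg(m);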

Creating parameters: the parameter x comes into existence during the call of fg. x is created on the stack by pushing the value of its corresponding argument (m).

Start of the function call fg(m): push the value of the argument, thereby creating the parameter.

   ld   m
   push       ; creates x
   ...

Call the function (which pushes the return address on the stack):

   call fg

Allocate the uninitialized local variable with aloc:

   aloc 1

Execute the function body:

   ldc  7     ; y = 7;
   str  0
   ldc  2     ; x = y + 2;
   addr 0
   str  2     ; location of x

Deallocate the dynamic local variable:

   dloc 1

Return to the calling function (which pops the return address):

   ret

Deallocate the parameter:

   dloc 1

Back to initial configuration

Memory management

Parameter creation/destruction: the calling function creates and destroys parameters.

   ld   m
   push        ; create parameter x
   call fg
   dloc 1      ; destroy parameter x

Dynamic local variable creation/destruction: the called function creates and destroys dynamic local variables.

   aloc 1      ; create y
   ...
   dloc 1      ; destroy y

Push arguments right to left.

   void fh(int x, int y, int z)
   {
      …
   }

On entry into fh, x, y, and z have relative addresses 1, 2, and 3 (the return address is at relative address 0).
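
A sketch of a call fh(a, b, c), where a, b, and c are assumed to be global ints (the names are made up), following the right-to-left rule:

        ld   c
        push        ; creates parameter z (pushed first, so it ends up deepest)
        ld   b
        push        ; creates parameter y
        ld   a
        push        ; creates parameter x (pushed last)
        call fh     ; pushes the return address, which sits at relative address 0 inside fh
        dloc 3      ; on return, the caller destroys the three parameters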

The C++ return statement returns values via the ac register.

Using the value returned in the ac

Simple example:

   int x = 4;
   int z;

   int fa(int a)
   {
      int y = 3;
      a = a + y;
      return a;
   }

   void main( )
   {
      z = fa(x);
   }

The generated code (inside fa, y is at relative address 0, the return address at 1, and the parameter a at 2):

   fa:   ldc  3
         push         ; create y, initialized to 3
         ldr  2       ; a = a + y;
         addr 0
         str  2
         ldr  2       ; return a; (the value is returned in the ac)
         dloc 1       ; destroy y
         ret
   main: ld   x
         push         ; create parameter a
         call fa
         dloc 1       ; destroy parameter a
         st   z
         halt
   x:    dw   4
   z:    dw   0
         end  main

The slide also pictures the stack during the call (the parameter holding 4, the return address, and y holding 3, with sp at the top) and the globals x = 4 and z = 0.

Is it possible to replace relative instructions with direct instructions? Answer: sometimes

It is possible to replace relative instructions accessing x, y, z with direct instructions in this program.

Can use either an absolute address or its corresponding relative address:

   variable   relative address   absolute address
   x          2                  FFE
   y          3                  FFF
   z          0                  FFC
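
For example, with sp equal to FFC, the following two instructions would access the same word; the numeric-operand syntax for the direct instruction is invented for illustration and is not taken from the slides:

   ldr 2       ; x via its relative address: sp + 2 = FFE
   ld  0FFE    ; x via its absolute address (hypothetical operand syntax)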

Stack frame: the collection of parameters, return address and local variables allocated on the stack during a function call and execution. A stack frame is sometimes called an activation record or activation stack frame.

But for the following program, the absolute addresses of x, y, and z are different during the two calls of sum, while the relative addresses are the same.

Stack frame for the 1st call of sum

Stack frame for the 2nd call of sum

Another problem with H1: determining the absolute address of items on the stack. Determining the addresses of globals and static locals is easy, but it is difficult for dynamic locals.

Easy

Also easy

Relative address of x below is 1. What is its absolute address? It varies! Thus, &x must be computed at run time by converting its relative address to its corresponding absolute address.

The absolute address of x is different for the 1st and 2nd calls of fm, but the relative address is the same (1).

To convert relative address 1 to the corresponding absolute address, the code must read the sp register itself and add the relative address to it. The slide's sequence uses swap (which corrupts sp), a second swap (which restores sp), ldc 1, and a dw 0 word; the complete listing is not captured in this transcript.
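
A plausible completion of that sequence, under two assumptions that these slides do not confirm: that H1 has a direct add instruction (add L adds mem[L] to the ac), and that a temporary word (here called temp) holds sp's value:

         swap           ; corrupts sp: ac now holds sp's value
         st   temp      ; save sp's value
         swap           ; restores sp
         ldc  1         ; the relative address of x
         add  temp      ; ac = 1 + sp = absolute address of x (assumed add instruction)
         ...
   temp: dw   0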

It is dangerous to corrupt the sp register on most computers, even for a short period of time. An interrupt mechanism uses the stack, and interrupts can occur at any time. If the sp is in a corrupted state when an interrupt occurs, a system failure will result. H1, however, does not have an interrupt mechanism.

Dereferencing pointers: the code generated depends on the type of the variable pointed to. That is why you have to specify the type of the pointed-to item when declaring a pointer.

   int *p;     // p points to a one-word number
   long *q;    // q points to a two-word number

The assembly listing on the slide breaks off in this transcript; only the comment describing one instruction's operation, mem[ac] = mem[sp++], survives.
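
A hedged C++ illustration of why the pointed-to type matters to the code generator (the variable names are made up; per the declarations above, an int occupies one word on H1 and a long occupies two):

   int  i;
   long n;
   int  *p = &i;
   long *q = &n;

   *p = 3;      // the compiler must emit code that stores one word through p
   *q = 3;      // the compiler must emit code that stores two words through q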