Memory Management
Modern Programming Languages, 2nd ed., Chapter Fourteen

Dynamic Memory Allocation
- Lots of things need memory at runtime:
  – Activation records
  – Objects
  – Explicit allocations: new, malloc, etc.
  – Implicit allocations: strings, file buffers, arrays with dynamically varying size, etc.
- Language systems provide an important hidden player: runtime memory management

Outline
- 14.2 Memory model using Java arrays
- 14.3 Stacks
- 14.4 Heaps
- 14.5 Current heap links
- 14.6 Garbage collection

Memory Model
- For now, assume that the OS grants each running program one or more fixed-size regions of memory for dynamic allocation
- We will model these regions as Java arrays
  – To see examples of memory management code
  – And, for practice with Java

Declaring An Array
- A Java array declaration:

    int[] a = null;

- Array types are reference types: an array is really an object, with a little special syntax
- The variable a above is initialized to null
- It can hold a reference to an array of int values, but does not yet

Creating An Array
- Use new to create an array object:

    int[] a = null;
    a = new int[100];

- We could have done it with one declaration statement, like this:

    int[] a = new int[100];

Using An Array
- Use a[i] to refer to an element (as lvalue or rvalue): a is an array reference expression and i is an int expression
- Use a.length to access the length
- Array indexes are 0..(a.length-1)

    int i = 0;
    while (i < a.length) {
      a[i] = 5;
      i++;
    }

Memory Managers In Java

    public class MemoryManager {
      private int[] memory;

      /**
       * MemoryManager constructor.
       * @param initialMemory int[] of memory to manage
       */
      public MemoryManager(int[] initialMemory) {
        memory = initialMemory;
      }
      …
    }

We will show Java implementations this way. The initialMemory array is the memory region provided by the operating system.

Outline
- 14.2 Memory model using Java arrays
- 14.3 Stacks
- 14.4 Heaps
- 14.5 Current heap links
- 14.6 Garbage collection

Stacks Of Activation Records
- For almost all languages, activation records must be allocated dynamically
- For many, it suffices to allocate on call and deallocate on return
- This produces a stack of activation records: push on call, pop on return
- A simple memory management problem

A Stack Illustration
An empty stack of 8 words. The stack will grow down, from high addresses to lower addresses. A reserved memory location (perhaps a register) records the address of the lowest allocated word.

The program calls m.push(3), which returns 5: the address of the first of the 3 words allocated for an activation record. Memory management uses an extra word to record the previous value of top.

The program calls m.push(2), which returns 2: the address of the first of the 2 words allocated for an activation record. The stack is now full: there is not room even for m.push(1). For m.pop(), just do top = memory[top] to return to the previous configuration.

A Java Stack Implementation

    public class StackManager {
      private int[] memory;  // the memory we manage
      private int top;       // index of top stack block

      /**
       * StackManager constructor.
       * @param initialMemory int[] of memory to manage
       */
      public StackManager(int[] initialMemory) {
        memory = initialMemory;
        top = memory.length;
      }

      /**
       * Allocate a block and return its address.
       * @param requestSize int size of block, > 0
       * @return block address
       * @throws StackOverflowError if out of stack space
       */
      public int push(int requestSize) {
        int oldtop = top;
        top -= (requestSize+1);  // extra word for oldtop
        if (top<0) throw new StackOverflowError();
        memory[top] = oldtop;
        return top+1;
      }

The throw statement and exception handling are introduced in Chapter 17.

      /**
       * Pop the top stack frame. This works only if the
       * stack is not empty.
       */
      public void pop() {
        top = memory[top];
      }
    }
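
A minimal usage sketch (added here, not from the slides), assuming the StackManager class above is compiled as shown; it replays the push/pop sequence from the stack illustration on an 8-word region:

    public class StackDemo {
      public static void main(String[] args) {
        StackManager m = new StackManager(new int[8]);
        int a = m.push(3);  // returns 5: first of 3 words for an activation record
        int b = m.push(2);  // returns 2: first of 2 words; the stack is now full
        System.out.println(a + " " + b);  // prints 5 2
        m.pop();            // discard the 2-word record
        m.pop();            // discard the 3-word record; the stack is empty again
      }
    }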

Outline
- 14.2 Memory model using Java arrays
- 14.3 Stacks
- 14.4 Heaps
- 14.5 Current heap links
- 14.6 Garbage collection

The Heap Problem
- Stack order makes implementation easy
- Not always possible: what if allocations and deallocations can come in any order?
- A heap is a pool of blocks of memory, with an interface for unordered runtime memory allocation and deallocation
- There are many mechanisms for this…

First Fit
- A linked list of free blocks, initially containing one big free block
- To allocate:
  – Search the free list for the first adequate block
  – If there is extra space in the block, return the unused portion at the upper end to the free list
  – Allocate the requested portion (at the lower end)
- To free, just add to the front of the free list

Heap Illustration
A heap manager m with a memory array of 10 words, initially empty. The link to the head of the free list is held in freeStart. Every block, allocated or free, has its length in its first word. Free blocks have their free-list link in their second word, or –1 at the end of the free list.

    p1=m.allocate(4);

p1 will be 1: the address of the first of four allocated words. An extra word holds the block length. The remainder of the big free block was returned to the free list.

    p1=m.allocate(4);
    p2=m.allocate(2);

p2 will be 6: the address of the first of two allocated words. An extra word holds the block length. The remainder of the free block was returned to the free list.

    p1=m.allocate(4);
    p2=m.allocate(2);
    m.deallocate(p1);

Deallocates the first allocated block. It returns to the head of the free list.

    p1=m.allocate(4);
    p2=m.allocate(2);
    m.deallocate(p1);
    p3=m.allocate(1);

p3 will be 1: the address of the allocated word. Notice that there were two suitable blocks. The other one would have been an exact fit. (Best Fit is another possible mechanism.)

A Java Heap Implementation

    public class HeapManager {
      static private final int NULL = -1;  // null link
      public int[] memory;                 // the memory we manage
      private int freeStart;               // start of the free list

      /**
       * HeapManager constructor.
       * @param initialMemory int[] of memory to manage
       */
      public HeapManager(int[] initialMemory) {
        memory = initialMemory;
        memory[0] = memory.length;  // one big free block
        memory[1] = NULL;           // free list ends with it
        freeStart = 0;              // free list starts with it
      }

      /**
       * Allocate a block and return its address.
       * @param requestSize int size of block, > 0
       * @return block address
       * @throws OutOfMemoryError if no block big enough
       */
      public int allocate(int requestSize) {
        int size = requestSize + 1;  // size with header
        // Do first-fit search: linear search of the free
        // list for the first block of sufficient size.
        int p = freeStart;           // head of free list
        int lag = NULL;
        while (p!=NULL && memory[p]<size) {
          lag = p;                   // lag is previous p
          p = memory[p+1];           // link to next block
        }
        if (p==NULL)                 // no block large enough
          throw new OutOfMemoryError();
        int nextFree = memory[p+1];  // block after p

        // Now p is the index of a block of sufficient size,
        // and lag is the index of p's predecessor in the
        // free list, or NULL, and nextFree is the index of
        // p's successor in the free list, or NULL.

        // If the block has more space than we need, carve
        // out what we need from the front and return the
        // unused end part to the free list.
        int unused = memory[p]-size;         // extra space
        if (unused>1) {                      // if more than a header's worth
          nextFree = p+size;                 // index of the unused piece
          memory[nextFree] = unused;         // fill in size
          memory[nextFree+1] = memory[p+1];  // fill in link
          memory[p] = size;                  // reduce p's size accordingly
        }

        // Link out the block we are allocating and done.
        if (lag==NULL) freeStart = nextFree;
        else memory[lag+1] = nextFree;
        return p+1;  // index of usable word (after header)
      }

      /**
       * Deallocate an allocated block. This works only if
       * the block address is one that was returned by
       * allocate and has not yet been deallocated.
       * @param address int address of the block
       */
      public void deallocate(int address) {
        int addr = address-1;
        memory[addr+1] = freeStart;
        freeStart = addr;
      }
    }
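
A short driver (a sketch added here, not from the slides), assuming the HeapManager class above; it replays the sequence from the heap illustration on a 10-word region:

    public class HeapDemo {
      public static void main(String[] args) {
        HeapManager m = new HeapManager(new int[10]);
        int p1 = m.allocate(4);  // returns 1: four words, header at address 0
        int p2 = m.allocate(2);  // returns 6: two words, header at address 5
        m.deallocate(p1);        // the 5-word block returns to the head of the free list
        int p3 = m.allocate(1);  // returns 1: first fit reuses the block just freed
        System.out.println(p1 + " " + p2 + " " + p3);  // prints 1 6 1
      }
    }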

A Problem
- Consider this sequence:

    p1=m.allocate(4);
    p2=m.allocate(4);
    m.deallocate(p1);
    m.deallocate(p2);
    p3=m.allocate(7);

- The final allocate will fail: we are breaking up large blocks but never reversing the process
- Need to coalesce adjacent free blocks

A Solution
- We can implement a smarter deallocate method:
  – Maintain the free list sorted in address order
  – When freeing, look at the previous free block and the next free block
  – If adjacent, coalesce
- This is a lot more work than just returning the block to the head of the free list…

      /**
       * Deallocate an allocated block. This works only if
       * the block address is one that was returned by
       * allocate and has not yet been deallocated.
       * @param address int address of the block
       */
      public void deallocate(int address) {
        int addr = address-1;  // real start of the block

        // Find the insertion point in the sorted free list
        // for this block.
        int p = freeStart;
        int lag = NULL;
        while (p!=NULL && p<addr) {
          lag = p;
          p = memory[p+1];
        }

        // Now p is the index of the block to come after
        // ours in the free list, or NULL, and lag is the
        // index of the block to come before ours in the
        // free list, or NULL.

        // If the one to come after ours is adjacent to it,
        // merge it into ours and restore the property
        // described above.
        if (addr+memory[addr]==p) {
          memory[addr] += memory[p];  // add its size to ours
          p = memory[p+1];
        }

        if (lag==NULL) {                   // ours will be first free
          freeStart = addr;
          memory[addr+1] = p;
        }
        else if (lag+memory[lag]==addr) {  // block before is adjacent to ours
          memory[lag] += memory[addr];     // merge ours into it
          memory[lag+1] = p;
        }
        else {                             // neither: just a simple insertion
          memory[lag+1] = addr;
          memory[addr+1] = p;
        }
      }
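
With this coalescing deallocate in place of the simple one, the failing sequence from "A Problem" above now succeeds. A small check (a sketch added here, not from the slides):

    public class CoalesceDemo {
      public static void main(String[] args) {
        HeapManager m = new HeapManager(new int[10]);
        int p1 = m.allocate(4);
        int p2 = m.allocate(4);
        m.deallocate(p1);        // free list: one 5-word block at address 0
        m.deallocate(p2);        // adjacent free blocks merge into one 10-word block
        int p3 = m.allocate(7);  // now succeeds instead of throwing OutOfMemoryError
        System.out.println(p3);  // prints 1
      }
    }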

Quick Lists
- Small blocks tend to be allocated and deallocated much more frequently
- A common optimization: keep separate free lists for popular (small) block sizes
- On these quick lists, blocks are one size
- Delayed coalescing: free blocks on quick lists are not coalesced right away (but may have to be coalesced eventually)
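
One way to picture the idea (a sketch added here; QuickListManager and its interface are made up, not from the book): a front end caches freed blocks of one popular request size and hands them out again without searching or splitting, falling back to the general HeapManager otherwise.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class QuickListManager {
      private final HeapManager heap;  // the general-purpose manager
      private final int quickSize;     // the one popular request size we cache
      private final Deque<Integer> quickList = new ArrayDeque<>();

      public QuickListManager(HeapManager heap, int quickSize) {
        this.heap = heap;
        this.quickSize = quickSize;
      }

      public int allocate(int requestSize) {
        if (requestSize == quickSize && !quickList.isEmpty())
          return quickList.pop();      // reuse a cached block: no search, no split
        return heap.allocate(requestSize);
      }

      public void deallocate(int address) {
        if (heap.memory[address - 1] - 1 == quickSize)  // request size, read from the block header
          quickList.push(address);     // delayed coalescing: just cache the block
        else
          heap.deallocate(address);
      }
    }

Blocks cached this way are not coalesced until the quick list itself is eventually flushed back to the general free list (not shown).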

Fragmentation
- When free regions are separated by allocated blocks, so that it is not possible to allocate all of free memory as one block
- More generally: any time a heap manager is unable to allocate memory even though free memory remains
  – If it allocated more than requested
  – If it does not coalesce adjacent free blocks
  – And so on…

    p1=m.allocate(4);
    p2=m.allocate(1);
    m.deallocate(p1);
    p3=m.allocate(5);

The final allocation will fail because of fragmentation.

Other Heap Mechanisms
- An amazing variety
- Three major issues:
  – Placement: where to allocate a block
  – Splitting: when and how to split large blocks
  – Coalescing: when and how to recombine
- Many other refinements

Placement
- Where to allocate a block
- Our mechanism: first fit from FIFO free list
- Some mechanisms use a similar linked list of free blocks: first fit, best fit, next fit, etc.
- Some mechanisms use a more scalable data structure, like a balanced binary tree

Splitting
- When and how to split large blocks
- Our mechanism: split to requested size
- Sometimes you get better results with less splitting: just allocate more than requested
- A common example: rounding up allocation size to some multiple
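
For example, allocate could compute its block size with a small helper like this (a sketch of the idea, not code from the book; the multiple of 8 is an arbitrary choice):

    // Include the header word, then round the block size up to the
    // next multiple of 8 words instead of splitting to the exact size.
    static int roundedSize(int requestSize) {
      int size = requestSize + 1;   // one extra word for the block header
      return ((size + 7) / 8) * 8;  // round up to a multiple of 8
    }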

Coalescing
- When and how to recombine adjacent free blocks
- We saw several varieties:
  – No coalescing
  – Eager coalescing
  – Delayed coalescing (as with quick lists)

Outline
- 14.2 Memory model using Java arrays
- 14.3 Stacks
- 14.4 Heaps
- 14.5 Current heap links
- 14.6 Garbage collection

Current Heap Links
- So far, the running program is a black box: a source of allocations and deallocations
- What does the running program do with addresses allocated to it?
- Some systems track current heap links
- A current heap link is a memory location where a value is stored that the running program will use as a heap address

Tracing Current Heap Links

    IntList a = new IntList(null);
    int b = 2;
    int c = 1;
    a = a.cons(b);
    a = a.cons(c);

Where are the current heap links in this picture?

To Find Current Heap Links
- Start with the root set: memory locations outside of the heap with links into the heap
  – Active activation records (if on the stack)
  – Static variables, etc.
- For each memory location in the set, look at the allocated block it points to, and add all the memory locations in that block
- Repeat until no new locations are found
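
The closure described above can be sketched as a worklist algorithm. Everything here is illustrative: rootSet, valueAt, blockContaining, and locationsIn are hypothetical helpers standing in for whatever the language system uses to enumerate roots and to map an address to the locations inside its block.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashSet;
    import java.util.Set;

    public class HeapLinkTracer {
      /** Compute the set of current heap links reachable from the root set. */
      Set<Integer> currentHeapLinks() {
        Set<Integer> links = new HashSet<>(rootSet());  // start with locations outside the heap
        Deque<Integer> work = new ArrayDeque<>(links);
        while (!work.isEmpty()) {
          int location = work.pop();
          int block = blockContaining(valueAt(location));  // allocated block this link points to
          for (int loc : locationsIn(block)) {              // every location in that block may hold a link
            if (links.add(loc))                             // not seen before: trace it too
              work.push(loc);
          }
        }
        // (Locations whose values cannot be heap addresses could be
        // discarded here, as the next slide discusses.)
        return links;
      }

      // Hypothetical helpers; how they work depends on the language system's
      // knowledge of activation records, statics, and block layout.
      Set<Integer> rootSet() { return new HashSet<>(); }
      int valueAt(int location) { return 0; }
      int blockContaining(int address) { return address - 1; }
      Iterable<Integer> locationsIn(int block) { return new HashSet<>(); }
    }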

Discarding Impossible Links
- Depending on the language and implementation, we may be able to discard some locations from the set:
  – If they do not point into allocated heap blocks
  – If they do not point to allocated heap blocks (Java, but not C)
  – If their dynamic type rules out use as heap links
  – If their static type rules out use as heap links (Java, but not C)

Errors In Current Heap Links
- Exclusion errors: a memory location that actually is a current heap link is left out
- Unused inclusion errors: a memory location is included, but the program never actually uses the value stored there
- Used inclusion errors: a memory location is included, but the program uses the value stored there as something other than a heap address (as an integer, for example)

Errors Are Unavoidable
- For heap manager purposes, exclusion errors are unacceptable
- We must include a location if it might be used as a heap link
- This makes unused inclusion errors unavoidable
- Depending on the language, used inclusions may also be unavoidable

Used Inclusion Errors In C
- Static type and runtime value may be of no use in telling how a value will be used
- Variable x may be used either as a pointer or as an array of four characters

    union {
      char *p;
      char tag[4];
    } x;

Heap Compaction
- One application for current heap links
- The manager can move allocated blocks:
  – Copy the block to a new location
  – Update all links to (or into) that block
- So it can compact the heap, moving all allocated blocks to one end, leaving one big free block and no fragmentation
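
A compaction pass might be organized like the sketch below (illustrative only: liveBlocksInAddressOrder, currentHeapLinks, blockContaining, and the value accessors are hypothetical helpers). It computes a new address for every live block, rewrites the heap links first, and only then slides the blocks down toward address 0; links stored inside heap blocks are rewritten in place and then carried along when their blocks move.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    public class CompactionSketch {
      int[] memory = new int[0];  // heap laid out as in HeapManager: length word, then data

      void compact() {
        List<Integer> blocks = liveBlocksInAddressOrder();

        // Pass 1: decide where each live block will land.
        Map<Integer, Integer> newAddress = new HashMap<>();
        int dest = 0;
        for (int block : blocks) {
          newAddress.put(block, dest);
          dest += memory[block];                // block length, including its header word
        }

        // Pass 2: rewrite every current heap link to the block's new data address.
        for (int link : currentHeapLinks()) {
          int oldBlock = blockContaining(valueAt(link));
          if (newAddress.containsKey(oldBlock))
            setValueAt(link, newAddress.get(oldBlock) + 1);
        }

        // Pass 3: slide the blocks down; they only move toward lower addresses.
        for (int block : blocks)
          System.arraycopy(memory, block, memory, newAddress.get(block), memory[block]);

        if (dest < memory.length)
          memory[dest] = memory.length - dest;  // the rest of memory is one big free block
      }

      // Hypothetical helpers for the parts a real system would supply.
      List<Integer> liveBlocksInAddressOrder() { return new ArrayList<>(); }
      Set<Integer> currentHeapLinks() { return new HashSet<>(); }
      int valueAt(int location) { return memory[location]; }
      void setValueAt(int location, int value) { memory[location] = value; }
      int blockContaining(int address) { return address - 1; }
    }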

Outline
- 14.2 Memory model using Java arrays
- 14.3 Stacks
- 14.4 Heaps
- 14.5 Current heap links
- 14.6 Garbage collection

Some Common Pointer Errors

Dangling pointer: this Pascal fragment uses a pointer after the block it points to has been deallocated:

    var p: ^Integer;
    begin
      new(p);
      p^ := 21;
      dispose(p);
      p^ := p^ + 1
    end

Memory leak: this Pascal procedure allocates a block but forgets to deallocate it:

    procedure Leak;
    var p: ^Integer;
    begin
      new(p)
    end;

Garbage Collection
- Since so many errors are caused by improper deallocation…
- …and since it is a burden on the programmer to have to worry about it…
- …why not have the language system reclaim blocks automatically?

Three Major Approaches
- Mark and sweep
- Copying
- Reference counting

Mark And Sweep
- A mark-and-sweep collector uses current heap links in a two-stage process:
  – Mark: find the live heap links and mark all the heap blocks linked to by them
  – Sweep: make a pass over the heap and return unmarked blocks to the free pool
- Blocks are not moved, so both kinds of inclusion errors are tolerated
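
Using the same block layout as HeapManager (a length word at the start of every block), the two stages might look like this sketch (illustrative only: currentHeapLinks, valueAt, and blockContaining are hypothetical helpers, and a real collector would keep a mark bit in each block header rather than a separate set):

    import java.util.HashSet;
    import java.util.Set;

    public class MarkSweepSketch {
      int[] memory = new int[0];  // heap: every block starts with its length
      int freeStart = -1;         // head of the rebuilt free list

      /** Mark: every block reached by a current heap link is live. */
      Set<Integer> mark() {
        Set<Integer> live = new HashSet<>();
        for (int link : currentHeapLinks())          // found as in section 14.5
          live.add(blockContaining(valueAt(link)));  // record the header address of each linked block
        return live;
      }

      /** Sweep: walk the whole heap and return unmarked blocks to the free pool. */
      void sweep(Set<Integer> live) {
        freeStart = -1;
        for (int block = 0; block < memory.length; block += memory[block]) {
          if (!live.contains(block)) {
            memory[block + 1] = freeStart;  // link the dead block onto the free list
            freeStart = block;
          }
        }
      }

      // Hypothetical helpers standing in for the heap-link machinery.
      Set<Integer> currentHeapLinks() { return new HashSet<>(); }
      int valueAt(int location) { return 0; }
      int blockContaining(int address) { return address - 1; }
    }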

Copying Collection
- A copying collector divides memory in half, and uses only one half at a time
- When one half becomes full, find the live heap links, and copy the live blocks to the other half
- Compacts as it goes, so fragmentation is eliminated
- Moves blocks: cannot tolerate used inclusion errors

Reference Counting
- Each block has a counter of heap links to it
- Incremented when a heap link is copied, decremented when a heap link is discarded
- When the counter goes to zero, the block is garbage and can be freed
- Does not use current heap links
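
The bookkeeping can be sketched directly on an object (illustrative only; RefCounted, its next field, and assign are made-up names): the count goes up when a heap link is copied, down when one is discarded, and the block is reclaimed the moment its count reaches zero.

    public class RefCounted {
      int refCount = 0;  // number of heap links currently pointing at this block
      RefCounted next;   // example field that may hold a link to another block

      /** Overwrite a link: increment the new target, decrement the old one. */
      static RefCounted assign(RefCounted oldValue, RefCounted newValue) {
        if (newValue != null) newValue.refCount++;  // a link was copied
        if (oldValue != null && --oldValue.refCount == 0)
          release(oldValue);                        // count hit zero: the block is garbage
        return newValue;
      }

      static void release(RefCounted block) {
        // Drop the links held inside the reclaimed block, which may cascade,
        // then return its memory to the free pool (free-pool call omitted).
        if (block.next != null && --block.next.refCount == 0)
          release(block.next);
      }
    }

Two blocks whose next fields point at each other keep each other's counts above zero forever; that is the cycle problem illustrated next.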

Reference Counting Problem
One problem with reference counting: it misses cycles of garbage. Here, a circularly linked list is pointed to by circle.

Reference Counting Problem
When circle is set to null, the reference counter is decremented. No reference counter is zero, though all blocks are garbage.

Reference Counting
- Problem with cycles of garbage
- Problem with performance generally, since the overhead of updating reference counters is high
- One advantage: naturally incremental, with no big pause while collecting

Garbage Collecting Refinements
- Generational collectors
  – Divide blocks into generations according to age
  – Garbage collect in younger generations more often (using the previous methods)
- Incremental collectors
  – Collect garbage a little at a time
  – Avoid the uneven performance of ordinary mark-and-sweep and copying collectors

Garbage Collecting Languages
- Some require it: Java, ML
- Some encourage it: Ada
- Some make it difficult: C, C++
  – Even for C and C++ it is possible
  – There are libraries that replace the usual malloc/free with a garbage-collecting manager

Trends
- An old idea whose popularity is increasing
- Good implementations are within a few percent of the performance of systems with explicit deallocation
- Programmers who like garbage collection feel that the development and debugging time it saves is worth the runtime it costs

Conclusion
- Memory management is an important hidden player in language systems
- Performance and reliability are critical
- Different techniques are difficult to compare, since every run of every program makes different memory demands
- An active area of language systems research and experimentation