Overview: Memory
Memory Organization: General Issues (Hardware)
– Objectives in Memory Design
– Memory Types
– Memory Hierarchies
Memory Management (Software - OS)
– Cache
– Virtual Memory & Paging
– Segmentation
Memory Issues in Multiprogramming (Software - OS)
– Memory allocation for each process
– Swapping
– Security

Memory Organization: Objectives
– High capacity
– Fast access
– Small size (to fit on the chip)
– Cheap
– Reusable
– Non-volatile
– Reliable

Memory Reality
– Fast memory is very expensive
– The bigger the memory, the slower it is
– The need for memory keeps growing
– Parkinson's Law: "Programs expand to fill the memory available to hold them"
– You can't run Windows XP on a five-year-old machine with 32 MB of RAM

Storage Devices
– Registers
– Cache
– Main memory (RAM)
– Hard disk
– Floppy
– Tape
– CD-ROM
– …

Memory Types I
Electric
– Static RAM (Random Access Memory): D-latch; holds information as long as the circuit has power; fast, expensive, many transistors
– Dynamic RAM: one transistor and one capacitor per bit; slower, needs regular refreshing, cheap, small
Magnetic
– Hard drive, floppy: cheap, slower, stable, error prone

Hard Drive

Memory Types II
Optical/Mechanical: CD-ROM, DVD
– ROM (read-only access)
– PROM (Programmable ROM)
– EPROM (Erasable PROM) – read/write!
Cheap, large volume, slow access, stable, high reliability

Type Comparison

Type       | Category    | Erasure      | Volatile | Use
-----------|-------------|--------------|----------|--------------------------
SRAM       | Read/Write  | Electrical   | Yes      | Register, Cache
DRAM       | Read/Write  | Electrical   | Yes      | Main Memory
ROM, PROM  | Read only   | Not possible | No       | Large-volume appliances
EPROM      | Read mostly | UV light     | No       | Device prototyping
Magnetic   | Read/Write  | Magnetic     | No       | Hard drive, backup tapes

Question: How can you improve performance without higher cost?
Answer: Build a memory hierarchy.
[Diagram: levels between CPU and memory – speed from fastest (top) to slowest (bottom), cost per bit from highest to lowest, size from smallest to biggest]

Memory Hierarchy

Memory Management: OS
Obvious questions:
– What content of memory should be kept in which level of the hierarchy?
– How do we provide all processes with uniform access to memory content, independent of its physical location?

Principle of Locality
Locality:
– Temporal: you will soon reuse things you used recently
– Spatial: nearby items are more likely to be used (e.g., arrays)
The idea of a hierarchy works only because of locality.
Why does code have locality?
– Local variable space
– A limited set of variables that will be used again
– Arrays
– Machine instructions are stored sequentially
A small C sketch illustrating spatial locality follows.
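The classic way to see spatial locality is to sum a two-dimensional array in two different orders. The following is a minimal sketch in C; the array size and the use of clock() for timing are assumptions made for the illustration, not part of the lecture.

    /* Row-major traversal walks the array in the order it is laid out in
     * memory; column-major traversal jumps SIZE elements at a time and so
     * has far worse spatial locality and many more cache misses. */
    #include <stdio.h>
    #include <time.h>

    #define SIZE 4096

    static int array[SIZE][SIZE];

    int main(void) {
        long sum = 0;
        clock_t t0, t1;

        /* Row-major: consecutive accesses touch neighboring addresses. */
        t0 = clock();
        for (int i = 0; i < SIZE; ++i)
            for (int j = 0; j < SIZE; ++j)
                sum += array[i][j];
        t1 = clock();
        printf("row-major:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        /* Column-major: each access is SIZE*sizeof(int) bytes away from
         * the previous one, so spatial locality is poor. */
        t0 = clock();
        for (int j = 0; j < SIZE; ++j)
            for (int i = 0; i < SIZE; ++i)
                sum += array[i][j];
        t1 = clock();
        printf("column-major: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

        return (int)(sum & 1);  /* keep the compiler from discarding sum */
    }

Both loops do exactly the same work; on a typical machine the first one runs several times faster, which is the hierarchy exploiting locality.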

Cache
"A safe place for hiding or storing things" (Webster's Dictionary)
A small and very fast temporary memory between the CPU and main memory that contains the things (variables and instructions) recently used by the CPU.

Cache: Simple Example
Example: memory has 8 lines, the cache has 4.
– LOAD R1, x5
– Since x5 is not in the cache, it is fetched from main memory and replaces one current entry.
– Cache before the request: x3, x1, x4, x0
– Cache after the request:  x3, x5, x4, x0 (x1 was replaced)
A small C sketch of this lookup-and-replace step follows.
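A minimal sketch of this step in C, assuming a fully associative cache of four entries over an eight-line memory; the round-robin victim pointer is a hypothetical stand-in for the replacement strategies discussed a few slides further on.

    #include <stdio.h>
    #include <stdbool.h>

    #define CACHE_LINES 4

    static int cache[CACHE_LINES] = {3, 1, 4, 0};  /* holds memory line numbers */
    static int next_victim = 1;                    /* which slot to replace next */

    /* Returns true on a hit, false on a miss (after loading the line). */
    bool access_line(int line) {
        for (int i = 0; i < CACHE_LINES; ++i)
            if (cache[i] == line)
                return true;                       /* hit: already in the cache */
        cache[next_victim] = line;                 /* miss: fetch and replace */
        next_victim = (next_victim + 1) % CACHE_LINES;
        return false;
    }

    int main(void) {
        printf("LOAD R1, x5 -> %s\n", access_line(5) ? "hit" : "miss");
        for (int i = 0; i < CACHE_LINES; ++i)
            printf("cache[%d] = x%d\n", i, cache[i]);
        return 0;
    }

Running it reproduces the slide: the access to x5 misses, x1 is evicted, and the cache ends up holding x3, x5, x4, x0.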

Terminology Hit: data requested could be found in one level Miss: Data could not be found -> has to be searched on lower level Miss Rate: percentage of memory access that resulted a miss Block: Minimum amount of data transferred between levels (in the example it was 1) Miss penalty: Additional time it takes to find item
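As a short worked example (with numbers chosen for illustration, not taken from the slides): if a program makes 1000 memory accesses and 50 of them cannot be served by the cache, the miss rate is 50 / 1000 = 5%, and each of those 50 accesses pays the miss penalty on top of the normal hit time.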

Cache Parameters
Block size: how many items do you replace at a time?
– Big if there is a high degree of spatial locality
– Small if there is a low degree of spatial locality
Which block is discarded in case of a miss?
– Replacement strategy: pick one that will hopefully not be used again
Separation of data and instructions
– Either one cache for both, or two caches, one for each

Unified Data and Instruction Cache

Separate Data and Instruction

Performance
Access time = Miss rate * Miss penalty + Hit time
– Miss rate reduction: bigger cache, bigger blocks, good replacement strategy
– Miss penalty reduction: smaller blocks, simple replacement strategy
– Hit time reduction: small and simple cache
A short worked example follows.
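A short worked example (again with illustrative numbers): assume a hit time of 1 cycle, a miss rate of 5%, and a miss penalty of 20 cycles. The access time is then 0.05 * 20 + 1 = 2 cycles on average, twice as slow as a cache that never misses, which is why reducing both the miss rate and the miss penalty pays off so much.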

Replacement Strategies
Not recently used
– Pick the first block that has not been used for a certain time
First in, first out
– Replace the oldest of all blocks in the cache
Least recently used (LRU)
– Compare the last use time of all blocks and pick the least recently used one
– Optimal but very expensive; in practice a cheaper approximation of LRU is used
A minimal LRU sketch follows.
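A minimal sketch of LRU victim selection in C, assuming each slot stores a timestamp of its last use; the timestamps below are made up for the example.

    #include <stdio.h>

    #define CACHE_LINES 4

    static unsigned long last_used[CACHE_LINES] = {7, 2, 9, 5};  /* example timestamps */

    /* Return the index of the least recently used slot (smallest timestamp). */
    int lru_victim(void) {
        int victim = 0;
        for (int i = 1; i < CACHE_LINES; ++i)
            if (last_used[i] < last_used[victim])
                victim = i;
        return victim;
    }

    int main(void) {
        printf("evict slot %d\n", lru_victim());  /* prints 1: last used at time 2 */
        return 0;
    }

The expense of LRU in hardware comes from keeping these timestamps (or an equivalent ordering) up to date on every access, which is why cheaper approximations are used in practice.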

Cache Entry
What information do you need for each item/block?
– The associated address
– A valid bit (0 during initialization)
– Accounting information for replacement: the time of last use
Entry layout: Address | Valid | Time | Value
A C struct sketching such an entry follows.
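A C struct sketching such an entry; the field types and widths are assumptions for illustration, and real hardware would store a tag plus a whole block rather than a single 32-bit value.

    #include <stdint.h>
    #include <stdbool.h>

    struct cache_entry {
        uint32_t address;     /* which memory address/block this entry caches */
        bool     valid;       /* 0 (false) during initialization, 1 once loaded */
        uint64_t last_used;   /* accounting info for replacement: time of last use */
        uint32_t value;       /* the cached data itself */
    };

    /* On a cold start every entry is marked invalid, matching the
     * "valid bit is 0 during initialization" rule above. */
    void cache_init(struct cache_entry *cache, int lines) {
        for (int i = 0; i < lines; ++i)
            cache[i].valid = false;
    }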

Hit vs. Miss
Read hit
– The optimal case
Read miss
– Stall the CPU, fetch the new block from memory, identify the oldest block, replace it, restart the CPU
Write hit
– Update the cache; depending on the write strategy, also update memory
Write miss
– Similar to a read miss: first update memory, then fetch the block, replace, restart the CPU

Write Strategies
Write-through
– Update memory whenever the cache is changed
Copy back (write-back)
– Update memory only when the block is evicted from the cache
– Keep track of changes with a dirty bit
– Makes a read miss more expensive but improves write time significantly
Buffered write-through
– Keep track of updates in a special write buffer; the buffer transfers the data to memory in the background
A sketch contrasting write-through and copy back follows.
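A minimal sketch in C contrasting the two main policies on a write hit; the line layout and the memory_store() helper are assumptions made for the example, and buffered write-through is omitted for brevity.

    #include <stdint.h>
    #include <stdbool.h>

    struct line {
        uint32_t tag;       /* address of the cached block (tag == address here) */
        bool     valid;
        bool     dirty;     /* copy back only: block modified since it was loaded */
        uint32_t value;
    };

    static uint32_t main_memory[1024];             /* stand-in for main memory */

    static void memory_store(uint32_t address, uint32_t value) {
        main_memory[address % 1024] = value;
    }

    /* Write-through: memory is updated on every write to the cache. */
    void write_through(struct line *l, uint32_t address, uint32_t value) {
        l->value = value;
        memory_store(address, value);
    }

    /* Copy back: only the cache is updated; the line is marked dirty and
     * memory is written later, when the line is evicted. */
    void copy_back_write(struct line *l, uint32_t value) {
        l->value = value;
        l->dirty = true;
    }

    /* On eviction, a dirty line must be written back before it is reused;
     * this extra write is what makes a read miss more expensive under
     * copy back, while ordinary writes become much cheaper. */
    void evict(struct line *l) {
        if (l->valid && l->dirty)
            memory_store(l->tag, l->value);
        l->valid = false;
        l->dirty = false;
    }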