
1 Memory

2 Computer Memory Basics
If your computer's CPU had to access the hard drive every time it needed a piece of data, it would operate very slowly. When the information is kept in memory, the CPU can access it much more quickly.

3 Random Access Memory
Memory stored on tape drives can only be accessed sequentially: the only way to locate a specific item is to spin the tape until its location comes up. By assigning an addressable location to each piece of stored information, data can be called up randomly, which makes random access memory an extremely fast way to locate data. The word random thus refers to the fact that any piece of data can be returned in roughly constant time, regardless of its physical location and whether or not it is related to the previous piece of data.
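
As a rough illustration (the array, indices, and loop below are invented for this sketch, not taken from the slides), a C array models the difference: reading an element by its index is a single address calculation, while a tape-style search must step through every cell before it.

```c
#include <stdio.h>

#define CELLS 1000

int main(void) {
    int memory[CELLS];
    for (int i = 0; i < CELLS; i++)
        memory[i] = i * 2;              /* fill the "memory" with sample data */

    /* Random access: one address calculation reaches cell 742 directly. */
    int wanted = 742;
    printf("random access:     memory[%d] = %d\n", wanted, memory[wanted]);

    /* Sequential (tape-like) access: step through every cell until found. */
    int steps = 0;
    for (int i = 0; i <= wanted; i++)
        steps++;
    printf("sequential access: reached cell %d after %d steps\n", wanted, steps);
    return 0;
}
```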

4 Types of RAM
Modern types of writable RAM generally store a bit of data either in the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM).

5 SRAM
Early SRAM chips came in 20-pin and larger DIPs (Dual Inline Packages). They needed a large number of pins because each chip required a pin for each address signal, a Data In pin and a Data Out pin, some control pins (Write Enable, Chip Select), and Ground and Power pins.

6 DRAM
DRAM, as mentioned earlier, is more complicated than SRAM because the charge placed on each of its memory cells leaks out over time. That charge has to be refreshed if the DRAM is going to be useful as a data storage device. Reading from or writing to a DRAM cell refreshes its charge, so the most common way of refreshing a DRAM is to read periodically from each cell. DRAM cells are about 4 to 6 times smaller than SRAM cells, and if you can fit four times as many cells on a DRAM chip, you need more address pins to select among them.
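
A conceptual sketch in C of the refresh idea, with invented cell counts and charge numbers; it only models the behavior described above, not real DRAM timing:

```c
#include <stdio.h>

#define ROWS        4
#define FULL_CHARGE 10
#define LEAK        1      /* charge lost per tick (illustrative number) */

int main(void) {
    int cell[ROWS] = { FULL_CHARGE, FULL_CHARGE, FULL_CHARGE, FULL_CHARGE };

    for (int tick = 1; tick <= 8; tick++) {
        for (int r = 0; r < ROWS; r++)
            cell[r] -= LEAK;                 /* the stored charge leaks away */

        int refresh_row = tick % ROWS;       /* visit one row per tick... */
        cell[refresh_row] = FULL_CHARGE;     /* ...reading it restores its charge */

        printf("tick %d: charges =", tick);
        for (int r = 0; r < ROWS; r++)
            printf(" %2d", cell[r]);
        printf("  (refreshed row %d)\n", refresh_row);
    }
    return 0;
}
```

Because every row gets visited before its charge drains completely, the data survives; skip the refresh loop and the values eventually decay to nothing.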

7 Types of RAM: DIP, SIMM, and DIMM
Older computer systems used DIP memory directly, either soldering it to the motherboard or installing it in sockets. However, this arrangement caused problems. Chips soldered directly onto the motherboard meant the entire motherboard had to be scrapped if any of the memory chips ever went bad, while socketed DIP chips had a notorious problem of their own: they crept out of their sockets over time as the system went through thermal cycles. Newer systems do not use DIP memory packaging directly. Instead, the DIPs are soldered onto small circuit boards called memory modules, the two most common being the single inline memory module (SIMM) and the dual inline memory module (DIMM).

8 DIP (dual in-line package)
Sockets for 16-, 14-, and 8-pin packages.

9 Types of RAM: SIMM (single in-line memory module)
A memory package was needed that was both soldered and removable, and that is exactly what was found in the module called a SIMM. SIMMs are available in two main physical types: 30-pin (8 bits, plus an option for 1 additional parity bit) and 72-pin (32 bits, plus an option for 4 additional parity bits), with various capacities and other specifications. The 30-pin SIMMs are physically smaller than the 72-pin versions, and either version can have chips on one or both sides.
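
The optional parity bit mentioned above gives the memory controller a way to detect a single-bit error in a stored byte. Here is a small sketch of an even-parity check in C; the byte value and the helper name are hypothetical:

```c
#include <stdio.h>
#include <stdint.h>

/* Even parity: return 1 if the byte has an odd number of 1 bits, so that
   the byte plus its parity bit always contain an even number of 1s. */
static int parity_bit(uint8_t byte) {
    int ones = 0;
    for (int i = 0; i < 8; i++)
        ones += (byte >> i) & 1;
    return ones % 2;
}

int main(void) {
    uint8_t stored = 0x5A;                 /* 0101 1010 -> four 1 bits */
    int p = parity_bit(stored);
    printf("stored 0x%02X with parity bit %d\n", stored, p);

    uint8_t corrupted = stored ^ 0x08;     /* flip one bit to simulate an error */
    if (parity_bit(corrupted) != p)
        printf("parity mismatch: single-bit error detected\n");
    return 0;
}
```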

10 Types of RAM: DIMM (dual in-line memory module)
A DIMM, or dual in-line memory module, comprises a series of dynamic random access memory integrated circuits. These modules are mounted on a printed circuit board and designed for use in personal computers, workstations, and servers. DIMMs began to replace SIMMs (single in-line memory modules). The main difference between the two is that DIMMs have separate electrical contacts on each side of the module, whereas the contacts on the two sides of a SIMM are redundant.

11 RAM and the CPU
Keeping the processor running requires a constant flow of instructions and raw data. That data is temporarily stored in RAM, where it can be quickly accessed as needed. Each location (cell) in memory has a unique address that the CPU uses to pick out a specific memory location. Instructions are usually stored sequentially in RAM, loaded from the hard drive or another long-term storage device.
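
The notion that every memory cell has its own unique address can be shown directly in C, where a pointer simply holds the address of a location (an illustrative sketch, not part of the slides):

```c
#include <stdio.h>

int main(void) {
    int cells[4] = { 11, 22, 33, 44 };   /* four "memory cells" holding data */

    /* Each cell lives at its own unique address; printing the addresses
       shows how a specific location can be named to read or write it. */
    for (int i = 0; i < 4; i++)
        printf("cell %d at address %p holds %d\n", i, (void *)&cells[i], cells[i]);

    int *address = &cells[2];            /* remember one cell's address */
    *address = 99;                       /* write to that exact location */
    printf("after write, cells[2] = %d\n", cells[2]);
    return 0;
}
```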

12 When a program is loaded, it is moved from the hard drive into RAM, where it is run by the processor.

13 Loading a program is as simple as point and click when using Windows
The command is interpreted, and the selected program is loaded into memory.

14 All communication with the CPU goes through memory: the mouse selection is sent to the CPU through RAM, and the CPU instructs the hard drive to locate the program and load it.

15 The command passes through memory to the hard drive, requesting that the program be located and loaded into RAM.

16 After locating the program, the hard drive sends it to RAM; once the program is loaded, the CPU runs it.

17 Once loaded, the CPU runs each instruction sequentially.
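
A toy sketch of what running instructions sequentially looks like, using an entirely made-up instruction set: the CPU keeps a program counter, fetches the instruction at that address in RAM, executes it, and moves on to the next one.

```c
#include <stdio.h>

enum { LOAD, ADD, STORE, HALT };            /* tiny made-up instruction set */

typedef struct { int op; int operand; } Instr;

int main(void) {
    /* A "program" already loaded into RAM: compute 5 + 7 and store it. */
    Instr ram[] = {
        { LOAD,  5 },
        { ADD,   7 },
        { STORE, 0 },
        { HALT,  0 },
    };
    int data[1] = { 0 };   /* one data cell in RAM */
    int acc = 0;           /* CPU accumulator register */

    for (int pc = 0; ram[pc].op != HALT; pc++) {   /* fetch-execute loop */
        Instr in = ram[pc];
        switch (in.op) {
        case LOAD:  acc = in.operand;        break;
        case ADD:   acc += in.operand;       break;
        case STORE: data[in.operand] = acc;  break;
        }
        printf("pc=%d op=%d acc=%d\n", pc, in.op, acc);
    }
    printf("result stored in data[0] = %d\n", data[0]);
    return 0;
}
```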

18 When the CPU generates data, it is temporarily stored back in memory.

19 When the user is finished and issues a SAVE command, the data is moved onto the hard drive for permanent storage.
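
As a loose sketch of that last step (the file name and the document contents are invented), data built up in a RAM buffer is only written out to the disk when the save happens:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Data the CPU has generated sits temporarily in RAM... */
    char document[256];
    strcpy(document, "edits made by the user while the program was running\n");

    /* ...until a SAVE command copies it to the hard drive for permanent storage. */
    FILE *fp = fopen("document.txt", "w");   /* hypothetical file name */
    if (fp != NULL) {
        fwrite(document, 1, strlen(document), fp);
        fclose(fp);
        printf("saved %zu bytes to disk\n", strlen(document));
    }
    return 0;
}
```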

20 RAM
As technology improved, CPU speeds increased dramatically. Unfortunately, by the time the 386 was introduced, memory could not keep up with the demands of the CPU. To help resolve the issue of slow memory, a new technology was introduced: cache memory, added with the 486.

21 Cache Memory (80486)
By placing high-speed cache memory close to the CPU and adding a special controller to manage it, instructions and data could be made available to the CPU almost immediately.

22 Cache Memory
Cache (pronounced cash) memory is extremely fast memory that is built into a computer's CPU, or located next to it on a separate chip. The cache is usually filled from main memory when instructions or data are fetched into the CPU. The advantage of cache memory is that the CPU does not have to use the motherboard's system bus for data transfer. Whenever data must be passed through the system bus, the data transfer speed slows to the motherboard's capability. The CPU can process data much faster by avoiding the bottleneck created by the system bus.

23 Cache Memory
Let's take a library as an example of how caching works. Imagine a large library with only one librarian (the standard one-CPU setup). The first person comes into the library and asks for Lord of the Rings. The librarian walks the path to the bookshelves (the memory bus), retrieves the book, and gives it to the person. The book is returned once the person has finished with it. Without a cache, the book goes back to the shelf, so when the next person arrives and asks for Lord of the Rings, the same process happens and takes the same amount of time. With a cache system, the returned book would instead be placed on a shelf at the librarian's desk. When the second person asks for Lord of the Rings, the librarian only has to reach down to that shelf and retrieve the book.
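
The librarian's desk can be sketched as a tiny one-slot cache in C; the book IDs, titles, and the fetch function are all invented for illustration:

```c
#include <stdio.h>
#include <string.h>

#define NOT_CACHED -1

static int slow_fetch_count = 0;

/* Walking to the bookshelves (main memory): slow, so count every trip. */
static const char *fetch_from_shelves(int book_id) {
    slow_fetch_count++;
    return (book_id == 42) ? "Lord of the Rings" : "some other book";
}

int main(void) {
    int cached_id = NOT_CACHED;            /* the shelf at the librarian's desk */
    char cached_title[64] = "";

    int requests[] = { 42, 42, 42 };       /* three people ask for the same book */
    for (int i = 0; i < 3; i++) {
        int id = requests[i];
        if (id == cached_id) {
            printf("request %d: cache hit, handing over \"%s\"\n", i + 1, cached_title);
        } else {
            const char *title = fetch_from_shelves(id);
            strncpy(cached_title, title, sizeof cached_title - 1);
            cached_id = id;
            printf("request %d: cache miss, fetched \"%s\" from the shelves\n",
                   i + 1, title);
        }
    }
    printf("slow trips to the shelves: %d (instead of 3)\n", slow_fetch_count);
    return 0;
}
```

Running it shows one slow trip to the shelves followed by two hits, which is exactly the speed-up the analogy describes.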

24 Cache Memory
As processors evolved, new and improved cache memory has been added. Motherboard manufacturers added a special socket that allowed the user to add extra cache to the system, referred to as Level 2 cache. The Level 1 cache is the fastest and smallest memory; Level 2 cache is larger and slightly slower, but still smaller and faster than main memory. Some systems today incorporate three levels of cache with as much as 1 MB of extra memory.

25 Disk Cache (Page Cache)
Disk caching works on the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. It is usually managed by the operating system kernel. The most recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
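
A rough sketch of the check-the-cache-first idea for disk blocks; the block numbers, cache size, and helper functions below are hypothetical, not a real kernel interface:

```c
#include <stdio.h>
#include <stdbool.h>

#define CACHE_SLOTS 4
#define BLOCK_SIZE  512

typedef struct {
    bool valid;
    int  block_no;
    char data[BLOCK_SIZE];
} CacheSlot;

static CacheSlot cache[CACHE_SLOTS];

/* Stand-in for a real disk read: in a real kernel this would be an I/O request. */
static void read_block_from_disk(int block_no, char *buf) {
    printf("  disk read for block %d (slow path)\n", block_no);
    buf[0] = (char)block_no;   /* fake contents */
}

/* Check the in-memory cache first; only go to the disk on a miss. */
static void read_block(int block_no, char *buf) {
    int slot = block_no % CACHE_SLOTS;
    if (cache[slot].valid && cache[slot].block_no == block_no) {
        printf("block %d: served from disk cache\n", block_no);
        for (int i = 0; i < BLOCK_SIZE; i++) buf[i] = cache[slot].data[i];
        return;
    }
    printf("block %d: not cached\n", block_no);
    read_block_from_disk(block_no, cache[slot].data);
    cache[slot].valid = true;
    cache[slot].block_no = block_no;
    for (int i = 0; i < BLOCK_SIZE; i++) buf[i] = cache[slot].data[i];
}

int main(void) {
    char buf[BLOCK_SIZE];
    read_block(7, buf);   /* miss: goes to the disk */
    read_block(7, buf);   /* hit: served from memory */
    return 0;
}
```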

26 Cache Hit
When data is found in the cache, it is called a cache hit, and the effectiveness of a cache is judged by its hit rate. Many cache systems use a technique known as smart caching, in which the system (also referred to in the book as the cache controller) can recognize certain types of frequently used data. The alternative situation, when the cache is consulted and found not to contain a datum with the desired tag, is known as a cache miss. The strategies for determining which information should be kept in the cache constitute some of the more interesting problems in computer science.
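
Hit rate is simply the number of hits divided by the total number of lookups. A quick sketch with made-up counters:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical counters a cache controller might keep. */
    long hits = 920, misses = 80;
    double hit_rate = (double)hits / (double)(hits + misses);
    printf("hit rate = %.1f%% over %ld lookups\n", hit_rate * 100.0, hits + misses);
    return 0;
}
```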

27 Cache Memory

28 Is more Cache always better?

