CS 300 – Lecture 1 Intro to Computer Architecture / Assembly Language Welcome Back!


Calling All Seniors Come see me about capstone projects!

The Text We're using "Patterson / Hennessy". You should have this by now. You should also have the CD. If not, I'll help you out. Note that some chapters are on the CD only! We'll use Appendix B from the CD.

Our Goals * Understand how computing hardware works * Investigate the differences between hardware and software and why they are important * Understand low level programming (C and assembler) * Find out how high level programs are translated into machine level programs * Understand the performance of programs * Look at parallelism

Course Outline * Introduction (.5 wks) * History (.5 wks) * Logic circuits and digital design (2 wks) * Basic Machine language and the C programming language (2 wks) * Computer Arithmetic (2 wks) * Understanding performance (1 wk) * Datapath and control (1 wk) * Pipelining (2 wks) * Memory issues (2 wks) * Storage and IO (1 wk) * Multi-processing (1 wk)

Work Required * Quizzes: 1 per chapter in the book, about 20 minutes each * Homework assignments: 1 per week (approximately) * Comprehensive final * Classroom participation (come prepared!) * No "big projects" but some homeworks will involve small amounts of software

Danger! I've chosen a "classic" textbook for this class. It's normally used in a 2 or 3 semester sequence of classes. I plan to cherry-pick the good stuff and avoid getting into too much depth. Please let me know if the book gets over your heads. I'll also be doing some units using a simple embedded processor like the PIC that will be easier to understand.

What is a computer? A computer is really just a collection of 4 basic things: * A computational element that performs reasoning * Storage elements that retain data * Communication elements that move data from one computer to another * Sensing and controls that allow the computer to interact with its environment What are some examples of these elements?

History Each element of a computer has evolved separately. What are the ancestors of these elements? Computing Storage Communication Sensing and control

A Historical Approach We’re going to dive in to the innards of computing devices from a historical perspective. Why? Older devices were a lot simpler yet addressed the same problems It is interesting to address the context of each element of a computer from an evolutionary perspective History is cool

The Computing Element John von Neumann, one of the pioneers of computing, used the word "organ" to describe these elements. The biological metaphors started from day 1 … The original computing element was the human brain. But eventually mechanical devices were created to speed up the calculation process. The apex of mechanical computing was Babbage's "analytical engine", a device too complex to ever work. This early computing was mathematical – building tables of numbers for navigation and engineering purposes.

Historical Computing Devices

Electronic Computing The big innovation in computing was the replacement of mechanical computing devices by purely electronic ones. A gear or relay is too big / slow / unreliable to use in large quantities. An electronic switch has no moving parts – it operates by pushing electrons around. The original electronic computers used vacuum tubes – later transistors took over.

Electronic Gates A gate is a device in which one signal controls another. In a vacuum tube, the grid could block or allow flow from input to output. So this is just like a relay. Transistors are very similar – just a lot smaller.

Silicon The “computer revolution” came about when VLSI technology allowed a single chip to contain LOTS of transistors. A Pentium has about 50 million transistors. That would have been a lot of vacuum tubes. Manufacturing cost is something like $ per transistor.

Assessing Computation How can we assess a computational technology? This turns out to be REALLY HARD! Knowing how fast a device can do one task doesn't tell us a lot about other tasks. Approaches: * Clock rate (not very accurate) * MFLOPS – millions of floating-point operations per second (only helps for numeric calculations) * Specific benchmarks Units: tasks / second

Information Storage Storing information is as important as processing it. This all started with written language. Important ideas: * A precise relationship between spoken and written languages * The ability to make a "perfect copy" of a document * A medium (clay, paper, …) used to preserve information over time

Information Access A large information repository is much more useful if it can be accessed quickly via mechanical means. Punch cards predate computers (by a long shot!) and were used to store and process large volumes of information. A key insight was that alphabetic information can be processed as if it were numeric. Herman Hollerith patented a system in which needles sensed the presence or absence of holes in a card. This converted information into electric impulses. His machine was used for the 1890 census. What company did he start?

Storage Media

Assessing Storage Technology * Read/write or read-only * Latency – the time it takes to find what you want (time) * Transfer rate – how fast you get the information (bits / second) * Capacity (bits) * Cost / bit ($) * Error rate (errors / bit) * Durability (time)

Interfacing Getting (electronic) information from or to the real world is another BIG part of computing. The first big breakthrough was a loom controlled by punched cards.

Interface Technology The big idea here is converting between electronic representation and human sensing for audio and video objects. Other interface technology includes pointing (mouse), typing (keyboard), and even GPS. Interfaces usually come in two “layers” – one that is specific to the device and one “general” part like USB or Firewire.

Babbage’s Insight Instead of programming a computer mechanically, use the storage to encode the program. That is, instead of building a machine to accomplish just one task, build a general machine that could be programmed to do any task (a “stored program” computer). The same data that a program manipulates can also be the program that controls the machine.

Building Electronic Computers So what is it that makes a computer go? As you peel back the layers (circuit boards, chips, memory, processing, …) you finally get to the “bottom” – the indivisible atoms that a computer is composed of: Logic Gates

Logic Gates The basic building block of a digital computer is the “logic gate”, a hardware device that calculates a very simple function on 1’s and 0’s. Logic gates have been constructed with mechanical relays, vacuum tubes, and (mainly) transistors on an integrated circuit. Logic gates use power as they switch – that’s why you have a fan in your computer. We assess the “quality” of a logic gate in a number of ways: speed, size, energy use

Moore’s Law Technology improves at an exponential rate. Integrated circuit capacity (number of gates) doubles every 18 months or so. Problems: some systems improve much faster than others (disk speed doesn’t increase as fast as processor speed). Circuit size can’t keep getting smaller forever. It’s very hard to make all of the transistors on a chip useful – doubling the number of transistors does not double the effective speed of a processor

How Logic Gates Work A gate has “inputs” and “outputs”. The outputs are determined by the inputs. For a relay: input is voltage on the coil, outputs are connections between terminals Transistors are similar except a lot smaller Key characteristics: Delay: how soon the output has the “right” answer Power dissipation: how much heat is generated Size: how much silicon is needed on the chip

Building Logic Gates Many different sorts of logic gates are used in computing. A Pentium has about 50,000,000 transistors.

A Universal Logic Gate NAND (1 = True, 0 = False): X=0 Y=0 → Output 1; X=0 Y=1 → 1; X=1 Y=0 → 1; X=1 Y=1 → 0. Let's create a "circuit" using these gates …

Boolean Algebra Logic designers use boolean algebra to understand their circuits. There is a lot of math in the boolean world; writing "+" for OR and "*" for AND: a*(b*c) = (a*b)*c, a*(b+c) = a*b + a*c, a + a' = 1. Truth tables are used to specify boolean functions.

Adding Two Numbers Truth table for addition (inputs x, y; outputs Sum, Carry): x=0 y=0 → Sum=0 Carry=0; x=0 y=1 → Sum=1 Carry=0; x=1 y=0 → Sum=1 Carry=0; x=1 y=1 → Sum=0 Carry=1.

Memory Consider this circuit:

Wires, Clocks, and Words A clock is the "heartbeat" of a computer. The signal on a wire carries a new value at each clock. The clock determines the flow rate of information along a wire. If you want information to flow faster you use more wires (bits). A "word" is a fixed number of bits/wires used in calculations and storage. A "byte" is an 8 bit word (often used to store characters).

More About Hardware All the gates can work in parallel: hardware is not restricted to a "one step at a time" model like programs generally are. Wire length is just as big a problem as gate speed: signals move at close to the speed of light, but that's not very fast at the scale of a chip.

Scalability Remember that sizes don't mean much. A 64 bit machine doesn't run 2x faster than a 32 bit one; a 50M transistor chip isn't twice as fast as a 25M one; and a 4 GHz machine won't execute programs twice as fast as a 2 GHz one (why???). Will a task get done twice as fast if you use twice as many workers?

Stored Programs The key idea in a computer is to execute arbitrary programs for the user. Computers use "machine language" - language understood by the hardware that controls what the computer does. Different processors have different machine languages. Somehow, all of our programs need to get turned into this machine language.

Instructions Machine language breaks the computational process into instructions. An instruction is just data (0s and 1s) that is executed. Instructions have two representations: * Binary (in the computer) * Textual (the assembler or textbook) The translation between binary and textual representations is done using an assembler (or disassembler).

Data Data in an executing program can live in many different places: * Memory * Registers * Disk * Cache An instruction mutates this data (the "processor state") in some predefined way

Programming * Bad old days: Program by rewiring the computer logic * Early computers: program written in binary * Back in my day: program written in assembly language * Low level languages (Fortran, C, Pascal) * High level languages (Java, Haskell, C#, …) What's the difference between low and high level languages?

About This Course Why are we here? * Systems are composed of both "hard" and "soft" components. We need to understand what "hard" components can do * Hardware is fundamentally different from software: it is inherently parallel. We'll learn about exploiting parallelism in computation * All "soft" programs are executed on hardware – we can't understand how a program performs without knowing a lot about hardware * Many issues in hardware design are also present in software systems

A Plethora of Systems * High performance computing * Servers * Desktop systems * Highly functional embedded applications (cell phones, PDAs, cameras) * Minimally functional embedded applications (calculators, toys, controllers) Understanding architecture helps a lot at both ends of this spectrum

A Short History of Computer Architecture * 40's, 50's: Simple architectures, emphasis on building hardware capable of carrying out basic math and control * 60's: Development of "tricks" in the processor to make execution go faster * 70's: mini-computers (PDP-8), supercomputers (Cray 1) * 80's: RISC vs CISC debate, language-driven architecture * 90's: Increasing clock speed, multi-processor architectures, memory concerns, networking * Recent: clock speed limitations: parallel processing, cheap embedded systems, pervasive use of computing. Networking becomes very important.