Introduction to Computer Science


Introduction to Computer Science CS A101

What is Computer Science?
First, some misconceptions:
- Misconception 1: I can put together my own PC, am good with Windows, and can surf the net with ease, so I know CS.
- Misconception 2: Computer science is the study of how to write computer programs.
- Misconception 3: Computer science is the study of the uses and applications of computers and software.

Computer Science
Computer science is the study of algorithms, including:
- Their formal and mathematical properties
- Their hardware realizations
- Their linguistic realizations
- Their applications

What Will We Cover?
A broad survey of computer science topics: some depth in programming, but more emphasis on breadth. Topics:
- History
- Data representation
- Computer architecture (software perspective)
- Operating systems
- Networking
- Algorithms
- Theory
- Database systems
- Programming (more depth than the other topics)

Terminology
- Algorithm: a set of steps that defines how a task is performed
- Program: a representation of an algorithm
- Programming: the process of developing a program
- Software: programs and algorithms
- Hardware: physical equipment

History of Algorithms
The study of algorithms was originally a subject in mathematics. Early examples of algorithms:
- The long division algorithm
- The Euclidean algorithm
Gödel's Incompleteness Theorem: some problems cannot be solved by algorithms.

Example: Euclid’s algorithm
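The original slide presented this as a figure; here is a minimal Python sketch of the same idea (the function name and sample values are illustrative, not from the slides):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the larger number
    with the remainder of dividing it by the smaller, until the
    remainder is zero; the last nonzero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6: 48 = 2*18 + 12, 18 = 1*12 + 6, 12 = 2*6 + 0
```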

Central Questions of Computer Science
- Which problems can be solved by algorithmic processes?
- How can algorithm discovery be made easier?
- How can the techniques of representing and communicating algorithms be improved?
- How can the characteristics of different algorithms be analyzed and compared?

Central Questions of Computer Science (continued)
- How can algorithms be used to manipulate information?
- How can algorithms be applied to produce intelligent behavior?
- How does the application of algorithms affect society?

The central role of algorithms in computer science

Abstraction
- Abstraction: the distinction between the external properties of an entity and the details of the entity's internal composition
- Abstract tool: a "component" that can be used without concern for the component's internal properties
Abstraction simplifies many aspects of computing and makes it possible to build complex systems.
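To make the idea concrete, here is a small hypothetical example (not from the slides): a Stack used as an abstract tool, where callers depend only on the external interface, never on the internal composition.

```python
class Stack:
    """An abstract tool: users rely only on push/pop behavior."""

    def __init__(self):
        self._items = []          # internal composition, hidden from callers

    def push(self, item):
        self._items.append(item)

    def pop(self):
        # The internal list could be swapped for a linked list
        # without affecting any code that uses the Stack.
        return self._items.pop()

s = Stack()
s.push("a")
s.push("b")
print(s.pop())  # "b" -- correct external behavior is all the caller needs
```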

A Brief History of Computing
Roots in the mathematical sciences and in computational devices:
- The abacus: a counting device whose bead positions store the state of a computation
- Blaise Pascal's Pascaline (1642): a manual gear system for adding numbers
- Charles Babbage's Difference Engine (designed 1822): could not be built with the tools of the era, though it was eventually built later using modern tools
- Babbage's Analytical Engine (1837): a steam-powered, more general computational device with conditional control, also too complex to build in the 19th century

Roots of Computing…
Herman Hollerith's tabulating machine:
- A former MIT lecturer, Hollerith developed a machine to read punch cards, inspired by train conductors punching tickets
- Used in the 1890 US Census
- His company became IBM in 1924

Roots of Computing…
- 1941: Konrad Zuse's Z3, the first computing machine to use binary code and a precursor to modern digital computers
- 1944: Harvard Mark I, built by Howard Aiken
- 1946: ENIAC, the first all-electronic digital computer; it ushered in the "mainframe" era of computing (the "first generation") and used 18,000 vacuum tubes (similar to a light bulb, but with a plate in the middle that controls the flow of electrical current)

The von Neumann Model
On the ENIAC, all programming was done at the digital logic level. Programming the computer involved moving plugs and wires.

Roots of Computing…
1945: John von Neumann defines his architecture for an "automatic computing system", the basis for the architecture of modern computing. The computer:
- Accepts input
- Processes data using a CPU
- Stores data in memory, using the stored-program technique of keeping instructions together with data in memory (illustrated in the sketch below)
- Produces output
This design led to the EDVAC and UNIVAC computers.
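As a rough illustration of the stored-program technique (a toy sketch, not anything from the course), the following puts instructions and data in the same memory and runs a fetch-decode-execute loop over them:

```python
# Instructions and data share one memory (addresses 0-8).
memory = [
    ("LOAD", 6),     # 0: copy memory[6] into the accumulator
    ("ADD", 7),      # 1: add memory[7] to the accumulator
    ("STORE", 8),    # 2: write the accumulator to memory[8]
    ("HALT", None),  # 3: stop
    None, None,      # 4-5: unused
    40, 2, 0,        # 6-8: data, stored alongside the program
]

pc, acc = 0, 0                 # program counter and accumulator
while True:
    op, addr = memory[pc]      # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # 42 -- the stored result of 40 + 2
```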

Roots of Computing…
1951: UNIVAC, the Universal Automatic Computer. When we say there is a "bug" in a program, we mean it doesn't work right; the term was popularized by an actual moth that Grace Hopper's team found in a relay of the Harvard Mark II.

The Second Generation: Transistors (1958-1964)
- Invented in 1947 at Bell Labs by Bardeen, Shockley, and Brattain
- Transistors generate less heat
- Transistors are smaller, faster, and more reliable
- The first transistors were smaller than a dime
- UNIVAC II was built using transistors
A transforming invention, like the invention of the steam engine. The first transistor was about the size of a thumb, made from a paper clip, gold foil, germanium, and coiled wire. Germanium, like silicon, is a semiconductor: it conducts only a trickle of electricity, which can be exploited to amplify signals by switching the material between conductor and insulator. The discovery was actually an accident; there is some evidence Shockley was annoyed and had been looking for something else. It took until 1948 for the real value to be seen, and by 1951 transistor technology could be licensed for $25,000. Shockley went on to plant the seeds of Silicon Valley. The second-generation UNIVAC was produced with transistors instead of vacuum tubes and had greater performance: it could perform thousands of operations a second.

The Third Generation: Integrated Circuits (1964-1990)
- Multiple transistors on a single chip
- IBM 360: the first mainframe to use ICs
- DEC PDP-8: the first minicomputer
- End of the mainframe era; on to the minicomputer era
The 360 also used the first general-purpose operating system, which allowed computers to be more easily programmed. In 1971 Intel invented the 4004, the world's first microprocessor, which led to the desktop computing world of today. Bill Gates and Paul Allen founded Microsoft with a version of BASIC written for the Altair, and Jobs and Wozniak built the first Apple computer in their garage in 1976. By the time Intel announced the Pentium processor in 1993, the Wintel (Windows/Intel) dynasty was off and running. Microprocessors had become 5,000 times faster in 24 years.

Integrated Circuit
Invented at Texas Instruments by Jack Kilby and, independently, at Fairchild by Robert Noyce. "What we didn't realize then was that the integrated circuit would reduce the cost of electronic functions by a factor of a million to one; nothing had ever done that for anything before." - Jack Kilby

Minicomputer Era
- Made possible by DEC, Data General Corporation, and IBM
- Medium-sized computers, e.g. the DEC PDP series
- Much less expensive than mainframes, making computing accessible to smaller organizations
- Built with transistors and integrated circuits

Personal Computer Era
- 1971: first microprocessor, the Intel 4004
- 1975: MITS Altair "kit"
- 1976: Apple
- 1981: IBM PC, using the Intel 8088
- 1984: Macintosh, which introduced the GUI (graphical user interface) we still use today
- Some critics, such as Don Norman, point to complexity
- Will the next interface use delegation instead of direct manipulation?

Today: The Internetworking Era?
- The computer as a communication device across networks
- World Wide Web, Internet
- Publishing, data sharing, real-time communications

Supercomputers
- The most powerful and expensive computers, containing numerous very fast processors that work in parallel
- IBM Roadrunner: 1,105 TeraFlops (floating-point operations per second), with 12,960 IBM PowerXCell 8i and 6,480 AMD Opteron dual-core processors
- At 2 TeraFlops, a machine could do in 1 second what would take every man, woman, and child 125 years of nonstop work on hand calculators
- Used by researchers and scientists to solve very complex problems; cost millions of dollars
Sandia National Laboratories used a supercomputer to model fallout from nuclear explosions; Eli Lilly and Dow Chemical used one to design and evaluate new combinations of chemical elements to identify promising drugs; Los Alamos used them for nuclear simulations; and they are also used for weather and particle simulation. More parallelism does not equal more speed: specific applications are needed that can exploit parallel processing, and studies have indicated the inherent parallelism of typical programs is only 2-3. IBM's ASCI White ran at 12 TeraFlops, was the size of two basketball courts, and had 8,192 copper processors. Intel's ASCI Red at Sandia had 9,400 200 MHz Pentium Pro processors and 300 GB of RAM, cost $55 million to build, and hit 1.8 TeraFlops on benchmarks, more than all the mainframes of the world combined. Supercomputers also require an extensive staff to operate and maintain. SETI@home? Over 2,000,000 users, 375,000 years of computing time, and 15 TeraFlops, more than ASCI Red!

CPU Clock Speeds
More MHz is better, but clock speed is not a good benchmark across different computer architectures. MHz measures the clock rate: 1 MHz is one million cycles per second, so a 1 MHz machine could theoretically execute a million instructions per second. But what an instruction accomplishes varies across architectures: a task might take 100 instructions on a Pentium but only 10 on some other type of machine.
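A quick sketch of that point with made-up numbers (the instruction counts and clock rates here are hypothetical, chosen only to illustrate the comparison):

```python
def task_seconds(instructions, cycles_per_instruction, clock_hz):
    """Execution time = total cycles needed / cycles available per second."""
    return instructions * cycles_per_instruction / clock_hz

# A 200 MHz machine needing 100 instructions for a task...
machine_a = task_seconds(100, 1, 200e6)   # 5e-07 seconds
# ...loses to a 50 MHz machine that needs only 10 instructions.
machine_b = task_seconds(10, 1, 50e6)     # 2e-07 seconds
print(machine_a > machine_b)              # True: more MHz, yet slower
```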

Moore's Law
1965: Computing power doubles roughly every 18 months.
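The arithmetic behind an 18-month doubling, as a small sketch: after t years the growth factor is 2^(t / 1.5).

```python
# Growth factor implied by doubling every 1.5 years.
for years in (3, 6, 15, 24):
    print(f"{years} years -> {2 ** (years / 1.5):,.0f}x")
# 3 years -> 4x, 6 years -> 16x, 15 years -> 1,024x, 24 years -> 65,536x
```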

Chip Production
- An ingot of purified silicon, about 1 meter long, is sliced into thin wafers
- Chips are etched, much like photography: UV light is shone through multiple masks, and circuits are laid down through the masks
- The process takes about 3 months
- Wafers are sliced into chips using diamond saws
- Yield is the percentage of usable chips per wafer (a quick arithmetic sketch follows below)
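The yield figure is simple arithmetic; the die counts below are hypothetical:

```python
dies_per_wafer = 400   # total chips etched on one wafer (assumed)
good_dies = 300        # chips that pass testing (assumed)
yield_pct = good_dies / dies_per_wafer * 100
print(f"Yield: {yield_pct:.0f}%")  # Yield: 75%
```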

Fabrication
- "Wires": chemical or vapor deposition
- SiO2 gate
- Doping
- Annealing

The Shrinking Chip
1 micron is 1 millionth of a meter. For scale: a human hair is 100 microns wide, a bacterium 5 microns, a virus 0.8 microns.
- Early microprocessors: 10-15 micron technology
- 1997: 0.35 micron
- 1998: 0.25 micron
- 1999: 0.18 micron
- 2001: 0.13 micron
- 2003: 0.09 micron
- 2007: 0.065 micron
- 2009: 0.045 micron
The physical limit is believed to be around 0.016 microns and should be reached around 2018. Smaller process technology means less heat and more transistors packed into the same area as before.

Size