SD1230 Unit 1: Evolution of Computing
Objectives During this unit, we will cover the following course objectives: Describe the history and evolution of computing. Identify the characteristics of desktop, Web, and mobile technology.
Learning Outcomes Completing this unit should help enable you to: Describe the evolution of the computer and mobile computing and their economic and social impact. Discuss the impact of key trends in mobile computing technologies. Explain why standards are important. Describe the primary elements in a computer's architecture. Explain how program data is organized in memory. Explain how an operating system manages multiple processes. Demonstrate how to launch, pause, and start the ITT Tech Lab virtual machine. Describe how running a virtual machine impacts available resources.
Share Computing Experiences What types of devices have you used? Tablet PC iPhone Macintosh Android Windows Phone
History of the Computer 19th century Tables used for calculation by “human computers” Charles Babbage The Difference Engine Analytical Engine George Boole Symbolic logic system Herman Hollerith Device to tabulate the census data Used punched cards for input Hollerith card Source: Dilligan, R. J. (1998). Computing in the Web age: A Web-interactive introduction. Boston, MA: Kluwer Academic Publishers.
History of the Computer Early 20th century 1925 – Vannevar Bush Differential analyzer 1936 – Alan Turing described the universal Turing machine Konrad Zuse Built the Z1 computer (completed 1938) Later developed the Z2, Z3, and Z4 Designed Plankalkül, an early high-level programming language 1947 – The first computer “bug” (a moth) was logged
History of the Computer Mid 20th century ENIAC Weighed 30 tons Consumed 160 kW of power EDVAC Introduced the von Neumann architecture Disk drives 1960s – Mainframe computers
History of the Computer Programming Language Evolution 1949 – Short Code was introduced 1954 – IBM began developing FORTRAN 1958 – FORTRAN II, ALGOL, LISP 1959 – COBOL 1969 – B 1970 – Pascal 1972 – C and Smalltalk 1975 – Tiny BASIC
History of the Computer The Internet and the World Wide Web 1969 – ARPANET, the precursor of the Internet SGML DTD 1983 – Adoption of TCP/IP as the Internet standard protocol 1989 – Timothy Berners-Lee proposed the World Wide Web and went on to develop HTML
SGML Example
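The slide title points to an SGML example; as an illustration (the element names below are hypothetical, not taken from the original slide), a minimal SGML document with its DTD might look like:

```sgml
<!DOCTYPE letter [
  <!ELEMENT letter (to, from, body)>
  <!ELEMENT to     (#PCDATA)>
  <!ELEMENT from   (#PCDATA)>
  <!ELEMENT body   (#PCDATA)>
]>
<letter>
  <to>Students</to>
  <from>Instructor</from>
  <body>The DTD above defines which elements a letter may contain.</body>
</letter>
```

The DTD (document type definition) declares the allowed structure; HTML was originally defined as an SGML application in just this way.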
History of Computing Company Networks Apple’s LocalTalk Server-based networks
History of Mobile Phones 1973 - 1988 The Brick Era Source: Fling, B. (2009). Mobile design and development: Practical techniques for creating mobile sites and Web apps. Sebastopol, CA: O’Reilly Media, Inc.
History of Mobile Phones 1988 - 1998 The Candy Bar Era
History of Mobile Phones 1998 - 2008 The Feature Phone Era
History of Mobile Phones 2002 to Present The Smart Phone Era
History of Mobile Phones Now The Touch Era Source: www.apple.com
What a Program Does Add Multiply Compare Input Output
Input Input comes from many places: Keyboard Mouse Microphone Files Databases Other software
Processing Processing can include: Mathematical operations Comparisons
Output Output can be to the: Screen File Database Printer Other software
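The input–process–output flow described above can be sketched in Python (the variable and function names here are illustrative, not from the slides):

```python
# A minimal input-process-output example:
# input comes from a list (it could equally come from the keyboard or a file),
# processing is a comparison plus addition, and output goes to the screen.

def process(values):
    """Return the sum of the values greater than zero (compare + add)."""
    total = 0
    for v in values:
        if v > 0:        # comparison
            total += v   # addition
    return total

readings = [4, -2, 7, 0, 3]   # input
result = process(readings)    # processing
print(result)                 # output: 14
```

The same three stages apply whether the program runs on a desktop, a server, or a phone; only the sources of input and destinations of output change.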
How a Computer Processes Information A processor is an electronic device that understands only two states: On Off The On state is represented by a 1. The Off state is represented by a 0.
Bits and Bytes
Bit: a single 0 or 1
Nibble: 4 bits, e.g. 0111
Byte: 8 bits, e.g. 0111 1010
Bytes A byte can have a value between 0 and 255:
0000 0000 = 0
1111 1111 = 255
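The byte range can be checked directly in Python, which is used here only for illustration:

```python
# A byte is 8 bits; all bits off is 0, all bits on is 255.
low = int("00000000", 2)    # parse a binary string, base 2
high = int("11111111", 2)
print(low, high)            # 0 255

# Equivalently, 8 bits can hold 2**8 = 256 distinct values: 0 through 255.
assert high == 2**8 - 1
```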
Some Decimal-to-Binary Conversions: Counting Up
Decimal   Binary
1         0000 0001
2         0000 0010
3         0000 0011
4         0000 0100
5         0000 0101
6         0000 0110
7         0000 0111
8         0000 1000
9         0000 1001
10        0000 1010
11        0000 1011
12        0000 1100
13        0000 1101
14        0000 1110
15        0000 1111
16        0001 0000
17        0001 0001
18        0001 0010
19        0001 0011
20        0001 0100
21        0001 0101
22        0001 0110
Some Decimal-to-Binary Conversions: Counting Down
Decimal   Binary
255       1111 1111
254       1111 1110
253       1111 1101
252       1111 1100
251       1111 1011
250       1111 1010
249       1111 1001
248       1111 1000
247       1111 0111
246       1111 0110
245       1111 0101
244       1111 0100
243       1111 0011
242       1111 0010
241       1111 0001
240       1111 0000
239       1110 1111
238       1110 1110
237       1110 1101
236       1110 1100
235       1110 1011
234       1110 1010
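The conversions in these tables can be reproduced with Python's built-in base-conversion functions:

```python
# Convert decimal to an 8-bit binary string and back again.
for n in (10, 22, 234, 255):
    bits = format(n, "08b")    # "08b" = binary, zero-padded to 8 digits
    print(n, bits)
    assert int(bits, 2) == n   # round-trip check: binary string back to int
```

Running this prints each decimal value next to its 8-bit binary form, matching the rows of the tables above.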
Bitwise Operations: AND, OR, NOT
Input A   Input B   A AND B   A OR B
0         0         0         0
0         1         0         1
1         0         0         1
1         1         1         1
NOT takes a single input and inverts it: NOT 0 = 1, NOT 1 = 0
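Python provides these operations directly as the bitwise operators &, |, and ^ (the XOR trick below emulates a fixed-width NOT):

```python
a, b = 0b1100, 0b1010            # binary literals: 12 and 10

print(format(a & b, "04b"))      # AND: only bits set in both -> 1000
print(format(a | b, "04b"))      # OR: bits set in either    -> 1110

# NOT on a fixed 4-bit value: flip all 4 bits by XOR-ing with 1111.
print(format(a ^ 0b1111, "04b")) # NOT a (4-bit)             -> 0011
```

XOR against an all-ones mask is used instead of Python's ~ operator because Python integers are arbitrary-width, so ~a would yield a negative number rather than a 4-bit result.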
Adding in Binary
Decimal: 1 + 1 = 2
Binary (carry of 1): 01 + 01 = 10
Adding in Binary: A larger number
Decimal: 5 + 4 = 9
Binary (carry of 1): 101 + 100 = 1001
Adding in Binary: A larger number
Decimal: 7 + 6 = 13
Binary (carries of 1 1): 0111 + 0110 = 1101
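The worked binary additions above can be verified in Python, which adds the values as integers and prints the result back in binary:

```python
# Each pair is (x, y) as binary strings: 1+1, 5+4, and 7+6.
pairs = [("01", "01"), ("101", "100"), ("0111", "0110")]
for x, y in pairs:
    total = int(x, 2) + int(y, 2)       # parse base 2, add as integers
    print(f"{x} + {y} = {total:b}")     # show the sum in binary
```

This prints 10, 1001, and 1101 as the three sums, matching the hand-worked carries.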
Hexadecimal To make binary numbers easier to read, they are often converted to hexadecimal, also called base 16. The numbers 10–15 are represented by letters, and a hexadecimal number is preceded by 0x.
Decimal   Hexadecimal
10        A
11        B
12        C
13        D
14        E
15        F
Some Hexadecimal Numbers
Decimal   Breakdown   Hexadecimal
0         0 + 0       0x00
1         0 + 1       0x01
2         0 + 2       0x02
10        0 + 10      0x0A
11        0 + 11      0x0B
12        0 + 12      0x0C
13        0 + 13      0x0D
14        0 + 14      0x0E
15        0 + 15      0x0F
16        16 + 0      0x10
17        16 + 1      0x11
18        16 + 2      0x12
19        16 + 3      0x13
20        16 + 4      0x14
21        16 + 5      0x15
22        16 + 6      0x16
23        16 + 7      0x17
24        16 + 8      0x18
25        16 + 9      0x19
26        16 + 10     0x1A
27        16 + 11     0x1B
28        16 + 12     0x1C
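Python understands 0x literals directly and converts between bases with the built-in hex() and int() functions:

```python
print(hex(26))           # 0x1a  (26 = 16 + 10, so digit 1 then digit A)
print(int("0x1C", 16))   # 28    (16 + 12)

# 0x literals are just ordinary integers written in base 16.
assert 0x10 == 16
assert 0xFF == 255
```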
Memory Addresses Data and instructions are stored in memory. Specific blocks of data or instructions are accessed using a memory address. Memory addresses typically are represented in hexadecimal.
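As an illustration of hexadecimal addresses, CPython's id() function returns an object's memory address (this is a CPython implementation detail, not a language guarantee), and it is conventionally displayed in hex:

```python
value = 42
address = id(value)   # in CPython, the object's address in memory
print(hex(address))   # printed in hexadecimal; the value varies per run
```

Debuggers and low-level tools display addresses the same way, because a hex digit maps cleanly onto exactly four bits of the underlying binary address.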
Summary In this unit, we covered the following topics: The history of computing The history of mobile phones The input-process-output model The binary number system The hexadecimal number system How computers use memory