Presentation on theme: "Who invented the computer?"— Presentation transcript:

1 Who invented the computer?
A quick trip through the history of computer science

2 Charles Babbage [1791-1871]
English mathematician
1834: Analytical Engine, the first concept of a programmable computer
30 x 10 m machine, steam-engine powered
input (program and data): punched cards (as in Jacquard's looms)
programming language like a modern assembly language, with loops and conditional branching
output: printer, plotter, and bell!

Joseph-Marie Jacquard (1752-1834) invented a machine that could weave cloth with a pattern specified by a stack of punched cards, thereby revolutionizing weaving practice, first in France and then all over the world. "Although it is a wonderful invention, the Jacquard loom was no more a computer than is a player piano." [M. Davis]

Charles Babbage (December 26, 1791 – October 18, 1871) was an English mathematician, analytical philosopher and (proto-)computer scientist who was the first person to come up with the idea of a programmable computer. Parts of his uncompleted mechanisms are on display in the London Science Museum. In 1991, working from Babbage's original plans, a Difference Engine was completed, and it functioned perfectly. It was built to tolerances achievable in the 19th century, indicating that Babbage's machine would have worked.

The Analytical Engine, an important step in the history of computers, is the design of a mechanical, modern general-purpose computer by the British professor of mathematics Charles Babbage. It was first described in 1837, but Babbage continued to work on the design throughout his life, which ended in 1871. Because of financial, legal, and technical problems, the engine was never actually finished. It is generally acknowledged that the design was correct and that the engine would have worked; however, the precision required for the gears would not be achievable until decades later. Logically comparable general-purpose computers did not come into existence until about 100 years later.

Babbage began by designing and partially constructing his Difference Engine, a mechanical special-purpose computer designed to tabulate logarithms and trigonometric functions by evaluating approximating polynomials (a modern sketch of this tabulation-by-differences idea follows these notes). When he realized that a much more general design was possible, he started to work on the Analytical Engine instead.

The machine was to be powered by a steam engine and would have been over 30 metres long and 10 metres wide. The input (programs and data) was to be provided to the machine on punched cards, a method being used at the time to direct mechanical looms. For output, the machine was planned to have a printer, a curve plotter and a bell. The machine could also punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was a store (i.e., memory) capable of holding 1,000 numbers of 50 digits each. An arithmetical unit (called the "mill") was able to perform all four arithmetical operations.

The programming language to be employed was akin to modern-day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been "equivalent" to modern computer programming languages in the range of programs it could specify. Three different types of punched cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards.

In 1842, the Italian mathematician Luigi Menabrea, who had met Babbage during his travels in Italy, wrote a description of the engine in French; it was translated into English and extensively annotated by Ada King, Countess of Lovelace, in 1843. She had already become interested in the engine years earlier. Based on her additions to Menabrea's paper, she has been described as the first computer programmer. The modern computer programming language Ada is named in her honour.
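The Difference Engine's tabulation method can be shown in a few lines of modern code. The sketch below is only an illustration of the underlying method of finite differences (the function name and the example polynomial are mine, not Babbage's): after a handful of starting values are computed directly, every further table entry is produced by additions alone, which is all the engine's columns of wheels had to do.

```python
# Illustrative sketch (modern Python, not a model of the machine itself) of
# tabulating a polynomial by finite differences, the principle behind
# Babbage's Difference Engine.

def tabulate(poly, count):
    """Tabulate p(0), p(1), ..., p(count - 1) for poly = [c0, c1, c2, ...]."""
    def p(x):
        return sum(c * x**k for k, c in enumerate(poly))

    # Seed the difference columns from the first degree+1 directly computed values.
    row = [p(x) for x in range(len(poly))]
    column = []                            # [value, 1st difference, 2nd difference, ...]
    while row:
        column.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    values = []
    for _ in range(count):
        values.append(column[0])
        for i in range(len(column) - 1):   # additions only, as on the engine
            column[i] += column[i + 1]
    return values

# p(x) = 5 + 3x + 2x^2
print(tabulate([5, 3, 2], 6))   # [5, 10, 19, 32, 49, 70]
```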

3 The Right Honourable Augusta Ada, Countess of Lovelace
[1815-1852]
child of the poet Lord Byron
translates Luigi Menabrea's memoir on the Analytical Engine
appends a set of Notes which also specify a method to calculate Bernoulli numbers with the Engine (the world's first computer program)
speculates that the Analytical Engine could create graphics or compose music
"We must say most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard-loom weaves flowers and leaves." (A.A.L.)

Augusta Ada King, Countess of Lovelace (December 10, 1815 – November 27, 1852) is mainly known for having written a description of Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Ada was the only legitimate child of the poet Lord Byron and his wife, Annabella Milbanke, a cousin of Lady Caroline Lamb, with whom he had an affair that scandalized Regency London. Ada was named after Byron's half-sister, Augusta Leigh, by whom he was rumoured to have fathered a child. It was Augusta who encouraged Byron to marry to avoid scandal, and he reluctantly chose Annabella. On January 16, 1816, Annabella left Byron, taking one-month-old Ada with her. On April 21, Byron signed the Deed of Separation and left England for good a few days later. He never saw either of them again.

Her husband was William King, 8th Baron King, later 1st Earl of Lovelace. Her full name and title for most of her married life was The Right Honourable Augusta Ada, Countess of Lovelace. She is widely known in modern times simply as Ada Lovelace.

During a nine-month period in 1842-43, Ada translated for Babbage the Italian mathematician Luigi Menabrea's memoir (written originally in French) on Babbage's newest proposed machine, the Analytical Engine. To the article she appended a set of Notes which specified in complete detail a method for calculating Bernoulli numbers with the Engine, recognized by historians as the world's first computer program. She speculated that such a machine could create graphics or compose music; Babbage never built a working model.
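As a rough modern companion to Lovelace's Note G, the sketch below computes Bernoulli numbers from the standard recurrence. It illustrates what her program computed; it is not a transcription of her operation cards, and it uses today's convention and indexing for the numbers.

```python
# Bernoulli numbers from the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0
# (m >= 1), with B_0 = 1. A modern illustration only, with B_1 = -1/2.

from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))          # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```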

4 George Boole [1815-1864]
English mathematician
1847: "Mathematical Analysis of Logic"
1854: "An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities"
invented Boolean algebra
demonstrated that logical deduction could be developed as a branch of mathematics
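To make "logical deduction as algebra" concrete, here is a small illustrative check, in modern code rather than Boole's own notation, that two classical laws hold for every assignment of truth values.

```python
# Illustrative sketch: treating logic as algebra over {True, False}, as Boole
# proposed, and verifying two laws by exhaustive truth tables.

from itertools import product

# De Morgan's law: not (x and y)  ==  (not x) or (not y)
assert all((not (x and y)) == ((not x) or (not y))
           for x, y in product((False, True), repeat=2))

# Modus ponens as an identity: ((x -> y) and x) -> y is always true,
# where x -> y is encoded as (not x) or y.
def implies(a, b):
    return (not a) or b

assert all(implies(implies(x, y) and x, y)
           for x, y in product((False, True), repeat=2))

print("Both identities hold for all truth assignments.")
```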

5 Claude Shannon [1916-2001]
the "father of information theory"
1937: with his MIT master's thesis, "A Symbolic Analysis of Relay and Switching Circuits", the "most important master's thesis of the century", essentially founded practical digital circuit design by showing how G. Boole's algebra of logic could be used to design complex switching circuits

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) has been called "the father of information theory". Shannon was born in Petoskey, Michigan and was a distant relative of Thomas Edison. While growing up, he worked as a messenger for Western Union. Shannon began studying electrical engineering and mathematics at the University of Michigan in 1932, and received his bachelor's degree in 1936. He attended MIT for graduate school, where he worked on Vannevar Bush's differential analyser, an analog computer. He proved several results relating Boolean algebra to electronic logic networks (e.g., relays and switches) in his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, and with it essentially founded practical digital circuit design. Professor Howard Gardner of Harvard University called that thesis "possibly the most important, and also the most famous, master's thesis of the century", and in 1940 it earned Shannon the Alfred Noble Prize, an American engineering award.

After working on genetics at Cold Spring Harbor Laboratory, Shannon completed his PhD at MIT in 1940; his thesis is titled An Algebra for Theoretical Genetics. He then worked at Bell Labs until he returned to MIT in the 1950s.

In 1948 Shannon published A Mathematical Theory of Communication. This work focuses on the problem of how to reconstruct, at a receiving point, the information a sender has transmitted. In this fundamental work he used tools from randomized analysis and large deviations, which were in their nascent development stages at that time. Shannon developed information entropy as a measure of the uncertainty in a message, essentially inventing information theory. His later book with Warren Weaver, The Mathematical Theory of Communication (Univ. of Illinois Press), is brief and surprisingly accessible to the non-specialist. Another notable paper, published in 1949, is Communication Theory of Secrecy Systems, which essentially founded the mathematical theory of cryptography. He is also credited with introducing the sampling theorem, which is concerned with representing a continuous-time signal by a (uniform) discrete set of samples.

Shannon is known for his thinking prowess; many have testified that he was able to dictate entire academic papers from memory alone, without correction. Outside of his academic pursuits, Shannon was interested in juggling, unicycling, and chess. He also invented many devices, including a chess-playing machine, a rocket-powered pogo stick, and a flame-throwing trumpet for a science exhibition. He met his wife Betty Shannon when she was a typist at Bell Labs. From 1958 to 1978 he was a professor at MIT. To commemorate his achievements, there were celebrations of his work in 2001, and there are currently three copies of a statue of Shannon: one at the University of Michigan, one at MIT and one at Bell Labs.
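Shannon's observation can be restated in today's terms: switches wired in series behave like AND and switches wired in parallel behave like OR, so a relay network can be described and simplified with Boole's algebra. The circuit below is a made-up example for illustration, not one taken from the thesis.

```python
# A small sketch of the series/parallel correspondence: switches in series
# conduct only if both are closed (AND); switches in parallel conduct if
# either is closed (OR). The lamp circuit is invented for illustration.

from itertools import product

def series(a, b):     # two switches in series
    return a and b

def parallel(a, b):   # two switches in parallel
    return a or b

# Lamp lights if (s1 AND s2) OR ((NOT s1) AND s3): a simple selector circuit.
def lamp(s1, s2, s3):
    return parallel(series(s1, s2), series(not s1, s3))

for s1, s2, s3 in product((False, True), repeat=3):
    print(s1, s2, s3, "->", "on" if lamp(s1, s2, s3) else "off")
```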

6 Alan Turing [1912-1954]
British mathematician, the "father of Computer Science"
1936: "On Computable Numbers, with an Application to the Entscheidungsproblem"
addressed the "decidability" of certain mathematical and logical problems
introduced the concept of the "Turing Machine"
Can you write a computer program that will always be able to tell if another program will ever halt? (A sketch of Turing's argument, in modern code, follows these notes.)
Most modern computers are based on the concept of the "Turing Machine"

The Entscheidungsproblem (German: decision problem) is the challenge in symbolic logic to find a general algorithm which decides for given first-order statements whether they are universally valid or not. Alonzo Church and, independently, Alan Turing showed in 1936 that this is impossible. As a consequence, it is in particular impossible to algorithmically decide whether statements in arithmetic are true or false.

The question goes back to Gottfried Leibniz, who in the seventeenth century, after having constructed a successful mechanical calculating machine, dreamt of building a machine that could manipulate symbols in order to determine the truth values of mathematical statements. He realized that the first step would have to be a clean formal language, and much of his subsequent work was directed towards that goal. In 1928, David Hilbert and Wilhelm Ackermann posed the question in the form outlined above.

A first-order statement is called "universally valid" or "logically valid" if it follows from the axioms of the first-order predicate calculus. Gödel's completeness theorem states that a statement is universally valid in this sense if and only if it is true in every interpretation of the formula in a model.

Before the question could be answered, the notion of "algorithm" had to be formally defined. This was done by Alonzo Church in 1936 with the concept of "effective calculability" based on his lambda calculus, and by Alan Turing in the same year with his concept of Turing machines. The two approaches are equivalent, an instance of the Church-Turing thesis.

The negative answer to the Entscheidungsproblem was given by Alonzo Church in 1936 and independently, shortly thereafter, by Alan Turing, also in 1936. Church proved that there is no algorithm (defined via recursive functions) which decides for two given lambda calculus expressions whether they are equivalent or not. He relied heavily on earlier work by Stephen Kleene. Turing reduced the halting problem for Turing machines to the Entscheidungsproblem, and his paper is generally considered to be much more influential than Church's. The work of both authors was heavily influenced by Kurt Gödel's earlier work on his incompleteness theorem, especially by the method of assigning numbers to logical formulas in order to reduce logic to arithmetic.

Turing's argument runs as follows. Suppose we had a general decision algorithm for first-order logic. The question whether a given Turing machine halts or not can be formulated as a first-order statement, which would then be susceptible to the decision algorithm. But Turing had proved earlier that no general algorithm can decide whether a given Turing machine halts.

It is important to realize that if we restrict ourselves to a specific first-order theory with specified object constants, function constants, predicate constants and axioms, the truth of statements in that theory may very well be algorithmically decidable. An example of this is given by Presburger arithmetic. However, the general first-order theory of the natural numbers expressed in Peano's axioms cannot be decided with such an algorithm. This also follows from Turing's argument given above.

Alan Mathison Turing (June 23, 1912 – June 7, 1954) was a British mathematician, logician, and cryptographer, and is considered to be one of the fathers of modern computer science. He provided an influential formalization of the concepts of algorithm and computation: the Turing machine. He formulated the now widely accepted "Turing" version of the Church-Turing thesis, namely that any practical computing model has either the equivalent of or a subset of the capabilities of a Turing machine. During World War II he worked on breaking German ciphers, particularly the Enigma machine; he was the director of the Naval Enigma section at Bletchley Park for some time and remained throughout the war the chief cryptanalyst for the Naval Enigma effort. After the war, he designed one of the earliest electronic programmable digital computers at the National Physical Laboratory and, shortly thereafter, actually built another early machine at the University of Manchester. He also, amongst many other things, made significant and characteristically provocative contributions to the discussion "Can machines think?"
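The diagonal argument behind Turing's negative answer to the halting question can be sketched in modern code. The function `halts` below is a hypothetical oracle assumed only for the sake of contradiction; nothing like it can actually be implemented.

```python
# Sketch of Turing's diagonal argument. `halts(program, arg)` is a hypothetical
# oracle that supposedly always answers correctly whether program(arg) halts;
# the construction below shows no such total, always-correct function exists.

def halts(program, arg):
    """Hypothetical halting oracle -- assumed, not implementable."""
    raise NotImplementedError

def paradox(program):
    # Do the opposite of whatever the oracle predicts about running
    # `program` on itself.
    if halts(program, program):
        while True:        # loop forever exactly when the oracle says "halts"
            pass
    return "halted"        # halt exactly when the oracle says "loops"

# Feed paradox to itself: if halts(paradox, paradox) returns True, then
# paradox(paradox) loops forever; if it returns False, paradox(paradox) halts.
# Either way the oracle answers wrongly, so no such oracle can exist.
```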

7 The Colossus Computer (1943)
designed by Max Newman and T. Flowers at Bletchley Park
the first machine that was at once digital, programmable, and electronic
built for cryptanalysis (breaking the Fish cyphers used for communication at the highest levels of the Nazi regime)
used vacuum-tube electronics
highly secret project (classified until 1976): it had no influence on future architectures

The Colossus was the first programmable (to a limited extent) digital electronic computer, used for breaking German ciphers in World War II. Intercepted German communications in the form of a punched paper tape were fed to the machine by an extremely fast tape reader. As the tape moved through the reader, beams of light passing through holes in the paper were read by photoelectric cells which passed the signal to the Colossus. The ciphers broken by the Colossus were not Enigma-encrypted; the Enigma-encrypted traffic (used for communication between the German submarines and their home bases) was broken with the help of a Turing-designed electromechanical machine called the Bombe.

Bletchley Park (also sometimes Station X) is an estate located in the town of Bletchley, now part of Milton Keynes, England. During World War II, Bletchley Park was the location of the United Kingdom's main codebreaking establishment. Codes and ciphers of several Axis countries were deciphered there, most famously the German Enigma. The high-level intelligence produced by Bletchley Park, codenamed Ultra, is frequently credited with aiding the Allied war effort and shortening the war, although Ultra's effect on the actual outcome of WWII is debated.

Maxwell Herman Alexander Newman (February 7, 1897 – February 22, 1984) was a British mathematician. Newman was born in Chelsea, London, England. From 1927 to 1945 he was a lecturer in mathematics at the University of Cambridge, where his 1935 lectures on Foundations of Mathematics inspired Alan Turing to embark on his pioneering work on computing machines. Newman was appointed head of the mathematics department at the University of Manchester, England, in 1945 and transformed it into a centre of international renown, retiring in 1964. He was heavily involved in the design of the Colossus computer, used to crack the Lorenz cypher in World War II; his codebreaking section at Bletchley Park was termed the Newmanry. Newman wrote Elements of the Topology of Plane Sets of Points, a definitive work on general topology. He also made major contributions to combinatorial topology. He died in Comberton, near Cambridge.

Colossus was designed by Newman and associates at Bletchley Park, and was built at the British Post Office Research Station at Dollis Hill by Tommy Flowers and his team. Colossus was preceded by several computers, many of them a first in some category. Zuse's Z3 was the world's first functional program-controlled computer, and was based on electromechanical relays, as were the (less advanced) Bell Labs machines of the late 1930s (George Stibitz et al.). The Atanasoff-Berry Computer of circa 1937 was electronic and arguably the first working digital computer. Assorted analog computers were semi-programmable; some of these much predated the 1930s (e.g., Vannevar Bush). Babbage's Analytical Engine antedated all these (in the mid-1800s), and was both digital and programmable, but was only partially constructed and never functioned. Colossus was the first to combine all three properties: digital, programmable, and electronic.

Purpose. It was primarily designed for cryptanalysis, in an attempt to break one of the Fish cyphers (a Bletchley Park term) used by the German military for its most secure strategic communications. These were teletype cypher machines in the spirit of the one first proposed by Col. Parker Hitt of the US Army around WWI. The German machines were, essentially, attempts at an electromechanical implementation of the one-time-pad cypher invented by Gilbert Vernam (Bell Labs) and Joseph Mauborgne (US Signal Corps) at the end of WWI. The most important was a teletype-based machine built by Lorenz Electric, the SZ-40 (and later SZ-42) Schlüsselzusatz (meaning, more or less, "auxiliary key"). Another, different, teletype cypher machine was designed and built by Siemens & Halske, the T-52 Geheimfernschreiber (meaning "secret teleprinter"). Early versions of the Siemens machine (the T-52a and T-52b) were used to send signals between Germany and Norway over a cable running through Sweden. The Swedes tapped the cable, copied the traffic, and Arne Beurling, a Swedish mathematician, broke the cypher. Later production versions of the T-52 (there were variants through 'e') were considerably more secure, and quite hard to break even for Bletchley Park. Some of the T-52 traffic was also sent over Luftwaffe Enigma networks, which were much more easily broken, and so T-52 traffic was a lower priority for Bletchley Park than might otherwise have been expected.

The one-time pad requires a random sequence. It is combined with the plaintext (bit by bit, usually character by character), resulting in the cyphertext which is transmitted. On receipt, the same random sequence is combined with the cyphertext (again usually character by character), and because the combining operation is reversible in a particular way (see XOR, for example; a short modern sketch of this combination follows these notes), the output is the original plaintext. In the German Fish machines, the "random" sequence was produced by various electromechanical arrangements (on one of them, rotors somewhat as in the US SIGABA machine), and the sequence wasn't actually random. Because there were patterns, they could be predicted if the cryptanalysts were sufficiently clever, and plaintexts thereupon recovered. In the case of the Lorenz machine, Col. John Tiltman and Bill Tutte of Bletchley Park were sufficiently clever. In the case of the early Siemens machine, Beurling had been sufficiently clever.

Origins. The idea for Colossus developed out of a prior project which produced a special-purpose opto-mechanical comparator machine called the Heath Robinson, and its successors the Old Robinson and Super Robinson. The main problem with the Robinsons was synchronising two paper tapes, one punched with the enciphered message, the other representing the patterns produced by the wheels of the Lorenz machine, which tended to stretch when being read at over 1,000 characters per second. Colossus dispensed with the second tape by generating the wheel patterns electronically, and could process 5,000 characters (40 feet / 12 m of tape) per second. Colossus Mark 2 was simpler to operate as well as being more advanced, and so greatly sped up the deciphering process, which was still largely carried out by hand. It included the first ever use of shift registers, enabling five simultaneous tests, each involving up to 100 Boolean calculations, on each of the five channels on the punched tape, i.e. up to 12.5 million calculations per second. It was not only able to break the wheel patterns (wheel breaking), but could also determine pin patterns (pin breaking). Both models were programmable using switches and plug panels, in a way the Robinsons had not been. The project was headed by the mathematician Max Newman. It started early in 1943, and the first version of the machine (Mark 1 Colossus) was finished and installed by about January 1944, to be followed by the improved Mark 2 Colossus in June 1944. Ten Mark 2 Colossus machines were in use at Bletchley Park by the end of the war. Most were destroyed after the war as part of "protecting secrets", although two survived for many years and were used during the Cold War.

Design and operation. Colossus (all versions) used then state-of-the-art vacuum tubes (valves), thyratrons and photomultipliers to optically read a cyphertext from a paper tape and applied a programmable logical function to every character, counting how often this function returned "true". Although valves were generally considered liable to high failure rates, it was recognised that failures occurred mostly at power-on and power-off, so the Colossus machines, once turned on, were never powered down until the end of the war. The Colossus was efficient for its purpose. Even in 2004, Tony Sale noted that "Colossus is so fast and parallel that a modern PC programmed to do the same code-breaking task takes as long as Colossus to achieve a result!". Whilst Colossus featured limited programmability and was the first of the electronic digital machines to do so, it was not a true general-purpose computer, not being Turing-complete. It was not then realized that Turing-completeness was significant; most of the other pioneering modern computing machines were not Turing-complete either (e.g. the ABC machine, the Harvard Mark I electromechanical relay machine, the Bell Labs relay machines by George Stibitz et al., Konrad Zuse's first two designs, and so on).

Influence. The use to which the Colossi were put was of the highest secrecy, and the Colossus itself was highly secret. Being unknown, it therefore had little influence on the development of later computers; EDVAC was the early design which had the most influence on subsequent computer architecture. Colossus documentation and hardware were classified from the moment of their creation and remained so after the war. It is said that Winston Churchill specifically ordered the destruction of the Colossus machines into "pieces no bigger than a man's hand" and that Tommy Flowers personally burned the blueprints in a furnace at Dollis Hill. However, two machines continued in use after the war at GCHQ in Cheltenham, until their destruction in the 1960s. Information about Colossus emerged publicly only in the late 1970s, after the secrecy imposed by the Official Secrets Act ended. Thus, Colossus could not be included in the history of computing hardware for many years, and Newman and his associates were deprived of the recognition they were due. A 500-page technical report on Colossus and Colossus II, entitled General Report on Tunny, was released by GCHQ to the national Public Record Office in October 2000; a section is available online.

Reconstruction. In May 2004, the construction of a replica of a Colossus Mk II was completed by a team led by Tony Sale. It is currently on display in the Bletchley Park Museum in Milton Keynes, Buckinghamshire.

8 Howard Aiken and the Harvard Mark I
[ ]
realized Babbage's vision
Automatic Sequence Controlled Calculator, built by IBM for Harvard Univ.
decimal arithmetic
used electromagnetic relays
could perform all 4 arithmetic operations (3-5 seconds for a multiplication; 0.3 s for an addition)
separate memories for instructions and data (Harvard architecture)

Howard Hathaway Aiken was born on March 8, 1900 in Hoboken, New Jersey, USA, a decade before the German inventor Konrad Zuse was born in Berlin. Aiken started his studies at the University of Wisconsin, Madison, but in 1939 he received his Ph.D. in Physics from Harvard. As early as 1936, while still a student working as an instructor, Aiken started making plans for a calculating device that had to be able to give answers to problems he ran into when solving the systems of differential equations arising in his work on space-charge conduction in vacuum tubes. So, he proposed the construction of a large calculating device. As a result of discussions, it appeared that "something similar" had been tucked away in the Science Attic. This "something similar" proved to be a remnant of Babbage's unfinished Difference Engine, just over 100 years old at the time. The discovery of Babbage's brass wheels proved to be a turning point for Aiken. Later, he was given several of Babbage's books by Babbage's grandson. In them, Aiken found what he had been thinking himself all along; now he was certain that the construction he had in mind was the right one.

Aiken had set out several conditions: the machine had to be able to handle both positive and negative numbers, and it had to calculate logarithms, sines, cosines and various other scientific functions. At first, the Monroe Calculating Machine Co. was approached, but they were not interested, so Harvard contacted IBM. IBM agreed to build the machine and to cover the estimated cost. Construction of the machine started in 1937, and Aiken and his staff of two IBM engineers worked on it continuously till the end of 1943, when the IBM Automatic Sequence Controlled Calculator was completed under the supervision of Robert Campbell in an IBM plant in Endicott, N.Y. (IBM's first plant, recently sold). In the final year Aiken was helped by a brilliant young mathematician, Grace Hopper.

9 Admiral Grace Brewster Murray Hopper
[1906-1992]
developed the first compiler, A-0, for a computer programming language

Rear Admiral Grace Brewster Murray Hopper (born Grace Brewster Murray) (December 9, 1906 – January 1, 1992) was an early computer programmer and the developer of the first compiler for a computer programming language. The compiler was known as the A compiler and its first version was A-0. Later versions were released commercially as the ARITH-MATIC, MATH-MATIC and FLOW-MATIC compilers.

Photo courtesy of the Naval Surface Warfare Center, Dahlgren, VA, 1988.

10 J. Presper Eckert and John Mauchly
UPenn Moore School of Electrical Engineering
ENIAC, commissioned by the US Army for ballistic applications, was the first all-electronic general-purpose computer
decimal arithmetic (not binary)!
re-programmed through rewiring!
1,000 times faster than the Mark I
30 tons, 167 m2, 160 kW of power (a modern silicon chip with equivalent processing power would be only 0.02 in2)
Getting the machine ready to compute just one ballistic firing table required an average of two days of manual labor (rewiring).
ENIAC was revolutionary in terms of speed and power, but its major flaw was the lack of faster memory access.
How general-purpose was it, really? Due to the rewiring it was fairly inflexible, but adequate for its target application.

John William Mauchly (August 30, 1907 – January 8, 1980) was an American physicist and computer engineer who, along with J. Presper Eckert, designed ENIAC, the first general-purpose electronic digital computer, and UNIVAC I, the first commercial computer made in the United States. He was born in Cincinnati, Ohio and died in Ambler, Pennsylvania. John Presper Eckert, a computer pioneer, was born April 9, 1919 in Philadelphia and died June 3, 1995 in Bryn Mawr, Pennsylvania.

ENIAC, short for Electronic Numerical Integrator And Computer, was the first all-electronic computer designed to be Turing-complete, capable of being reprogrammed by rewiring to solve a full range of computing problems. It was preceded in 1941 by the fully tape-programmable (but electromechanical) Z3 designed by Konrad Zuse, and by the all-electronic, rewired-to-reprogram but not fully general-purpose British Colossus computer. Both ENIAC and Colossus used thermionic valves, that is, vacuum tubes, while the Z3 used electromechanical relays. The requirement to rewire to reprogram ENIAC was removed in 1948.

ENIAC was developed and built by the U.S. Army for their Ballistics Research Laboratory with the purpose of calculating ballistic firing tables. ENIAC was conceived of and designed by J. Presper Eckert and John William Mauchly of the University of Pennsylvania. The computer was commissioned on May 17, 1943 as Project PX, constructed at the Moore School of Electrical Engineering from mid-1944, and formally operational from February 1946, having cost almost $500,000. It was then shut off on November 9, 1946 for a refurbishment and a memory upgrade. ENIAC was unveiled on February 14, 1946 at the University of Pennsylvania and was transferred to the Aberdeen Proving Grounds, Maryland in 1947. There, on July 29 of that year, it was turned on and would be in continuous operation until 1955.

ENIAC received a lot of press for its sheer size, but in some ways it was not the state of the art of its era. Unlike Konrad Zuse's Z3 of 1941 and Howard Aiken's Mark I of 1944, it had to be rewired to run a new program (the Z3 and Mark I read their programs off a tape). Furthermore, unlike the Z3 and most modern computers, ENIAC's registers performed decimal arithmetic rather than binary. ENIAC used ten-position ring counters to store digits. Arithmetic was performed by "counting" pulses with the ring counters and generating carry pulses if the counter "wrapped around", the idea being to emulate in electronics the operation of the digit wheels of a mechanical adding machine (a small modern sketch of this idea follows these notes). Each of ENIAC's twenty ten-digit signed accumulators could perform 5,000 simple addition operations every second (a total of 100,000 addition operations per second). ENIAC could only manage 357 multiplication operations per second, or 38 division (or square root) operations per second.

Physically, ENIAC was a monster: it contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed 30 tons, was roughly 2.4 m by 0.9 m by 30.5 m, took up 167 m² and consumed 160 kW of power. Input was possible from an IBM card reader, while an IBM printer could produce printed output.

ENIAC used common octal-base radio tubes of the day; the decimal accumulators were made of 6SN7 flip-flops, while 6L7s, 6SJ7s, 6SA7s and 6AC7s were used in logic functions. Numerous 6L6s and 6V6s served as line drivers to drive pulses through cables between rack assemblies. Some electronics experts predicted that tube failures would occur so frequently that the machine would never be useful. This prediction turned out to be partially correct: several tubes burned out almost every day, leaving it nonfunctional about half the time. Special high-reliability tubes were not available until 1948, so Eckert and Mauchly had to use commonplace tube types. Most of these failures, however, occurred during the warm-up and cool-down periods, when the tube heaters and cathodes were under the most thermal stress. By the simple (if expensive) expedient of never turning the machine off, the engineers reduced ENIAC's tube failures to the more acceptable rate of one tube every two days. In 1954, the longest continuous period of operation without a failure was 116 hours (close to five days). Given the technology available at the time, this failure rate was remarkably low, and stands as a tribute to the precise engineering of ENIAC.

Eckert and Mauchly took the experience they gained and founded the Eckert-Mauchly Computer Corporation, producing their first computer, BINAC, in 1949, before being acquired by Remington Rand in 1950 and renamed as their Univac division. ENIAC ran until October 2, 1955. It was a one-off design and was never repeated. The freeze on the design in 1943 meant that the computer had a number of shortcomings which were never solved, notably the inability to store a program. But the ideas generated from the work, and the impact it had on people such as John von Neumann, were profoundly influential in the development of later computers, initially EDVAC, EDSAC and SEAC. A number of improvements were also made to ENIAC from 1948, including a primitive read-only stored-programming mechanism using the Function Tables as program ROM, an idea proposed by John von Neumann. This modification reduced the speed of ENIAC by a factor of six, but as it also reduced the reprogramming time to hours instead of days, it was considered well worth the loss of performance. As of 2004, a chip of silicon measuring 0.02 inches square holds the same capacity as ENIAC, which occupied a large room.
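The decade ring counters can be imitated loosely in modern code. The classes below illustrate the counting-and-carry idea only; they are not a model of ENIAC's actual circuitry or timing, and the class and method names are mine.

```python
# A loose modern analogy (not a circuit model) of ENIAC's decade ring counters:
# each digit position counts pulses 0-9 and emits a carry when it wraps,
# mimicking the digit wheels of a mechanical adding machine.

class DecadeCounter:
    def __init__(self):
        self.digit = 0

    def pulse(self):
        """Advance one position; return True if the counter wrapped (carry)."""
        self.digit = (self.digit + 1) % 10
        return self.digit == 0

class Accumulator:
    def __init__(self, digits=10):           # ENIAC accumulators held 10 digits
        self.counters = [DecadeCounter() for _ in range(digits)]

    def add(self, value):
        """Add a non-negative integer by sending pulses to each decade."""
        for position, digit in enumerate(reversed(f"{value:0{len(self.counters)}d}")):
            for _ in range(int(digit)):
                self._pulse(position)

    def _pulse(self, position):
        if self.counters[position].pulse() and position + 1 < len(self.counters):
            self._pulse(position + 1)         # propagate the carry pulse

    def value(self):
        return int("".join(str(c.digit) for c in reversed(self.counters)))

acc = Accumulator()
acc.add(97)
acc.add(8)
print(acc.value())   # 105
```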

11

12

13 John von Neumann [1903-1957]
Hungarian-American mathematician
1945: worked with a group at the Moore School to write the "First Draft of a Report on the EDVAC", describing the architecture of a stored-program computer
a single storage structure holds both the instructions that specify the computation and the data required/generated by the computation
implicit separation of memory from the processing unit
von Neumann was the only author listed on the paper
stored program => subroutines!!
Goldstine considers the report "the most important document ever written on computing and computers"
In fact, it "was essentially the mathematical-logic formulation of the ideas of the members of what was originally understood as an engineering project", done in a wartime context, and for the purpose of executing long, repetitious arithmetic as fast as possible.
The merger (contrast?) between von Neumann's tradition (the academic mathematician) and the more commercially minded engineers Eckert and Mauchly.

John von Neumann (Neumann János) (December 28, 1903 – February 8, 1957) was a Hungarian-American mathematician who made important contributions in quantum physics, set theory, computer science, economics and virtually all mathematical fields. Von Neumann was invited to Princeton University in 1930, and was one of four people selected for the first faculty of the Institute for Advanced Study, where he was a mathematics professor from its formation in 1933 until his death. During the Second World War, von Neumann contributed to the USA's Manhattan Project that built the first atomic bomb.

From 1936 to 1938 Alan Turing was a visitor at the Institute and completed a Ph.D. dissertation at Princeton under Alonzo Church's supervision. This visit occurred shortly after Turing's publication of his 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem", which involved the concepts of logical design and the universal machine. Von Neumann must have known of Turing's ideas, but it is not clear whether he applied them to the design of the IAS Machine ten years later.

Von Neumann was the father of game theory and published the classic book Theory of Games and Economic Behavior with Oskar Morgenstern in 1944. He conceived the concept of "MAD" (mutually assured destruction), which dominated American nuclear strategy in the Cold War. He worked in the Theory Division at Los Alamos along with Hans Bethe and Victor Weisskopf during World War II as part of the Manhattan Project to develop the first atomic weapons.

Von Neumann devised the von Neumann architecture used in most non-parallel-processing computers. Virtually every commercially available home computer, microcomputer and supercomputer is a von Neumann machine. He created the field of cellular automata without computers, constructing the first examples of self-replicating automata with pencil and graph paper. The term "von Neumann machine" also refers to self-replicating machines. Von Neumann argued that the most effective way to accomplish large-scale mining operations, such as mining an entire moon or asteroid belt, is through the use of self-replicating machines, taking advantage of their exponential growth.

In addition to his work on architecture, he is credited with at least one contribution to the study of algorithms. Donald Knuth cites von Neumann as the inventor, in 1945, of the well-known merge sort algorithm, in which the first and second halves of an array are each sorted recursively and then merged together.
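The algorithm described in that last sentence is easy to state as a short modern sketch (a textbook-style version, not von Neumann's original 1945 formulation):

```python
# Merge sort, as credited to von Neumann: sort each half recursively,
# then merge the two sorted halves.

def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])

    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])    # append whatever remains of either half
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]
```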

14 von Neumann Architecture
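Slide 13 described the stored-program idea: one memory holding both instructions and data, read by a processing unit. A toy sketch of such a machine follows; the tiny instruction set and the example program are invented purely for illustration.

```python
# Toy von Neumann machine (an illustration with a made-up instruction set):
# a single memory holds both the program and its data, and the processor
# repeatedly fetches, decodes, and executes instructions from that memory.

def run(memory):
    pc, acc = 0, 0                       # program counter and accumulator
    while True:
        op, arg = memory[pc]             # fetch
        pc += 1
        if op == "LOAD":                 # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share one memory: cells 0-3 are instructions, 4-6 are data.
memory = [
    ("LOAD", 4),     # acc <- mem[4]
    ("ADD", 5),      # acc <- acc + mem[5]
    ("STORE", 6),    # mem[6] <- acc
    ("HALT", None),
    7, 35, 0,        # data: 7, 35, and a result cell
]
print(run(memory)[6])   # 42
```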


Download ppt "Who invented the computer?"

Similar presentations


Ads by Google