Claude Elwood Shannon: Information Theory and the Differential Analyzer

Claude E. Shannon was born in Petoskey, Michigan, on April 30, 1916, to Claude Shannon Sr. and Mabel Wolf Shannon. His early years greatly influenced where his life would take him: his deep interest in science came from his grandfather, an established tinkerer and inventor who held a patent on a washing machine. In 1932 Shannon left Gaylord to study at the University of Michigan, where he earned two bachelor's degrees, in mathematics and in electrical engineering. He later attended MIT as a graduate student, worked in MIT's Department of Electrical Engineering with Vannevar Bush and his Differential Analyzer, and eventually taught at MIT himself. Shannon is best known for his ingenuity in how information can be represented, transferred, encrypted, and decrypted. Yet for all he is credited with, many of his other fascinating inventions go unnoticed. Among those that have been noticed are his off-center unicycle; the "Ultimate Machine," a box with a large switch on the side that, when switched on, revealed an arm that reached out and flipped the switch back off; and Shannon's maze-solving mouse, Theseus. Growing up, Shannon openly admired Thomas Edison, and he later discovered that he and Edison were distantly related through John Ogden (1609-1682), a colonial leader: no wonder he was a talented inventor. After an undoubtedly fulfilling life, Shannon died of Alzheimer's disease in February 2001 at the age of 84.

Information Theory

In simplest terms, information theory is communication by selection, and it is only possible through language. Any language allows us to take an object, or a thought or mental image, and break it into conceptual chunks. It was Shannon who discovered that information, no matter its form, could be represented using a fundamental unit: the bit. This solved many problems, from sending and receiving data over "noisy" channels to encryption and decryption challenges. Another very important aspect of information theory is "channel capacity." An obituary from MIT News states, "All communication lines today are measured in bits per second, the notion that Professor Shannon made precise with 'channel capacity.'" Using his concept of entropy, he was also able to determine how much data a message could lose in transmission without becoming distorted. Shannon pioneered the ideas and real-world applications of information theory that led to CDs, deep-space communications, and the use of bits in computers to store pictures, voice streams, and other data.

Differential Analyzer

The Differential Analyzer was conceived by Vannevar Bush and his students sometime in the 1920s (and finished in 1931), and was made of a complicated system of gears, pulleys, and rods. Naturally, Shannon's interest was piqued by the complexity and motion of this machine. Unlike modern computers, it did not represent mathematical variables with 1s and 0s, but rather with the continuous physical motion of the rods. As Shannon spent time with the machine, maintaining it and programming it for other scientists to use, he discovered that the relays it used closely resembled symbolic logic. Each physical switch was either open or closed, a concept exactly matching a binary standard in logic.
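The parallel between switch states and Boolean values can be made concrete in code. The sketch below is my own illustration, not material from the presentation: it models two relay switches wired in series as Boolean AND and two wired in parallel as Boolean OR, which is the heart of the correspondence Shannon worked out.

```python
# A minimal sketch (illustrative, not Shannon's notation) of the relay
# observation: a switch is open (0) or closed (1); switches in series
# behave like Boolean AND, switches in parallel like Boolean OR.

def series(a: bool, b: bool) -> bool:
    """Current flows through two switches in series only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows through two switches in parallel if either is closed (OR)."""
    return a or b

if __name__ == "__main__":
    # Enumerating the four switch combinations reproduces the AND/OR truth tables.
    for a in (False, True):
        for b in (False, True):
            print(f"a={int(a)} b={int(b)}  series={int(series(a, b))}  parallel={int(parallel(a, b))}")
```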
This association sparked everything we now take for granted about information. As Shannon himself put it, "...I realized that Boolean algebra was just the thing to take care of relay circuits and switching circuits." By developing these concepts further, Shannon theorized and later proved the concepts of channel capacity, the bit, and entropy. Channel capacity is straightforward: it is the maximum amount of information a given transmission method can transfer at one time, and it uses the bit as its unit of measurement (my house supposedly gets 7.5 megabits per second). The bit, represented as 1s and 0s in a computer, corresponds to "yes or no," as in Boolean algebra. Entropy is the name Shannon gave to the concept he developed to precisely calculate how much information could be lost from a message without distorting that message; it can be thought of as a sort of "information scale." (A short numerical sketch of entropy and channel capacity follows at the end of this section.)

Tid-Bits

Shannon co-authored the "Proposal for the Dartmouth Summer Research Project on Artificial Intelligence," which bolstered interest in AI and marked the first appearance of the term "artificial intelligence." A man named Henry Quastler calculated that the information quantity (H) contained in a human is approximately 2×10^28 bits.
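To make the bit, entropy, and channel capacity tangible, here is a minimal numerical sketch under modern conventions; it is my own illustration, not material from the presentation. The entropy function implements Shannon's formula H = -Σ p·log2(p) over symbol frequencies, and the capacity function implements the Shannon-Hartley formula C = B·log2(1 + S/N); the example message and the channel parameters (3 kHz bandwidth, signal-to-noise ratio of 1000) are invented purely for demonstration.

```python
# Illustrative sketch of Shannon entropy and Shannon-Hartley channel capacity.
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies:
    the average number of bits needed per symbol of the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def channel_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

if __name__ == "__main__":
    msg = "information theory"
    print(f"H = {entropy_bits(msg):.3f} bits/symbol for {msg!r}")
    # A hypothetical telephone-style channel: 3 kHz bandwidth, SNR of 1000.
    print(f"C = {channel_capacity(3000, 1000):.0f} bits/second")
```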

Sources

http://en.wikipedia.org/wiki/Claude_Shannon
http://www.geni.com/people/Claude-Shannon/6000000026778602980
"What is Information Theory," Khan Academy: https://www.khanacademy.org/computing/computer-science/informationtheory/info-theory/v/intro-information-theory
http://www.encyclopedia.com/topic/Claude_Elwood_Shannon.aspx
http://newsoffice.mit.edu/2001/shannon
http://www.technologyreview.com/featuredstory/401112/claude-shannon-reluctant-father-of-the-digital-age/
http://www.nyu.edu/pages/linguistics/courses/v610003/shan.html