Presentation on theme: "Inf 723 Information & Computing"— Presentation transcript:

1 Inf 723 Information & Computing
Jagdish S. Gangolly
Interdisciplinary PhD Program in Information Science
Department of Informatics, College of Computing & Information
State University of New York at Albany
1/2/2019 Inf 723 Information & Computing (Gangolly)

2 Inf 723 Information & Computing (Gangolly)
Information Reports
Information context (factive, not truth-functional)
Information content
Information carrier
Indicating fact

3 Inf 723 Information & Computing (Gangolly)
An example: The acoustic waves from the speaker carry the information that the announcer said, “Nancy Reagan is irritated.”

4 Inf 723 Information & Computing (Gangolly)
Parsing the example:
Information context: the factive report as a whole ("... carry the information that ...")
Information carrier: the acoustic waves from the speaker
Information content: that the announcer said, “Nancy Reagan is irritated.”
Indicating fact: the speaker's emitting these particular acoustic waves

5 Inf 723 Information & Computing (Gangolly)
Principles
Facts carry information.
The informational content of a fact is a true proposition.
The information a fact carries is relative to a constraint.
The information a fact carries is not an intrinsic property of it.
The informational content of a fact can concern remote things and situations.

6 Inf 723 Information & Computing (Gangolly)
Principles
Informational content can be specific; the propositions that are informational contents can be about objects that are not part of the indicating fact.
Indicating facts contain such information only relative to connecting facts; the information is incremental, given those facts.

7 Inf 723 Information & Computing (Gangolly)
Principles
Many different facts, involving variations in objects, properties, relations and spatiotemporal locations, can indicate one and the same informational content, relative to the same or different constraints.
Information can be stored and transmitted in a variety of forms.

8 Inf 723 Information & Computing (Gangolly)
Principles
Having information is good; creatures whose behavior is guided or controlled by information (by their information-carrying states) are more likely to succeed than those which are not so guided.

9 Inf 723 Information & Computing (Gangolly)
To give form to a message by moulding it into a shape or pattern that can be communicated.
Measurement: quantitative. Meaning: qualitative.

10 Inf 723 Information & Computing (Gangolly)
Entropy
A measure of information content
Communications example
Claude Shannon’s work

11 Inf 723 Information & Computing (Gangolly)
Entropy
Consider the proposition “it will rain tomorrow”. Let p = Probability{it will rain tomorrow}.
The information content I of a message with probability of occurrence p is I = log (1/p) = -log p

12 Inf 723 Information & Computing (Gangolly)
Entropy
I = log (1/p) = -log p
If p = 0, I = infinity. In other words, if a priori you consider an event to be impossible, then the message that the event occurred has infinite information content, since you would be completely surprised.
If p = 1, I = 0. In other words, if a priori you consider an event to be a certainty, then the information content of the message that it occurred is 0, since you are not surprised at all.
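The surprisal formula above can be sketched in a few lines of Python (the function name is mine, not from the slides); the limiting cases p = 1 and p = 0 come out as stated:

```python
import math

def surprisal(p: float) -> float:
    """Information content I = -log2(p) of an event with probability p, in bits."""
    if p == 0:
        return math.inf  # an "impossible" event that occurs carries infinite information
    return -math.log2(p)

print(surprisal(1.0))  # a certainty carries no information: 0.0 bits
print(surprisal(0.5))  # a fair coin flip: 1.0 bit
print(surprisal(0.0))  # an "impossible" event: inf
```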

13 Inf 723 Information & Computing (Gangolly)
Entropy
The expected information content of a message about the rain is given by:
H(p) = p . log (1/p) + (1 - p) . log (1/(1 - p))

14 Inf 723 Information & Computing (Gangolly)
Entropy
If p = 0, then H(0) = 0 . log (1/0) + 1 . log (1/1) = 0 (taking 0 . log (1/0) = 0)
If p = 1, then H(1) = 1 . log (1/1) + 0 . log (1/0) = 0
If p = 1/2, then H(1/2) = 1/2 . log 2 + 1/2 . log 2 = 1
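These three cases can be verified numerically; a minimal sketch (the function name is mine), handling the 0 . log (1/0) = 0 convention explicitly:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)), taking 0*log(1/0) = 0."""
    h = 0.0
    for q in (p, 1.0 - p):
        if q > 0:  # skip zero-probability terms, per the convention
            h += q * math.log2(1.0 / q)
    return h

print(binary_entropy(0.0))  # 0.0: certainty of no rain
print(binary_entropy(1.0))  # 0.0: certainty of rain
print(binary_entropy(0.5))  # 1.0: maximum uncertainty
```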

15 Inf 723 Information & Computing (Gangolly)
Entropy
The information content is usually measured in bits (which take the values 0 or 1), and therefore the logarithms in the formulae are to the base 2.
However, one can measure such information content in other bases, such as 10 (decimal) or e (natural). If it is measured in base 10, the units are called hartleys, named after the engineer Ralph Hartley, who developed the concept.
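The choice of base only rescales the measure by a constant factor, as a quick sketch shows (the variable names are mine; p = 1/8 is taken from the weather example that follows):

```python
import math

p = 1/8  # probability of the event being reported

bits = -math.log2(p)       # base 2: bits
nats = -math.log(p)        # base e: nats
hartleys = -math.log10(p)  # base 10: hartleys

# The units differ only by a constant factor: 1 hartley = log2(10) ≈ 3.32 bits
print(bits, nats, hartleys)
```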

16 Entropy - An Application in Coding
Weather in California: sunny or cloudy
Probabilities: P{sunny} = 7/8, P{cloudy} = 1/8
Weather reports for two days are to be sent.
The weather on the two consecutive days is independent, i.e., P(w1 & w2) = P(w1) . P(w2)

17 Entropy - An Application in Coding
Suppose we use the code alphabet {0, 1} for the weather, where 0 stands for sunny and 1 stands for cloudy. If the two days' weather is sent in one message, every message is exactly 2 bits long (00, 01, 10, or 11), so the average code length is 2 bits, i.e., 1 bit per day.

18 Entropy - An Application in Coding
The entropy of a daily report is H = (7/8) . log (1/(7/8)) + (1/8) . log (1/(1/8)) ≈ 0.54 bits.
The question is whether we can exploit the skewed probabilities (together with the independence of the weather on consecutive days) to reduce the average code length of the message. This can be accomplished by the following code:

19 Entropy - An Application in Coding
Weather   Probability              Code
SS        (7/8) . (7/8) = 49/64    0
SC        (7/8) . (1/8) = 7/64     10
CS        (1/8) . (7/8) = 7/64     110
CC        (1/8) . (1/8) = 1/64     111

20 Entropy - An Application in Coding
The average length of this code is given by
L = (49/64) . 1 + (7/64) . 2 + (7/64) . 3 + (1/64) . 3 = 87/64 ≈ 1.36 bits per two-day report

21 Entropy - An Application in Coding
Therefore, the average length of a daily report is L/2 = (1/2) . (87/64) ≈ 0.68 bits.
By coding a longer sequence of days at a time, we have moved the average length of a daily report closer to the entropy of 0.54 bits.
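The arithmetic in this example can be checked numerically; a short sketch (the variable names are mine), computing the per-day entropy and the average length of the two-day code:

```python
import math

# Daily weather distribution (California example)
p_sunny, p_cloudy = 7/8, 1/8

# Per-day entropy: H = (7/8)log2(8/7) + (1/8)log2(8) ≈ 0.54 bits
H = p_sunny * math.log2(1 / p_sunny) + p_cloudy * math.log2(1 / p_cloudy)

# Prefix-free code for two-day reports: SS -> 0, SC -> 10, CS -> 110, CC -> 111
code = {"SS": ("0", 49/64), "SC": ("10", 7/64), "CS": ("110", 7/64), "CC": ("111", 1/64)}
L = sum(prob * len(bits) for bits, prob in code.values())  # 87/64 ≈ 1.36 bits

print(round(H, 4), round(L, 4), round(L / 2, 4))  # per-day average 0.68 > entropy 0.54
```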

22 Inf 723 Information & Computing (Gangolly)
Entropy
In the transmission of messages over wires, one can achieve higher reliability by repeating each character in the message and taking a majority vote (polling) at the receiver.
The greater the repetition, the greater the reliability.
However, such repetition adds overhead to the payload message. In the limit, we could achieve perfect reliability by repeating each character an infinite number of times, but then nothing beyond the first character would ever be sent.

23 Inf 723 Information & Computing (Gangolly)
Entropy
Shannon showed that, by appropriate coding schemes, it is possible to achieve reliability without sacrificing efficiency.
This is accomplished by two strategies:
assigning shorter codes to highly likely events
grouping events
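Both strategies are embodied in standard variable-length codes. As an illustration, a minimal sketch of Huffman's construction (Huffman coding is not named on the slides; it is a later algorithm that produces codes like the weather code above, and the function name is mine):

```python
import heapq

def huffman_lengths(probs):
    """Return the Huffman code length for each symbol: rarer symbols get longer codes."""
    heap = [(p, i, (sym,)) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    counter = len(heap)  # tiebreaker so tuples never compare beyond (prob, index)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1  # each merge adds one bit to every symbol beneath it
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

# Two-day California weather, as in the example above
probs = {"SS": 49/64, "SC": 7/64, "CS": 7/64, "CC": 1/64}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
print(lengths, avg)  # average length 87/64 ≈ 1.36 bits per two-day report
```

The likeliest event (SS) gets a 1-bit code, and grouping two days into one symbol is exactly the second strategy.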

24 Forms of Information (Bates)
Information is the pattern of organization of matter and energy.
All information is natural information, in that it exists in the material world of matter and energy.
Represented information is natural information that is encoded or embodied.

25 Forms of Information (Bates)
“an organizing mechanism which provides an ability to deal with the environment. It is a symbolic description having modes of interpreting and interacting with the environment” (Goonatilake)

26 Forms of Information (Bates)
Information flow lineages (Goonatilake):
Genetic information
Neural-cultural (experienced, enacted, expressed)
Exo-somatic (information stored outside the animal as the “externalization of memories”) (embedded, recorded)
Residue (trace)

27 Forms of Information (Bates)
Information as a sign (semiotics)
Sign, interpretation, reference to object

