Information complexity - Presented to the HCI group, School of Computer Science, University of Oklahoma.

1 Information complexity - Presented to the HCI group, School of Computer Science, University of Oklahoma

2 1. What is complexity? Complexity has several dimensions: size, variety, and structure / inter-relation; it comes in algorithmic and other varieties. "That property of a language expression which makes it difficult to formulate its overall behavior even when given almost complete information about its atomic components and their inter-relations" - Bruce Edmonds, 1999.

3 2. Discussion on the definition. Language is meant in a general sense: one readable by a Universal Turing Machine. Atomic components are fundamental symbols that cannot be further reduced. Difficulty of formulation refers directly to complexity.

4 3. What makes a system complex? Its large size? Its constant variation or difference? Its intricate rules, which give it a defined structure and interconnect its atomic components? Its nature relative to the observer?

5 3.A. Grassberger's image complexity (figure slide)

6 4. Different information theories, organized by dimension. Size: Kolmogorov complexity. Variety: logical depth, Kauffman. Structure / inter-relation: mutual information, hierarchical, topological.

7 5. How do we measure complexity? Measure size? Model the system? Measure predictability (entropy)? Measure multiple parameters (using random matrix theory)? Measure interconnections? Be subjective or objective? Measure the limits of mathematical reasoning and computation?
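One option above, measuring predictability via entropy, is directly computable from data. As an illustrative sketch (not part of the original slides; the function name `shannon_entropy` is ours), Shannon entropy over observed symbol frequencies gives bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Empirical Shannon entropy, H = sum p * log2(1/p), in bits per symbol."""
    n = len(sequence)
    counts = Counter(sequence)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A constant string is perfectly predictable; a uniform alphabet is maximally unpredictable.
print(shannon_entropy("aaaaaaaa"))  # 0.0
print(shannon_entropy("abababab"))  # 1.0
print(shannon_entropy("abcdabcd"))  # 2.0
```

Low entropy signals an easily predicted (low-randomness) system; the maximum for an alphabet of k symbols is log2(k).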

8 6. Layers of complexity: information complexity (the system), cognitive complexity (the human), and HCI complexity (the interaction between them).

9 7. Information complexity - Deterministic. The Kolmogorov-Chaitin complexity K(x) of an object x is the length, in bits, of the smallest program that, when run on a Universal Turing Machine, outputs x and then halts. The entropy rate H of a system measures its randomness, and the growth rate of K(x) equals the entropy rate. K(x) is maximized for random strings; K(x) is, approximately, a measure of randomness.
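K(x) itself is uncomputable, but any lossless compressor gives a computable upper bound on it, which makes the "K(x) is approximately randomness" point concrete. A minimal sketch using Python's standard zlib (the helper name is ours, not from the slides):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a crude, computable upper bound on K(x)."""
    return len(zlib.compress(data, 9))

ordered = b"ab" * 5000            # highly patterned: a tiny program describes it
noise = os.urandom(10000)         # incompressible: K(x) is near the raw length

print(compressed_size(ordered))   # small (tens of bytes)
print(compressed_size(noise))     # close to 10000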

10 8. Information complexity - Structural. K(x) and entropy do not measure pattern, structure, correlation, or organization. How does randomness compare to structural complexity? Mutual information, topological, and hierarchical measures utilize a combination of computation theory, statistical inference, and information theory to explain a system.
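Of the structural measures named above, mutual information is the simplest to estimate: it quantifies, in bits, how much knowing one variable reduces uncertainty about another, from co-occurrence counts. A minimal sketch (function name and toy data are ours):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) = sum p(x,y) * log2[ p(x,y) / (p(x)p(y)) ] over observed (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly correlated binary variables share 1 bit; independent ones share none.
coupled = [(0, 0), (1, 1)] * 50
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25
print(mutual_information(coupled))      # 1.0
print(mutual_information(independent))  # 0.0
```

Unlike entropy, this is zero for unstructured (independent) data, so it captures organization rather than raw randomness.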

11 9. Information complexity - Variety. Logical depth of x is the running time of the shortest program that causes a UTM to produce x and then halt. It is not a measure of randomness: it is small for both ordered and random input strings. Logical depth is non-constructive because it is not computable, and by assuming a UTM one loses the ability to differentiate other systems. Kauffman's complexity: the number of conflicting constraints.

12 References
Xing, J., & Manning, C. A. Complexity and Automation Displays of Air Traffic Control: Literature Review and Analysis. DOT/FAA/AM-05/4.
Xing, J. Measures of Information Complexity and the Implications for Automation Design.
Forrest, S. Measures of Complexity. http://cs.unm.edu/%7Eforrest/cas-class-06.html
Papadimitriou, C. Computational Complexity. Addison-Wesley, 1994.
Cover, T., & Thomas, J. Elements of Information Theory. Wiley, 1991.

