

1 SYCAMORE CREEK CONSULTANTS Defense Science Education Implications of Complexity Research for Command and Control M. I. Bell FACT, 29 July 2009

2 Disclaimers Most of these ideas are not original; I will not acknowledge my sources. I am responsible for any errors; feel free to point them out. Complexity can be complicated, even complex. I get nothing from the admission charge; no refunds will be given.

3 Beware of Humpty Dumpty “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean – neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” “The question is,” said Humpty Dumpty, “which is to be master – that's all.” Care is required when using everyday words for specialized purposes. The community of interest needs clear, common definitions. The general public needs warnings to avoid confusion.

4 Outline Motivation – Some trivial questions (not answers) – Intuitive complexity – Quantifying complexity – Formal complexity: dynamic and architectural – Design and control of complex systems – Complexity and C2

5 Motivation Complexity as a buzzword –“Six degrees of separation,” “butterfly effect,” etc. have entered popular culture –Dozens of university groups, programs, seminars, and projects –Pioneers (e.g., Santa Fe Institute) considering moving on Complexity as a metaphor –98 of 144 papers in the 14th ICCRTS contain the word “complexity” Complexity as a mindset –Awareness of chaos, “fat tails,” “tipping points,” self-organization Complexity as a toolbox –Fractal geometry, nonlinear dynamics, agent-based simulation Complexity as a paradigm “accepted examples of actual scientific practice… [that] provide models from which spring particular coherent traditions of scientific research” – T. S. Kuhn, The Structure of Scientific Revolutions, 1962

6 What is Complexity? Many entities, many interactions, collective behavior Quality or quantity? Definition or characteristics? –Emergence, self-organization, self-similarity, chaos, etc. Computational complexity (of a problem) –Resources (typically time) required to obtain a solution Algorithmic information content (of a string) –Length of the shortest program that will output the string Structural complexity –Self-similarity, fractal geometry Dynamic complexity –Chaos, sensitivity to initial conditions, phase transformations

7 Why are Things Complex? By selection or by design Selection –Natural or artificial (often not “survival of the fittest” but “the survivors are the fittest”) –Preferential growth (“the rich get richer”) Design –Nonlinearity –Feedback control –Optimization

8 Why Do We Care? Emergent behavior (self-organization) Requisite variety (control) Causality (prediction) Stability/instability (cascading failure) Unintended consequences

9 Intuitive Complexity Disorganized complexity: “a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, …the system as a whole possesses certain orderly and analyzable average properties” Organized complexity: “problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole” – W. Weaver, American Scientist (1948)

10 Complexity vs. Order [chart: statistical analysis of many simple entities (physics: pressure, temperature, phase) contrasted with systems analysis of organized/differentiated entities (economics: GDP, growth rate)]

11 Butterfly Effect “Long range detailed weather prediction is therefore impossible, …the accuracy of this prediction is subject to the condition that the flight of a grasshopper in Montana may turn a storm aside from Philadelphia to New York!” – W. S. Franklin (1898)

12 Argument for Quantification “When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind…” – William Thomson (Lord Kelvin), 1824-1907 If we can quantify complexity, we can –Determine whether one system is more or less complex than another –Determine whether a control (or C2) system is of the appropriate complexity for a given situation –Take appropriate steps to control complexity; e.g., Reduce the complexity of our environment Increase the complexity of an adversary’s environment

13 Algorithmic Information Content Length of the shortest possible description of a system (made formal using the Turing machine concept) Pros: –Consistent with the idea that a good theory simplifies the description of phenomena Cons: –Complexity may seem to be a property of our understanding of a system, not of the system itself –The length of description may depend on the vocabulary available –Relative complexity of two systems depends on the details of the Turing machine used –It is impossible to show that a description is the shortest possible –Random systems are maximally complex (counter-intuitive)
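The "length of description" idea can be approximated in practice: off-the-shelf compression gives a crude upper bound on algorithmic information content. A minimal sketch (Python; `zlib` serves as the stand-in "description language," which is an illustrative choice, not a formal measure). The counter-intuitive con above shows up directly: random bytes barely compress.

```python
import random
import zlib

def compression_complexity(data: bytes) -> int:
    """Crude upper bound on algorithmic information content:
    the length of the zlib-compressed representation."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500    # a very short program could print this string
random.seed(0)
noise = bytes(random.randrange(256) for _ in range(1000))

# The regular string compresses to a few dozen bytes; the random
# string is essentially incompressible, so it scores as "complex".
```

Note that the answer depends on the compressor chosen, mirroring the con above that relative complexity depends on the details of the Turing machine used.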

14 Computational Complexity The number of operations (typically multiplications) needed to solve a problem Pros: –A complex problem takes longer (or more resources) to solve than a simple one –The difficulty of a complex problem grows rapidly with its size n: Problems that can be solved in time proportional to n^k are “polynomial time” problems Problems that can be solved in time proportional to e^n or n! are “exponential time” problems Cons: –There is no algorithm for determining how hard a problem is!
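The polynomial/exponential distinction can be made concrete by counting operations rather than timing them. A sketch (Python; `pairwise_ops` and `subset_ops` are illustrative names, not standard functions): an O(n^2) examination of all pairs versus an O(2^n) brute-force examination of all subsets.

```python
from itertools import combinations

def pairwise_ops(n: int) -> int:
    """Operations to examine every ordered pair: grows as n^2."""
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1
    return count

def subset_ops(n: int) -> int:
    """Operations to examine every subset: grows as 2^n."""
    count = 0
    for r in range(n + 1):
        for _ in combinations(range(n), r):
            count += 1
    return count

# At n = 20 the polynomial problem takes 400 steps,
# the exponential one 1,048,576.
```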

15 Formal Complexity Dynamic (process) –Corresponds roughly to computational complexity –Originated in non-linear dynamics Architectural (structural) –Corresponds roughly to algorithmic information content –Originated in communication theory

16 Mandelbrot Set A complex number c is a member of the set if, starting with z_0 = 0, the iteration z_{n+1} = z_n^2 + c remains bounded B. Mandelbrot, ca. 1978
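The membership test translates almost line-for-line into code. A sketch (Python; `max_iter` is a practical cutoff — true membership would require infinitely many iterations, so the test is only provisional):

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Escape-time test: iterate z_{n+1} = z_n^2 + c from z_0 = 0
    and report whether |z| has stayed bounded so far."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:      # once |z| > 2 the orbit provably escapes
            return False
    return True

# c = 0 and c = -1 give bounded orbits; c = 1 escapes (0, 1, 2, 5, ...).
```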

17 Escape Problems Mandelbrot set –A complex number c is a member of the set if, starting with z_0 = 0, the iteration z_{n+1} = z_n^2 + c remains bounded –In other words, c is not a member if z_{n+1} escapes Sinai billiard –Ya. Sinai, ca. 1963 –Made into an escape problem by Bleher et al. (1988)

18 Sinai Billiard [figure: escape behavior as a function of initial conditions (x_0, θ_0); axis scales 10^5 and 10^6]

19 Prediction Horizon Discontinuity in boundary conditions (as well as non-linearity) can cause divergent trajectories Similar initial conditions produce similar trajectories for a limited time
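The chaotic logistic map gives a one-line demonstration of a prediction horizon. A sketch (Python; r = 4 puts the map in its chaotic regime — the parameter and starting points are illustrative): two orbits starting 10^-6 apart agree at first, then separate completely.

```python
def logistic_orbit(x0: float, r: float = 4.0, steps: int = 50) -> list:
    """Iterate the logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.300000)
b = logistic_orbit(0.300001)

# Early on the orbits are indistinguishable; within a few dozen
# steps the initial error has grown to order one.
```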

20 Differential Games Modeling conflict in a dynamical system (e.g., pursuit-evasion) –Each player (two or more) has a state-dependent utility function that he seeks to maximize –Each player has a set of control variables that influence the state of the system –What are the best strategies? –What are the possible outcomes? Example: homicidal chauffeur problem (R. Isaacs, 1951) –The “pedestrian” is slow but highly maneuverable –The “vehicle” is much faster but far less maneuverable –Under what initial conditions (if any) can the pedestrian avoid being run over indefinitely? Some games (complex ones?) generate state-space structures with fractal geometry
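A pursuit-evasion game of this kind is easy to simulate step by step. A minimal sketch (Python; the geometry, speeds, and turn limit are invented for illustration and are not Isaacs's formulation): the "vehicle" is twice as fast but can change heading only gradually, while the "pedestrian" turns freely.

```python
import math

def chase_step(px, py, heading, ex, ey, v=2.0, max_turn=0.2):
    """Fast, turn-limited pursuer: rotate toward the evader by at
    most max_turn radians, then advance at speed v."""
    want = math.atan2(ey - py, ex - px)
    turn = (want - heading + math.pi) % (2 * math.pi) - math.pi
    heading += max(-max_turn, min(max_turn, turn))
    return px + v * math.cos(heading), py + v * math.sin(heading), heading

def flee_step(ex, ey, px, py, v=1.0):
    """Slow, fully maneuverable evader: here, move directly away."""
    away = math.atan2(ey - py, ex - px)
    return ex + v * math.cos(away), ey + v * math.sin(away)

# Straight-line flight is a losing strategy for the evader:
px, py, heading, ex, ey = 0.0, 0.0, 0.0, 10.0, 0.0
for _ in range(8):
    px, py, heading = chase_step(px, py, heading, ex, ey)
    ex, ey = flee_step(ex, ey, px, py)
# The gap closes by one unit per step; a better evader would turn
# sharply near the pursuer to exploit its limited turn rate.
```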

21 Control Systems [diagrams] Open loop: controller → system. Closed loop: goal compared (+/–) with sensor feedback → controller (informed by a model) → system → sensor.
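The difference between the two loops shows up immediately under disturbance. A minimal sketch (Python; the scalar system, gain, and drift values are invented for illustration): the open-loop controller executes a fixed plan, while the closed-loop controller corrects using the sensed error.

```python
def open_loop(goal, x0=0.0, steps=30, disturbance=0.0):
    """Open loop: apply a pre-computed plan, never measure the state."""
    x, planned = x0, (goal - x0) / steps
    for _ in range(steps):
        x += planned + disturbance          # errors accumulate unseen
    return x

def closed_loop(goal, x0=0.0, gain=0.5, steps=30, disturbance=0.0):
    """Closed loop: sense the error each step and correct toward the goal."""
    x = x0
    for _ in range(steps):
        x += gain * (goal - x) + disturbance  # feedback absorbs the drift
    return x
```

With no disturbance both reach the goal, but under a constant drift the open-loop result is off by steps × drift, while the feedback loop holds the steady-state error near drift / gain.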

22 Control Theory Degrees of freedom (six for an aircraft): (x, y, z) = coordinates of center of mass; (ψ, θ, φ) = yaw, pitch, roll Holonomicity: a system with N degrees of freedom and N_c controllable degrees of freedom is holonomic if N_c = N, non-holonomic if N_c < N, redundant if N_c > N Aircraft (N = 6, N_c = 3, 4) and automobiles (N = 3, N_c = 2) are non-holonomic –No stable control settings are possible; not every path can be followed –Every possible path can be approximated

23 Requisite Variety and Stability Requisite variety (Ashby, 1958) –To control a system with N_c controllable degrees of freedom, the control system itself must have at least N_c degrees of freedom Given requisite variety in the control system for a holonomic system, stability is possible –Lyapunov stability: paths that start near an equilibrium point x_e stay near x_e forever –Asymptotic stability: paths that start near x_e converge to x_e –Exponential stability: the convergence is as fast as possible (Lyapunov exponent)

24 Internet 2001 [figure: map of the Internet, 2001]

25 Scale-Free Network World-Wide Web – A. Barabási et al. (2000) –k = degree (number of connections); the degree distribution follows a power law (γ = −1.94) –Built by preferential growth and attachment Failure and attack tolerance –Diameter (maximum distance between nodes) vs. fraction of nodes deleted –Failure = random node deleted; attack = high-degree node deleted –Compared with a random network (E = random, SF = scale-free)
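The "preferential growth and attachment" mechanism takes only a few lines to simulate. A sketch (Python; a minimal one-link-per-new-node variant of the Barabási–Albert idea, not their exact model): keeping a degree-weighted list makes each attachment automatically proportional to degree, so a handful of hubs emerge while most nodes keep degree 1 or 2.

```python
import random

def preferential_attachment(n: int, seed: int = 0) -> dict:
    """Grow a network in which each new node attaches to an existing
    node with probability proportional to that node's degree.
    Returns {node: degree}."""
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}     # seed graph: a single edge
    lottery = [0, 1]          # each node appears once per link it holds
    for new in range(2, n):
        target = rng.choice(lottery)   # degree-weighted pick
        degree[new] = 1
        degree[target] += 1
        lottery += [new, target]
    return degree
```

Deleting random nodes mostly hits the abundant low-degree leaves, which is one way to see why such networks tolerate failures yet are fragile to attacks on their hubs.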

26 Fat Tails

27 Cellular Automata Game of Life – J. Conway (1970) Rules, by number of live neighbors: –Fewer than 2: die –Exactly 2: do nothing –Exactly 3: become alive –More than 3: die
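The rule table is the entire specification; one function implements it. A sketch (Python; the grid is represented as a set of live-cell coordinates, so the board is effectively unbounded):

```python
from collections import Counter

def life_step(live: set) -> set:
    """One Game of Life generation: count the live neighbours of every
    relevant cell, then apply the birth/survival rules."""
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next generation if it has exactly 3 live
    # neighbours, or is already live and has exactly 2.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A vertical "blinker" flips to horizontal and back: period 2.
blinker = {(0, -1), (0, 0), (0, 1)}
```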

28 Emergence Emergent objects belong to a higher level of representation than individual cells or their behavior rules Levels (Game of Life): –Cells and rules –Objects (blinkers, gliders, blocks, beehives, etc.) –Interactions of objects (attraction/repulsion, annihilation, etc.) –Architectures of objects (guns, puffers, rakes, etc.) Multiscale representation (Y. Bar-Yam): each level of representation has its own –Scale: number of entities or components –Variety: number of possible actions or states Fundamental questions –How is behavior at each level determined? –Can constraints or behaviors at higher levels influence lower ones? –Is there “downward causation”? –Can we design for desired behaviors?

29 Gosper’s “Glider Gun”

30 Design and Control Systems can become complex either because or in spite of design rules Simplicity is generally a goal, but it competes with other goals: efficiency, robustness, versatility, etc. Systems generally evolve toward greater complexity, not less

31 Functional Decomposition Traditional engineering practice Hierarchical structure Independent modules System/subsystem or system (family) of systems

32 Commonality

33 Reuse

34 Big Ball of Mud “A BIG BALL OF MUD is haphazardly structured, sprawling, sloppy, duct-tape and bailing wire, spaghetti code jungle… These systems show unmistakable signs of unregulated growth, and repeated, expedient repair.” “…a complex system may be an accurate reflection of our immature understanding of a complex problem. The class of systems that we can build at all may be larger than the class of systems we can build elegantly, at least at first.” – B. Foote and J. Yoder, in Pattern Languages of Program Design 4 (2000)

35 Highly Optimized Tolerance (HOT) “Our focus is on systems which are optimized, either through natural selection or engineering design, to provide robust performance despite uncertain environments. We suggest that power laws in these systems are due to tradeoffs between yield, cost of resources, and tolerance to risks. These tradeoffs lead to highly optimized designs that allow for occasional large events.” “The characteristic features of HOT systems include: (1) high efficiency, performance, and robustness to designed-for uncertainties; (2) hypersensitivity to design flaws and unanticipated perturbations; (3) nongeneric, specialized, structured configurations; and (4) power laws.” – J. M. Carlson and J. Doyle, Physical Review (1999)

36 Complexity and C2 Complex systems analysis is not (yet) a revolutionary new paradigm We can use the complexity mindset and toolbox to re-visit and re-assess C2 problems –Speed of command and the OODA loop –Complex endeavors –The DIME/PMESII construct –Wicked problems –The C2 Approach Space –Optimization –Rare events –Emergence and causality

37 Speed of Command/Control [diagrams: routes A → B, and a diversion to C] Control: “Correct for cross winds” Command: “Fly from A to B” Command: “Divert to C”

38 OODA Loop vs. Control Loop Observe ↔ Choose goal; Orient ↔ Sense error; Decide ↔ Find correction; Act ↔ Correct Traditionally: command is human, control technological Modern control theory describes highly complex behaviors Potential for application to command problems

39 Complex Endeavors Complex endeavors have one or more of the following characteristics: –The number and diversity of participants is such that: There are multiple interdependent “chains of command” The objective functions of the participants conflict with one another or their components have significantly different weights The participants’ perceptions of the situation differ in important ways –The effects space spans multiple domains and there is A lack of understanding of networked cause and effect relationships An inability to predict effects that are likely to arise from alternative courses of action – D. Alberts and R. Hayes, Planning: Complex Endeavors (2007) Interpretation as differential games –Utility functions of coalitions (U_c = utility function of the coalition, U_i = utility function of member i) –Tight coalition: U_c is a fixed function of the individual U_i –Loose coalition: U_c is a function of the individual U_i that depends on the state of the system, allowing gain/loss of commitment, subversion, defection, etc.

40 DIME/PMESII Formalism State variables: Political, Military, Economic, Social, Information, Infrastructure Control variables (interventions): Diplomatic, Information, Military, Economic Questions: –Does DIME have requisite variety to control PMESII? –What happens when the game is two-sided? many-sided?

41 Competition [diagram: two opposing DIME → PMESII feedback loops] Recent study (AT&L/N81) indicates that available models do not capture essential features –The process by which PMESII state generates DIME interventions –The adversary response and resulting feedback loops

42 The “Invisible Hand” Adam Smith: market forces provide closed-loop control of the economy Modern economists: are you kidding? No reason to assume: –Requisite variety in control variables –Stable solutions or attractors in state space Application of game theory: –“Rational actor” assumption limits choices of utility functions –Limited ability to deal with coalitions Similar issues in other PMESII variables

43 Wicked Problems 1. There is no definitive formulation of a wicked problem 2. Wicked problems have no stopping rule 3. Solutions to wicked problems are not true-or-false, but good-or-bad 4. There is no immediate and no ultimate test of a solution to a wicked problem 5. Every solution to a wicked problem is a "one-shot operation"; because there is no opportunity to learn by trial-and-error, every attempt counts significantly 6. Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan 7. Every wicked problem is essentially unique 8. Every wicked problem can be considered to be a symptom of another problem 9. The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem's resolution 10. The planner has no right to be wrong – H. Rittel and M. Webber, Policy Sciences (1973)

44 No Evolution (Rittel and Webber’s ten characteristics, repeated for emphasis)

45 No Design (Rittel and Webber’s ten characteristics, repeated for emphasis)

46 Complexity (Rittel and Webber’s ten characteristics, repeated for emphasis)

47 Wicked, Complex, or Ill-Posed “In reality the problems are not so much ‘wicked’ as complex.” – E. Smith and M. Clemente, 14th ICCRTS (2009) “Wicked” problems are best described as differential games –Multiple participants compete to maximize their individual utility functions –Most social policy problems (when described as games) probably are complex, but formal analysis is just starting in biology and economics –The Rittel-Webber description reflects a misguided attempt by the “planner” to define a single utility function (i.e., create a single, tight coalition) –“Wickedness” is not a property of the system but of how we have defined the problem

48 C2 Approach Space Three dimensions (D. Alberts and R. Hayes, 2007): –Patterns of interaction –Distribution of information –Distribution of decision rights Incident response model (M. Bell, 14th ICCRTS) –Assumptions (decentralized C2): Decision rights: widely distributed Information: widely distributed Interaction: highly limited –Results (agent-based simulation): Effective “edge” organizations do not have to be near the high end of all three dimensions Self-organization can occur with very simple behavior rules Self-organization can be counter-productive Iterative refinement of the rule set is needed to exclude bad cases

49 Optimization Optimization of large, non-linear systems is almost always computationally hard (exponential time) Heuristic approaches will sometimes give good approximate solutions Robustness is an issue –Demonstrating stability (to small perturbations) may be computationally hard –Complex systems often have “brittle” optima –The probability of large perturbations may be greatly increased by non-linear dynamics –Extreme optimization (HOT) alters the distribution of properties or behaviors (fat tails)

50 Rare Events Not as rare as we might expect –Scale-free (self-similar) structures yield power-law distributions –Probabilities can be many orders of magnitude greater than predicted by the normal distribution Distributions may not be stable (linear combinations of independent events do not have the same distribution as the events) Joint probabilities may not be products of individual event probabilities Increased probability of rare event sequences (cascading failures)
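The "orders of magnitude" claim is easy to check numerically. A sketch (Python; the tail exponent α = 3 and the 10-sigma threshold are illustrative choices): tail probabilities of a standard normal versus a Pareto power law.

```python
import math

def normal_tail(x: float) -> float:
    """P(X > x) for a standard normal variable."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pareto_tail(x: float, alpha: float = 3.0, xmin: float = 1.0) -> float:
    """P(X > x) for a Pareto (power-law) variable with tail exponent alpha."""
    return (x / xmin) ** (1.0 - alpha)

# A "10-sigma" event: roughly 8e-24 under the normal distribution,
# but 0.01 under this power law -- over twenty orders of magnitude apart.
```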

51 Causality Complexity research deals with causal (deterministic) systems The opposite of causal is random (not complex) Complexity can: –Make it difficult to discover causal relationships –Limit prediction

52 Unintended Consequences When we say that an outcome (or a side-effect) is “unintended,” do we merely mean that it is unanticipated? If we could anticipate (predict) such an outcome or effect, would it necessarily become intended? Does ethical or legal responsibility follow? Can blame be assigned without evidence of predictability?

53 Conclusions Complexity research has deep roots in several traditional scientific disciplines It has advanced the state of the art in these fields and promoted cross-pollination among them It has been a major enabler in the development of new subdisciplines (e.g., social network analysis, non-linear dynamics) It has not (yet) yielded a revolutionary new paradigm for scientific research It offers significant potential benefits in C2 research –The mindset and toolbox can be exploited to advance OR and C2 research methodology –Discoveries in other disciplines can be translated into useful insights or partial solutions to C2 problems It does not invalidate any previous work or challenge the goals of C2 research

54 Questions or Comments?


