Fractals and Bayesian Network Inference
KDD Group Presentation
Haipeng Guo, KDD Research Group
Department of Computing and Information Sciences, Kansas State University
Saturday, Oct. 12, 2001

Presentation Outline
- A simple tutorial on fractals
- Review of Bayesian network inference
- The joint probability space's fractal property and its possible application to BBN inference
- Summary

Part I: Introduction to Fractals

Fractals Introduction
- Definition
- Examples: man-made and natural fractals
- Fractal dimension
- Applications of fractals

Fractal - "broken, fragmented, irregular"
"I coined fractal from the Latin adjective fractus. The corresponding Latin verb frangere means 'to break': to create irregular fragments. It is therefore sensible - and how appropriate for our needs! - that, in addition to 'fragmented' (as in fraction or refraction), fractus should also mean 'irregular', both meanings being preserved in fragment."
B. Mandelbrot, The Fractal Geometry of Nature, 1982

Definition: Self-similarity
A fractal is a geometric shape with the property of self-similarity: each part of the shape is a smaller version of the whole shape.
Examples:

A mathematical fractal: the Koch snowflake
Step one: start with a large equilateral triangle.
Step two: make a star.
1. Divide one side of the triangle into three parts and remove the middle section.
2. Replace it with two lines the same length as the section you removed.
3. Do this to all three sides of the triangle.
Repeat this process infinitely. The snowflake has a finite area bounded by a perimeter of infinite length!
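As a quick numerical check of that last claim, here is a small Python sketch (added to this transcript for illustration, not part of the original slides) that computes the perimeter and area after n subdivision steps: the perimeter grows without bound while the area converges to 8/5 of the starting triangle's area.

```python
import math

def koch_perimeter(n, side=1.0):
    """Perimeter of the Koch snowflake after n subdivision steps."""
    # Each step replaces every edge by 4 edges of 1/3 the length.
    return 3 * side * (4 / 3) ** n

def koch_area(n, side=1.0):
    """Area of the Koch snowflake after n subdivision steps."""
    a0 = math.sqrt(3) / 4 * side ** 2   # area of the starting triangle
    # At step k we add 3 * 4**(k-1) small triangles, each of area a0 / 9**k.
    added = sum(3 * 4 ** (k - 1) * (a0 / 9 ** k) for k in range(1, n + 1))
    return a0 + added

for n in (0, 1, 5, 10, 20):
    print(f"n={n:2d}  perimeter={koch_perimeter(n):12.2f}  area={koch_area(n):.6f}")
# The perimeter diverges; the area converges to 8/5 of the original triangle's area.
```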

Real-world fractals
A cloud, a mountain, a flower, a tree, or a coastline…
Example: the coastline of Britain

Fractal geometry: the language of nature
Euclidean geometry: cold and dry. Nature: complex, irregular, fragmented.
"Clouds are not spheres, mountains are not cones, coastlines are not circles, and bark is not smooth, nor does lightning travel in a straight line." (Mandelbrot)

Euclidean dimension
In Euclidean geometry, the dimensions of objects are integers:
0 - a point
1 - a curve or line
2 - triangles, circles, or surfaces
3 - spheres, cubes, and other solids

Fractal dimension
A fractal dimension can be a non-integer. Intuitively, the fractal dimension measures how much space the fractal occupies.
If a shape can be divided into n self-similar parts (n is the number of segments), with the whole being s times the size of each part, then the fractal dimension is:
d = log n / log s
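A minimal calculator for this similarity dimension, added here for illustration; the example shapes are the ones used elsewhere in the slides.

```python
import math

def similarity_dimension(n_pieces, scale_factor):
    """Similarity dimension d = log(n) / log(s) for a shape that splits into
    n_pieces copies, each 1/scale_factor the size of the whole."""
    return math.log(n_pieces) / math.log(scale_factor)

print(similarity_dimension(4, 3))   # Koch curve:    log 4 / log 3 ~= 1.26
print(similarity_dimension(2, 3))   # Cantor set:    log 2 / log 3 ~= 0.63
print(similarity_dimension(4, 3))   # Cantor square: log 4 / log 3 ~= 1.26
```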

Example: the Koch curve
After the first step, each segment is replaced by four new segments, each one third the length of the original (n = 4, s = 3). So the fractal dimension is:
d = log 4 / log 3 ≈ 1.26
It takes up more space than a one-dimensional line segment, but occupies less space than a filled two-dimensional square.

Another example: the Cantor set
The oldest, simplest, and most famous fractal.
1. Begin with the closed interval [0,1].
2. Remove the open interval (1/3, 2/3), leaving two closed intervals behind.
3. Repeat the procedure, removing the "open middle third" of each remaining interval.
4. Continue infinitely.
Fractal dimension: d = log 2 / log 3 ≈ 0.63
Uncountably many points, yet zero length.
Challenge problem: is 3/4 in the Cantor set?
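The challenge question can be checked numerically. A point of [0,1] belongs to the Cantor set exactly when it never falls strictly inside a removed middle third (equivalently, it has a base-3 expansion using only the digits 0 and 2). The sketch below is an illustration added to this transcript, not from the slides; it repeatedly zooms into the third that contains the point, and it shows that 3/4 = 0.202020… in base 3 does survive.

```python
from fractions import Fraction

def in_cantor_set(x, max_iter=200):
    """Approximate membership test for the middle-thirds Cantor set."""
    x = Fraction(x)
    if not 0 <= x <= 1:
        return False
    for _ in range(max_iter):
        if x in (Fraction(0), Fraction(1)):   # interval endpoints are never removed
            return True
        if x <= Fraction(1, 3):
            x = 3 * x                          # zoom into the left third
        elif x >= Fraction(2, 3):
            x = 3 * x - 2                      # zoom into the right third
        else:
            return False                       # strictly inside a removed middle third
    return True                                # survived every zoom we tried

print(in_cantor_set(Fraction(3, 4)))   # True: 3/4 cycles between 3/4 and 1/4
print(in_cantor_set(Fraction(1, 2)))   # False: 1/2 lies in the first removed third
```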

Devil's staircase
Take a Cantor set, which is composed of an infinite number of points. Consider turning those points into dots and letting a Pac-Man eat them. As our Pac-Man eats the dots, he gets heavier; imagine that his weight after eating all the dots is 1. Graphing his weight against time gives the devil's staircase.

Cantor square
Fractal dimension: d = log 4 / log 3 ≈ 1.26

The Mandelbrot Set
The Mandelbrot set is a connected set of points in the complex plane.
Calculate: Z_1 = Z_0^2 + Z_0, Z_2 = Z_1^2 + Z_0, Z_3 = Z_2^2 + Z_0, …
If the sequence Z_0, Z_1, Z_2, Z_3, … remains within a distance of 2 of the origin forever, then the point Z_0 is said to be in the Mandelbrot set. If the sequence diverges from the origin, then the point is not in the set.
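That membership rule translates directly into an escape-time test. The short sketch below (added for illustration) iterates z -> z^2 + Z_0 starting from Z_0 itself, as the slide describes, and declares the point inside if it never strays beyond distance 2 within a fixed number of iterations; the finite cutoff means this is only an approximation of "forever".

```python
def in_mandelbrot(c, max_iter=1000):
    """Escape-time test: c is (probably) in the Mandelbrot set if the iterates
    z -> z**2 + c stay within distance 2 of the origin for max_iter steps."""
    z = c                      # start the iteration at Z0 = c, as in the slide
    for _ in range(max_iter):
        if abs(z) > 2:
            return False       # the sequence escapes, so c is outside the set
        z = z * z + c
    return True                # no escape detected within max_iter iterations

print(in_mandelbrot(complex(0, 0)))    # True: the origin is in the set
print(in_mandelbrot(complex(-1, 0)))   # True: settles into a period-2 cycle
print(in_mandelbrot(complex(1, 1)))    # False: escapes quickly
```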

Colored Mandelbrot Set
Colors are assigned to the points that are not inside the set (for example, according to how quickly they escape). Then we can just zoom in on it.

Applications of fractals: astronomy
The structure of our universe: superclusters - clusters - galaxies - star systems (e.g., the Solar System) - planets - moons.
Every level of detail of the universe shows the same clustering patterns. It can be modeled by a random Cantor square.
The fractal dimension of our universe: 1.23

Applications of fractals: the rings of Saturn
Originally it was believed that Saturn had a single ring. Later a break in the middle was discovered, and scientists considered it to be two rings. When Voyager I approached Saturn, however, it found that the two rings were themselves broken in the middle, and that the four smaller rings were broken as well. Eventually it identified a very large number of breaks, which continuously divide even small rings into smaller pieces. The overall structure is strikingly similar to... the Cantor set.

Applications of fractals: the human body
The lungs: formed by splitting lines (fractal canopies).
The brain: the surface of the brain contains a large number of folds. Humans, the most intelligent animal, have the most folded brain surface. Geometrically, an increase in folding means an increase in dimension; in humans it is the highest.

Applications of fractals: plants
A tree branch looks similar to the entire tree; a fern leaf looks almost identical to the entire fern.
One classic way of creating fractal plants is by means of L-systems (Lindenmayer systems).

Applications of fractals: bacteria cultures
A bacteria culture is all the bacteria that originated from a single ancestor and are living in the same place. As a culture grows, it spreads outwards in different directions from the place where the original organism was placed. The spreading of bacteria can be modeled by fractals such as diffusion fractals (e.g., diffusion-limited aggregation).

Applications of fractals: data compression
A full-color, full-screen GIF image of the Mandelbrot set occupies about 35 kilobytes; the formula z = z^2 + c takes 7 bytes (a 99.98% reduction)!
The same idea can work for other photos as well: the goal is to find functions, each of which produces some part of the image. IFS (iterated function systems) are the key.
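The presentation does not spell out the compression algorithm itself, but the IFS idea can be illustrated with the classic chaos game: a handful of contractive affine maps, applied in random order, reproduce an entire image. The sketch below (a hypothetical illustration, not the fractal image compression scheme referred to above) renders a Sierpinski triangle as text from just three maps.

```python
import random

# Three affine maps, each shrinking the plane by 1/2 toward one corner of a
# triangle.  Iterating them at random fills in the Sierpinski triangle: a
# whole image encoded by three tiny functions.
CORNERS = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]

def chaos_game(n_points=50_000):
    x, y = 0.5, 0.5
    points = []
    for _ in range(n_points):
        cx, cy = random.choice(CORNERS)
        x, y = (x + cx) / 2, (y + cy) / 2   # apply one randomly chosen map
        points.append((x, y))
    return points

# Crude text rendering: mark which cells of a 60x30 grid were visited.
grid = [[" "] * 60 for _ in range(30)]
for x, y in chaos_game():
    grid[29 - int(y * 29)][int(x * 59)] = "*"
print("\n".join("".join(row) for row in grid))
```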

Applications of fractals: weather
Weather behaves very unpredictably. Sometimes it changes very smoothly; other times it changes very rapidly. Edward Lorenz came up with three formulas that could model the changes of the weather. Used to create a 3D strange attractor, these formulas form the famous Lorenz attractor, which is a fractal pattern.
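For concreteness, here is a hedged sketch of the three Lorenz equations with their usual textbook parameters (sigma = 10, rho = 28, beta = 8/3), integrated with simple Euler steps; plotting the resulting path traces out the butterfly-shaped strange attractor. The step size and initial point are illustrative choices, not values from the presentation.

```python
def lorenz_trajectory(n_steps=10_000, dt=0.01,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with simple Euler steps:
       dx/dt = sigma * (y - x)
       dy/dt = x * (rho - z) - y
       dz/dt = x * y - beta * z
    """
    x, y, z = 1.0, 1.0, 1.0
    path = []
    for _ in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        path.append((x, y, z))
    return path

path = lorenz_trajectory()
print(path[-1])   # plotting the (x, z) pairs shows the butterfly-shaped attractor
```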

Fractal antennas
A practical size reduction of 2-4 times is realizable with acceptable performance: smaller, yet with even better performance.

Electronic transmission errors
During electronic transmissions, electronic noise sometimes interferes with the transmitted data. Although making the signal more powerful drowns out some of this harmful noise, some of it persists, creating errors during transmission. Errors occur in clusters: a period of no errors is followed by a period with many errors. On any scale of magnification (month, day, hour, 20 minutes, …), the proportion of error-free transmission to error-ridden transmission stays constant. Mandelbrot studied the mathematical process that enables us to create random Cantor dust, which describes the fractal structure of these batches of errors on computer lines remarkably well.

Network traffic models
Packet delays as a function of time in a WAN environment: the top diagram shows absolute values of the RTT parameter on a virtual channel; the bottom diagram shows the fractal structure of the flow of packets that exceeded a 600 ms threshold.

Fractal Art

Fractal Summary
- Fractals are self-similar or self-affine structures.
- A fractal object has a fractal dimension.
- Fractals model many natural objects and processes; fractal geometry is nature's language.
- Fractals have very broad applications.

Part II: Bayesian Networks

Bayesian Networks Review
- Bayesian networks
- Examples
- Belief updating and belief revision
- The joint probability space and brute-force inference

Bayesian Networks
Bayesian networks, also called Bayesian belief networks, causal networks, or probabilistic networks, are a network-based framework for representing and analyzing causal models involving uncertainty.
A BBN is a directed acyclic graph (DAG) with conditional probabilities for each node:
- Nodes represent random variables in a problem domain.
- Arcs represent conditional dependence relationships among these variables.
- Each node contains a CPT (conditional probability table) that gives the probability of the node taking each of its values given the values of its parent nodes.

Family-Out Example
"Suppose when I go home at night, I want to know if my family is home before I try the doors. (Perhaps the most convenient door to enter is double locked when nobody is home.) Now, often when my wife leaves the house, she turns on an outdoor light. However, she sometimes turns on this light if she is expecting a guest. Also, we have a dog. When nobody is home, the dog is put in the back yard. The same is true if the dog has bowel problems. Finally, if the dog is in the back yard, I will probably hear her barking (or what I think is her barking), but sometimes I can be confused by other dogs."
Nodes: Family-Out, Light-On, Bowel-Problem, Dog-Out, Hear-Bark
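To make the CPT idea concrete, here is a minimal sketch of the Family-Out network in Python. The graph structure follows the slide, but the CPT numbers are invented placeholders, not values from the presentation; the joint probability of a complete assignment is just the chain-rule product of the local CPT entries.

```python
# Hypothetical CPTs for the Family-Out structure:
# family_out -> light_on; family_out, bowel_problem -> dog_out; dog_out -> hear_bark.
CPTS = {
    "family_out":    {(): 0.15},                          # P(fo = True)
    "bowel_problem": {(): 0.01},                          # P(bp = True)
    "light_on":      {(True,): 0.6, (False,): 0.05},      # P(lo = True | fo)
    "dog_out":       {(True, True): 0.99, (True, False): 0.90,
                      (False, True): 0.97, (False, False): 0.30},  # P(do | fo, bp)
    "hear_bark":     {(True,): 0.7, (False,): 0.01},      # P(hb = True | do)
}
PARENTS = {  # topological order
    "family_out": (), "bowel_problem": (),
    "light_on": ("family_out",),
    "dog_out": ("family_out", "bowel_problem"),
    "hear_bark": ("dog_out",),
}

def joint_probability(assignment):
    """Chain rule: P(x1, ..., xn) = product over nodes of P(node | parents)."""
    prob = 1.0
    for node, parents in PARENTS.items():
        parent_values = tuple(assignment[par] for par in parents)
        p_true = CPTS[node][parent_values]
        prob *= p_true if assignment[node] else 1.0 - p_true
    return prob

print(joint_probability({"family_out": True, "bowel_problem": False,
                         "light_on": True, "dog_out": True, "hear_bark": True}))
```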

Asia Example from Medical Diagnostics

Why are BBNs important?
- They offer a compact, intuitive, and efficient graphical representation of the dependence relations between entities of a problem domain (modeling the world in a more natural way than rule-based systems and neural networks).
- They handle uncertain knowledge in a mathematically rigorous yet efficient and simple way.
- They provide a computational architecture for computing the impact of evidence nodes on the beliefs (probabilities) of the query nodes of interest.
- There is a growing number of creative applications.

Alarm Example: the power of BBN
[Figure: the ALARM monitoring network, with nodes such as PCWP, CO, HRBP, HREKG, HRSAT, CATECHOL, SAO2, VENTLUNG, HYPOVOLEMIA, BP, …]
The Alarm network: 37 variables, 509 parameters (instead of 2^37).

Applications
- Medical diagnostic systems
- Real-time weapons scheduling
- Jet-engine fault diagnosis
- Intel processor fault diagnosis (Intel)
- Generator monitoring expert system (General Electric)
- Software troubleshooting (Microsoft Office Assistant, Windows 98 print troubleshooting)
- Space shuttle engine monitoring (Vista project)
- Biological sequence analysis and classification
- …

Bayesian Network Inference
Given observed evidence, perform computation to answer queries.
Evidence e is an assignment of values to a set of variables E in the domain, E = {X_{k+1}, …, X_n}.
- For example, E = e: {Visit Asia = True, Smoke = True}
Queries:
- The posterior belief: compute the conditional probability of a variable given the evidence, e.g. P(Lung Cancer | Visit Asia = True, Smoke = True) = ? This kind of inference task is called belief updating.
- MPE: compute the most probable explanation given the evidence. An explanation of the evidence is a complete assignment {X_1 = x_1, …, X_n = x_n} that is consistent with the evidence; computing the MPE means finding an explanation such that no other explanation has higher probability. This kind of inference task is called belief revision.

Belief Updating
The problem is to compute P(X = x | E = e): the probability of the query nodes X given the observed values of the evidence nodes E = e.
For example, suppose a patient arrives and it is known for certain that he has recently visited Asia and has dyspnea.
- What impact does this evidence have on the probabilities of the other variables in the network? P(Lung Cancer | evidence) = ?
Network nodes: Visit to Asia, Smoking, Tuberculosis, Lung Cancer, Tub. or Lung Cancer, Bronchitis, X-Ray, Dyspnea
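Belief updating can be written directly as brute-force enumeration over the joint distribution, which is the baseline the later "joint probability space" slides build on. The sketch below is generic and hypothetical: it accepts any list of binary variables and any joint-probability function, such as the Family-Out sketch shown earlier.

```python
from itertools import product

def posterior(query_var, query_val, evidence, variables, joint):
    """Brute-force belief updating: P(query_var = query_val | evidence),
    computed by summing `joint` over every complete assignment.
    `joint` maps a {variable: value} dict to its joint probability."""
    num = den = 0.0
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if any(assignment[var] != val for var, val in evidence.items()):
            continue                       # inconsistent with the evidence
        p = joint(assignment)
        den += p
        if assignment[query_var] == query_val:
            num += p
    return num / den

# Example usage with the earlier Family-Out sketch (joint_probability, PARENTS):
# posterior("family_out", True, {"light_on": True, "hear_bark": True},
#           list(PARENTS), joint_probability)
```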

Belief Revision
Let W be the set of all nodes in our given Bayesian network, and let the evidence e be the observation that the roses are okay. Our goal is to determine the assignment w to all nodes that maximizes P(w | e). We only need to consider assignments in which the node "roses" is set to okay, and among them maximize P(w), i.e. find the most likely "state of the world" given the evidence that the roses are okay in this world. The best solution then becomes:
P(sprinklers = F, rain = T, street = wet, lawn = wet, soil = wet, roses = okay) =

Complexity of BBN Inference
- Probabilistic inference using belief networks is NP-hard [Cooper 1990].
- Approximating probabilistic inference in Bayesian belief networks is NP-hard [Dagum 1993].
Hardness does not mean we cannot solve inference. It implies that:
- We cannot find a general procedure that works efficiently for all networks.
- However, for particular families of networks we can have provably efficient algorithms, either exact or approximate.
- Instead of a general exact algorithm, we look for special-case, average-case, and approximate algorithms.
- A variety of approximate, heuristic, hybrid, and special-case algorithms should be taken into consideration.

BBN Inference Algorithms
Exact algorithms:
- Pearl's message propagation algorithm (for singly connected networks only)
- Variable elimination
- Cutset conditioning
- Clique tree clustering
- SPI (symbolic probabilistic inference)
Approximate algorithms:
- Partial evaluation methods, which perform exact inference partially
- Variational approaches, which exploit averaging phenomena in dense networks (law of large numbers)
- Search-based algorithms, which convert the inference problem into an optimization problem and then use heuristic search to solve it
- Stochastic sampling, also called Monte Carlo algorithms (a small sketch follows below)
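As one concrete member of the stochastic-sampling family, here is a hedged sketch of simple forward ("logic") sampling with rejection. It assumes the CPT representation used in the earlier Family-Out sketch (nodes listed in topological order, CPT entries giving P(node = True | parents)); since those numbers were placeholders, this illustrates the technique rather than any algorithm from the presentation.

```python
import random

def forward_sample(parents, cpts):
    """Draw one complete assignment by sampling each node given its parents.
    Assumes `parents` lists nodes in topological order and
    cpts[node][parent_values] is P(node = True | parents)."""
    sample = {}
    for node, pars in parents.items():
        p_true = cpts[node][tuple(sample[p] for p in pars)]
        sample[node] = random.random() < p_true
    return sample

def rejection_estimate(query_var, evidence, parents, cpts, n=100_000):
    """Estimate P(query_var = True | evidence) by keeping only the samples
    that agree with the evidence (logic sampling with rejection)."""
    kept = hits = 0
    for _ in range(n):
        s = forward_sample(parents, cpts)
        if all(s[var] == val for var, val in evidence.items()):
            kept += 1
            hits += s[query_var]
    return hits / kept if kept else float("nan")

# e.g. rejection_estimate("family_out", {"hear_bark": True}, PARENTS, CPTS)
```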

Inference Algorithm Conclusions
- The general problem of exact inference is NP-hard.
- The general problem of approximate inference is NP-hard.
- Exact inference works for small, sparse networks only.
- There is no single champion among exact or approximate inference algorithms.
- The goal of research should be to identify effective approximate techniques that work well on large classes of problems.
- Another direction is the integration of various kinds of approximate and exact algorithms, exploiting the best characteristics of each.

Part III: BBN Inference Using Fractals?
- The joint probability distribution space of a BBN
- Asymmetries in the joint probability distribution (JPD)
- The fractal property of the JPD
- How does it help us do approximate inference?

Joint Probability Distribution
Asia revisited (Visit to Asia, Smoking, Tuberculosis, Lung Cancer, Tub. or Lung Cancer, Bronchitis, X-Ray, Dyspnea):
- Eight binary nodes
- Each node has 2 states: Y or N
- Total states: 2^8 = 256
[Table: instances 0 through 255, each giving an assignment to the eight variables a-h together with its joint probability]

JPD of Asia
- 256 states
- The probabilities spread over 9 orders of magnitude (the sorted values begin 0, 1.50E-09, …)
- The top 30 most likely states cover the vast majority of the total probability space.
Conclusion: there is usually a small fraction of states that covers a large portion of the total probability space, with the remaining states having practically negligible probabilities.
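This skewness is easy to reproduce on any small network. The sketch below is illustrative only: it reuses the hypothetical Family-Out model from earlier rather than the actual Asia CPTs, so the exact figures will differ from the slide. It enumerates the joint distribution, sorts the states, and reports how much probability mass the k likeliest states cover.

```python
from itertools import product

def enumerate_jpd(variables, joint):
    """All 2**n states of a binary network, sorted from most to least likely."""
    states = []
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        states.append((joint(assignment), assignment))
    return sorted(states, key=lambda s: s[0], reverse=True)

def coverage(sorted_states, k):
    """Fraction of the total probability mass carried by the k likeliest states."""
    total = sum(p for p, _ in sorted_states)
    return sum(p for p, _ in sorted_states[:k]) / total

# With the earlier Family-Out sketch:
# jpd = enumerate_jpd(list(PARENTS), joint_probability)
# print(coverage(jpd, 5))   # a handful of states already carry most of the mass
```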

Why is the JPD so skewed?
- When we know nothing about the domain, the JPD should be flat.
- The more we know about a domain, the more asymmetry the individual probabilities will show.
- When the domain and its mechanism are well understood, the probability distributions tend to be extreme.
Conclusion: asymmetries in the individual distributions result in joint probability distributions exhibiting orders-of-magnitude differences in the probabilities of the various states of the model.

How does this help inference?
- Considering only a small number of high-probability individual states can lead to good approximations in belief updating.
- The result can be refined by exploring more highly likely states.
Problem: where are these "peaks" located?

The global map of the JPD of Asia
To locate these peaks, let's first make the map.
- Node order: visit to Asia? | smoking? | tuberculosis? | either tub. or lung cancer? | positive X-ray? | lung cancer? | bronchitis? | dyspnoea?
- CPT arrangement: let the small numbers go first, in order to shift high values into the same area.
Observations: most "peaks" are in the second half of the map, and the high "peaks" form clusters.
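The "map" itself is just the joint probability evaluated at every state index under a fixed node ordering. Here is a small sketch of that construction, again using a stand-in model since the Asia CPTs are not reproduced in this transcript.

```python
def jpd_map(variables, joint):
    """The 'global map' of a JPD: state index i is decoded bit by bit into an
    assignment over `variables` (in a fixed order), and entry i is its joint
    probability.  Plotting this array reveals the clusters of high peaks."""
    n = len(variables)
    probs = []
    for i in range(2 ** n):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        assignment = {var: bool(b) for var, b in zip(variables, bits)}
        probs.append(joint(assignment))
    return probs

# With the hypothetical Family-Out sketch from earlier:
# m = jpd_map(list(PARENTS), joint_probability)
# print(max(m), sum(m))   # the peaks stand out; the whole map sums to 1
```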

Self-similarity (or self-affinity) property

Self-similarity (or self-affinity) property
Scale the first half (0-256, 0-128, 0-64, 0-32)

Self-similarity (or self-affinity) property
Scale the second half (1-256, …)

More Observations
How is the skewed "map" formed?
- First, suppose we know nothing about the domain, i.e., all CPT entries are 0.5.
- Then we explore the nodes one by one, setting the entries of each node's CPT to their actual values (most likely asymmetric, for example 0.01/0.99 or 0.2/0.8).
- We draw the JPD after each node's CPT is updated. This gives us 8 graphs.

The process of forming the skewed JPD

Putting it all together

What fractal is this? Recall the Cantor set.

The Devil's Staircases

The process of building the Devil's Staircase (Asia)

Multifractal
Consider again the Cantor set, but let the original bar carry a uniformly distributed mass of 1, and suppose the mass is conserved in the bisection process. Suppose that at each bisection step the mass carried by an element is assigned in an asymmetric way, say to the left piece with probability p = 0.25 and to the right piece with probability 1 - p = 0.75:
0.25, 0.75
0.25x0.25, 0.25x0.75, 0.75x0.25, 0.75x0.75
0.25x0.25x0.25, …, 0.25x0.75x0.75, 0.75x0.75x0.75
……
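This binomial multiplicative cascade is easy to generate. The sketch below (an added illustration) builds the leaf masses for the 0.25/0.75 split described above and accumulates them; the cumulative curve is the devil's-staircase-like object compared with the BBN JPD on the following slides.

```python
def binomial_cascade(levels, p=0.25):
    """Binomial multiplicative cascade on the Cantor-set construction: at each
    bisection, a fraction p of an element's mass goes to its left child and
    (1 - p) to its right child.  Returns the 2**levels leaf masses."""
    masses = [1.0]
    for _ in range(levels):
        masses = [m * f for m in masses for f in (p, 1.0 - p)]
    return masses

masses = binomial_cascade(8)          # 256 leaves, like the 256 states of Asia
print(max(masses), min(masses))       # the leaf masses span several orders of magnitude

cumulative, total = [], 0.0
for m in masses:
    total += m
    cumulative.append(total)          # plotting this gives the staircase-like curve
```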

Random multifractal?
- Both are self-affine.
- Both have clusters of high peaks.
- In both, more of the mass is assigned toward the right end.
- The BBN JPD is more irregular and more random, and its probability distributions are more extreme.

The cumulative mass distributions: three Devil's Staircases

Future Problems
- Construct proper "random multifractal models" for given Bayesian networks.
- Locate the "clusters of high peaks" from the fractal model. Once we have that information, other algorithms (for example genetic algorithms) can be used to explore the local optima.
- Identify "backbone nodes" that help find these "high peaks".
- Use these "high peaks" to do approximate inference.
- Design an anytime inference algorithm from the above scheme.
- Identify the characteristics of the BBNs on which this algorithm works best (they should be related to the asymmetry of the BBNs' CPTs).

The End. Any questions? Thank you!