Rick Quax, postdoctoral researcher in Computational Science, University of Amsterdam. EU FP7 projects: Nudge control through information processing.

Presentation transcript:

Rick Quax, postdoctoral researcher in Computational Science, University of Amsterdam. EU FP7 projects: Nudge control through information processing.

2 Entropy of a coin flip (Shannon's information theory). The outcome of a fair coin flip carries 1 bit of information: I need 1 bit to fully describe the outcome. The outcome of a coin flip that is already known beforehand carries 0 bits: I need 0 bits to describe it. In general, the entropy is H(X) = -\sum_x p(x) \log_2 p(x); for a coin with heads probability p this is H(p) = -p \log_2 p - (1-p) \log_2 (1-p), which equals 1 bit at p = 1/2 and 0 bits at p = 0 or p = 1.
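To make the formula concrete, here is a minimal sketch (my own illustration, not from the slides) computing the entropy of the two coins with only the Python standard library:

```python
# Minimal sketch (illustration, not from the slides): Shannon entropy
# H(X) = -sum_x p(x) * log2(p(x)) of a discrete distribution.
from math import log2

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0.0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([1.0, 0.0]))  # outcome known beforehand: 0.0 bits
```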

3 Mutual information. [Figure: a communication channel. The input X = 0 or 1 passes through a transform, noise, or a (non-linear) function to produce an output Y; inference runs back from Y to X.] How much information was transferred? Examples: if Y copies X perfectly, 1 bit is transferred (perfect transmission); if Y is independent of X, 0 bits are transferred (no transmission).

4 Mutual information. [Same channel figure: X = 0 or 1 passes through a transform / noise / (non-linear) function to give Y.] How much information was transferred? In general: I(X:Y) = H(X) - H(X|Y), the a priori uncertainty about X minus the uncertainty remaining after knowing Y. In direct formula: I(X:Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x) p(y)}.
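As a sketch of the direct formula (my own code, with an assumed table layout p_xy[x][y]), mutual information can be evaluated straight from the joint distribution; the two examples from the previous slide fall out as 1 bit and 0 bits:

```python
# Sketch of I(X:Y) = sum_{x,y} p(x,y) * log2(p(x,y) / (p(x) p(y))).
from math import log2

def mutual_information(p_xy):
    """I(X:Y) in bits from a joint probability table p_xy[x][y]."""
    p_x = [sum(row) for row in p_xy]        # marginal of X
    p_y = [sum(col) for col in zip(*p_xy)]  # marginal of Y
    return sum(p_xy[x][y] * log2(p_xy[x][y] / (p_x[x] * p_y[y]))
               for x in range(len(p_xy))
               for y in range(len(p_xy[x]))
               if p_xy[x][y] > 0.0)

print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # perfect transmission: 1.0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # no transmission: 0.0
```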

5 Mutual information. [Same channel figure.] How much information was transferred? In general I(X:Y) = H(X) - H(X|Y); assuming a uniform input, p(X=x) = 0.5, the a priori uncertainty is H(X) = 1 bit, so I(X:Y) = 1 - H(X|Y).
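Concretely, if the channel is taken to be a binary symmetric channel that flips the bit with probability eps (an assumption for illustration; the slides leave the channel generic), the uniform input gives I(X:Y) = 1 - H_b(eps), with H_b the binary entropy:

```python
# Assumed example: uniform input through a binary symmetric channel.
# Then I(X:Y) = H(X) - H(X|Y) = 1 - H_b(eps).
from math import log2

def binary_entropy(eps):
    return -sum(p * log2(p) for p in (eps, 1.0 - eps) if p > 0.0)

for eps in (0.0, 0.1, 0.5):
    print(f"eps={eps}: I(X:Y) = {1.0 - binary_entropy(eps):.3f} bits")
# eps=0.0 -> 1.000 bits (perfect transmission); eps=0.5 -> 0.000 bits (pure noise)
```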

Nudge control: information flow. [Figure: information flowing through chains of variables in the system.]

Information integration. [Figure: several i.i.d. input variables feed one transform.] A communication channel with multiple inputs is a computation!

Nudge control using information flow

Nudge control: information flow. [Figure: a stochastic causal relation modeled as a communication channel, i.e., as a transform of its input, again assuming p(X=x) = 0.5.]

Information integration. The control has 100% efficiency when all entropy in Z comes from the controller β′. [Figure: one controller configuration doesn't work; the other works perfectly.]

Information flow in networks. [Figure: a small network of nodes A, B, C, D.] Key quantities: the information dissipation length and the information dissipation time. Which is the most influential node in the network?

Relation to causal structure. Causal structure (e.g., a structural equation model, SEM): if I nudge X to X + dX then I can solve for dY, i.e., I know exactly how Y changes due to X. Information flows: if I nudge X then I can estimate |dY|, i.e., I know the impact magnitude but not its form. This is an approximation which is easier to obtain!
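A hypothetical numerical contrast (the model, names, and numbers are my assumptions): with a solved linear SEM Y = aX + noise, the nudge X -> X + dX yields dY = a*dX exactly, while the information-flow view only promises the impact magnitude, here checked empirically as a shift in Y:

```python
# Hypothetical linear SEM Y = a*X + noise (assumed for illustration).
import random

random.seed(0)
a, dX, n = 2.0, 0.1, 100_000
X = [random.gauss(0.0, 1.0) for _ in range(n)]
E = [random.gauss(0.0, 0.5) for _ in range(n)]
Y_base  = [a * x + e for x, e in zip(X, E)]         # un-nudged system
Y_nudge = [a * (x + dX) + e for x, e in zip(X, E)]  # after the nudge

exact_dY = a * dX                              # solved from the SEM
observed = sum(Y_nudge) / n - sum(Y_base) / n  # empirical impact magnitude
print(exact_dY, round(observed, 3))            # both ~0.2
```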

Obstacle: ambiguity. Information flow is not always uniquely identifiable. [Figure: X feeds A and B, which both feed C.] Information from X flows into C, but does it flow through A or through B (or both)? I(C:A) > 0 and I(C:B) > 0 even if A does not causally influence C at all, due to correlation ('information overlap').

Solution: disambiguate by nudging. Nudge A by adding a small perturbation dA and test: is I(C:dA) > 0? Is I(C:dB) > 0? After all correlations are resolved this way, information flow equals causality (in magnitude). [Figure: the same network with the perturbation dA injected into A.]
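The ambiguity and its resolution can be simulated (a Monte-Carlo sketch under assumed noise levels, not the author's code): X drives A and B, C is caused by B only, yet I(C:A) > 0 through the shared source; nudging A leaves C untouched, nudging B does not:

```python
import random
from collections import Counter
from math import log2

random.seed(1)

def mi(pairs):
    """Plug-in estimate of I(U:V) in bits from a list of (u, v) samples."""
    n = len(pairs)
    p_uv = Counter(pairs)
    p_u = Counter(u for u, _ in pairs)
    p_v = Counter(v for _, v in pairs)
    return sum((c / n) * log2((c * n) / (p_u[u] * p_v[v]))
               for (u, v), c in p_uv.items())

def noisy(v, keep=0.9):
    """Noisy copy: keep v with probability `keep`, else flip it."""
    return v if random.random() < keep else 1 - v

rows = []
for _ in range(200_000):
    X = 1 if random.random() < 0.8 else 0  # biased source (assumption)
    A, B = noisy(X), noisy(X)              # A, B: noisy copies of X
    C = noisy(B)                           # C is caused by B only, not by A
    dA = random.randint(0, 1)              # nudge on A: C is unaffected
    dB = random.randint(0, 1)              # nudge on B: C must be recomputed
    C_after = noisy(B ^ dB)
    rows.append((A, C, dA, dB, C_after))

print(round(mi([(c, a)   for a, c, _, _, _  in rows]), 3))  # I(C:A)   > 0, overlap only
print(round(mi([(c, da)  for _, c, da, _, _ in rows]), 3))  # I(C:dA) ~= 0, no causal path
print(round(mi([(c2, db) for _, _, _, db, c2 in rows]), 3)) # I(C':dB) > 0, causal path
```

For numerical clarity the nudge here is a full bit-flip half the time; the slide's point concerns small perturbations, where the same signs hold with smaller magnitudes.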

Why ‘nudging’? The direct formula of mutual information is made up of probability densities. If the control is small, then the perturbed densities stay close to the originals and we can assume that the calculated information flows still predict the impact. If the control is too large, then the information flows may change completely. But that is acceptable: a single nudge can still have a large effect due to non-linearities (unlike in linear models), and a series of persistent nudges can gradually transform the system to the desired behavior.
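The continuity argument can be checked numerically (an assumed toy channel, not from the slides): perturbing p(X) slightly barely moves I(X:Y), while a large control changes it substantially:

```python
from math import log2

def mi_bsc(p1, eps=0.1):
    """I(X:Y) for p(X=1)=p1 through a channel flipping the bit w.p. eps."""
    def h(probs):
        return -sum(p * log2(p) for p in probs if p > 0.0)
    q1 = p1 * (1.0 - eps) + (1.0 - p1) * eps        # p(Y=1)
    return h([q1, 1.0 - q1]) - h([eps, 1.0 - eps])  # H(Y) - H(Y|X)

base = mi_bsc(0.5)
print(round(abs(mi_bsc(0.51) - base), 4))  # small nudge: ~0.0002 bits
print(round(abs(mi_bsc(0.95) - base), 4))  # large control: ~0.4 bits
```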

Conclusion. Control theory requires a causality structure, and obtaining the causality structure requires solving the system dynamics, which is generally impossible for complex systems. We therefore approximate the causality structure by the information-flow structure, disambiguating with nudge controllers where necessary. Information flow predicts the impact magnitude of a small perturbation (nudge). Applications include ecosystems, medical disorders, and transport systems: low cost, minimal side effects.

The Sophocles project receives funding from the EC's Seventh Framework Programme (FP7/…) under grant agreement n° …. Thanks! Questions?

Information storage