A Bayesian approach to COROT light curves analysis
Francisco Jablonski, Felipe Madsen and Walter Gonzalez
Instituto Nacional de Pesquisas Espaciais, São José dos Campos, SP

Abstract
Space missions like COROT will produce a large number of detections of planetary transits. Among the newly detected planetary systems, one expects to find those in which radio emission is significantly correlated with events of ejection of matter in the parent star. The planetary radio emission is exponentially related to the velocity and power of the wind from the parent star. In this context, detecting impulsive events with relative amplitude of the order of 10⁻⁴ in the light curve of the parent star is important for prompt triggering of follow-up observations at radio frequencies. In this work, we investigate the use of a Bayesian approach for the detection of such impulsive events.

Introduction
Jupiter presents non-thermal emission in the kHz to GHz bands:
- below 40 MHz → cyclotron emission
- higher frequencies → synchrotron emission
- average power at the GW level
Many extrasolar planets (ESP) are closer to the parent star than Jupiter (d << 1 AU). The energy injected in their magnetospheres may be orders of magnitude larger than in Jupiter, since M_planet > M_Jup and B_planet > B_Jup. Estimates of the emitted power for extrasolar planets are given in Bastian et al. (2000), Zarka et al. (2001) and Farrel et al. (2003).

The Radio-Optical connection
In Jupiter, the radio emission increases by orders of magnitude after coronal mass ejection (CME) events in the Sun. CMEs produce global irradiance variations of ~10⁻⁴ (rms) in the optical, so COROT photometry can detect them.
Real-time monitoring with COROT would allow early warnings to radio observers to search for radio emission after impulsive events in the parent star → many targets/events could be observed.
Alternatively, off-line analysis of COROT data obtained simultaneously with radio data → fewer targets/events.

Detection of impulsive events
We use a Bayesian approach as in Aigrain & Favata (2002) and Defaÿ et al. (2001):
- Poissonian nature of the photon noise
- a two-rate model (λ1, λ2) for quiescence and flares
- λ2 > λ1 is an important piece of "a priori" information
More elaborate models (including the duration and shape of the flare) could easily be implemented, and selection of the best model is natural in the Bayesian context.
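To make the two-rate picture concrete, the short Python sketch below (not part of the original poster) simulates a Poisson light curve with a quiescent rate λ1 and an elevated rate λ2 during a box-shaped event; the numerical values (lam1, lam2, i0, duration) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

n_points = 1024        # number of samples in the simulated light curve
lam1 = 100.0           # quiescent Poisson rate (counts per sample)
lam2 = 107.0           # elevated rate during the impulsive event (lam2 > lam1)
i0, duration = 381, 40 # start index and duration of the event (a "box")

rate = np.full(n_points, lam1)
rate[i0:i0 + duration] = lam2   # two-rate model: quiescence vs. flare
counts = rng.poisson(rate)      # Poisson photon noise

print("relative amplitude of the event:", (lam2 - lam1) / lam1)
```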

The Bayesian approach (1)
The Bayesian approach allows us to treat the parameters of a model (here represented by a vector θ) as statistical variables of the same nature as the data (represented by a vector D). The connection between the two is given by Bayes' theorem:

    π(θ|D) = L(D|θ) P(θ) / ∫ L(D|θ) P(θ) dθ        (1)

Here L(D|θ) is the likelihood of the data given the parameters θ; P(θ) is the probability distribution representing our "a priori" knowledge of the model; and π(θ|D) is the "a posteriori" probability distribution of the parameters. In general, we are interested in the expected value, or in some measure of the (marginalized) width, of π(θ|D).
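As a minimal illustration of how Eq. (1) can be evaluated in practice (again not from the poster), the sketch below computes the posterior of a single constant Poisson rate on a grid with a uniform prior; the toy counts and grid limits are assumptions. It also shows why the denominator of Eq. (1) can be ignored until a final normalization.

```python
import numpy as np

y = np.array([98, 105, 101, 97, 110, 103])   # toy photon counts (the data D)
lam_grid = np.linspace(80.0, 120.0, 401)     # candidate values of the rate (the parameter)
dlam = lam_grid[1] - lam_grid[0]

# Poisson log-likelihood ln L(D|lam), dropping the lam-independent ln(y_i!) terms
loglike = np.array([np.sum(y * np.log(lam) - lam) for lam in lam_grid])
prior = np.ones_like(lam_grid)               # uniform "a priori" distribution P(lam)

post = np.exp(loglike - loglike.max()) * prior   # unnormalized pi(lam|D)
post /= post.sum() * dlam                        # the denominator of Eq. (1) is just this normalization

mean = np.sum(lam_grid * post) * dlam
width = np.sqrt(np.sum((lam_grid - mean) ** 2 * post) * dlam)
print("posterior mean and width:", mean, width)
```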

The Bayesian approach (2)
The term in the denominator of Equation (1) is a normalization factor that does not change the shape of π(θ|D); in practice it can be ignored in the calculations. Equation (1) can be expressed analytically only in very particular cases. When the number of parameters in θ is large, the expected value or the width of π(θ|D) can be found only by numerical methods. Grid methods, however, become exponentially inefficient as the number of parameters grows (MacKay 2003). The best method to examine π(θ|D) efficiently is the Markov Chain Monte Carlo (MCMC) method (Gilks, Richardson & Spiegelhalter 1996). Given a set of parameters θ, the characteristic feature of a Markov chain is that each state in the space of θ depends only on the immediately previous state of θ.

Implementation of the MCMC
1. Start the chain at t = 0 with state θ_0.
2. Generate a tentative θ' with the proposal transition q(θ'|θ_t) † and evaluate the acceptance probability

       α(θ_t, θ') = min[1, π(θ'|D) / π(θ_t|D)]        (2)

3. Generate a uniform random number U in [0, 1].
4. If U ≤ α(θ_t, θ'), make θ_{t+1} = θ' (that is, accept the transition); if U > α(θ_t, θ'), make θ_{t+1} = θ_t.
5. Increment t.
6. Go to step 2.
† For simplicity, a symmetric q(θ'|θ_t) was chosen (the Metropolis-Hastings algorithm).
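One possible Python realization of steps 1-6, assuming the single-step Poisson model described in the next slide, is sketched below; it is an illustrative reconstruction rather than the authors' code, and the proposal widths, starting point and function names (log_posterior, metropolis) are assumptions.

```python
import numpy as np

def log_posterior(theta, counts, lam1):
    """ln pi(theta|D) up to a constant, for Poisson counts with uniform priors.
    theta = (i0, delta): step location and amplitude of the single-step model."""
    i0, delta = int(round(theta[0])), theta[1]
    if i0 < 1 or i0 >= counts.size or lam1 + delta <= 0:
        return -np.inf                          # outside the prior support
    model = np.full(counts.size, lam1)
    model[i0:] += delta                         # single step of amplitude delta at i0
    return np.sum(counts * np.log(model) - model)

def metropolis(counts, lam1, n_steps=20000, step_sizes=(20.0, 1.0), seed=0):
    rng = np.random.default_rng(seed)
    theta = np.array([counts.size / 2.0, 1.0])  # step 1: initial state theta_0
    logp = log_posterior(theta, counts, lam1)
    chain = np.empty((n_steps, 2))
    for t in range(n_steps):
        proposal = theta + rng.normal(0.0, step_sizes)    # step 2: symmetric (random-walk) proposal
        logp_new = log_posterior(proposal, counts, lam1)
        if np.log(rng.uniform()) <= logp_new - logp:      # steps 3-4: accept with probability alpha
            theta, logp = proposal, logp_new
        chain[t] = theta                                  # steps 5-6: store state, continue
    return chain
```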

Computational details
For Poisson noise, the likelihood in Eqs. (1) and (2) may be written as

    L(D|θ) = Π_{i=1…N} m_i^{y_i} e^{−m_i} / y_i!

where y_i, i = 1…N, is the light curve and m_i is the model for each point i. For computational reasons it is better to express the likelihood ratio of Eq. (2) in logarithmic form. In the case of uniform priors we have

    ln [ π(θ'|D) / π(θ_t|D) ] = Σ_{i=1…N} [ y_i ln(m'_i / m_i) − (m'_i − m_i) ]

The models we examined are

    m_i = λ1 for i < I_0,  m_i = λ1 + Δ for i ≥ I_0            (single step with amplitude Δ)
    m_i = λ1 + Δ for I_0 ≤ i < I_0 + T,  m_i = λ1 otherwise    (box of amplitude Δ and duration T)
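Under the reconstructed formulas above, the helper functions below sketch the two models and the Poisson log-likelihood ratio in Python; the names step_model, box_model and log_likelihood_ratio are illustrative, not taken from the poster.

```python
import numpy as np

def step_model(n, lam1, delta, i0):
    """m_i = lam1 for i < i0, lam1 + delta for i >= i0 (single step of amplitude delta)."""
    m = np.full(n, lam1)
    m[i0:] += delta
    return m

def box_model(n, lam1, delta, i0, T):
    """m_i = lam1 + delta for i0 <= i < i0 + T, lam1 elsewhere (box of amplitude delta, duration T)."""
    m = np.full(n, lam1)
    m[i0:i0 + T] += delta
    return m

def log_likelihood_ratio(y, m_new, m_old):
    """Poisson log-likelihood ratio: sum_i [ y_i ln(m'_i/m_i) - (m'_i - m_i) ]."""
    return np.sum(y * np.log(m_new / m_old) - (m_new - m_old))
```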

Results
Figure 1 - Simulated light curve in which a step occurs at index I_0 = 381. The signal/noise ratio of the event is 0.7.
Figure 2 - The distribution of samples of p(I), that is, the probability that a step occurs at a given location in the light curve of Fig. 1. It is the distribution π(θ|D) marginalized with respect to the amplitude.
Figure 3 - The difference between the mode of p(I) in Fig. 2 and the injected value I_0, as a function of the signal/noise ratio of the event. For each value of signal/noise we generated 100 distinct curves with random locations of I_0, so that an average and standard deviation of ΔI could be calculated.
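The Figure 3 exercise could be sketched roughly as follows; this is an assumption-laden reconstruction (the poster does not state its exact signal/noise definition or sampler settings) that parametrizes the event by its amplitude and reuses the illustrative metropolis() sampler shown earlier.

```python
import numpy as np

def recover_offsets(delta_amp, lam1=100.0, n_points=1024, n_curves=100, seed=1):
    """Mean and standard deviation of Delta_I = mode(p(I)) - I_0 over many simulated curves."""
    rng = np.random.default_rng(seed)
    offsets = []
    for _ in range(n_curves):
        i0_true = int(rng.integers(100, n_points - 100))   # random step location
        rate = np.full(n_points, lam1)
        rate[i0_true:] += delta_amp                         # single-step event of amplitude delta_amp
        counts = rng.poisson(rate)
        chain = metropolis(counts, lam1, n_steps=5000)      # sampler sketched in the MCMC slide
        locations = chain[1000:, 0].round().astype(int)     # discard burn-in, keep samples of I
        mode = np.bincount(locations).argmax()              # mode of p(I)
        offsets.append(mode - i0_true)
    return np.mean(offsets), np.std(offsets)
```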

Discussion
Figure 3 shows that the procedure based on sampling the "a posteriori" distribution π(θ|D), for a model in which the impulsive event is a step, performs quite well. There are no trends in the recovered instants even for S/N ~ 0.5 (at which point it is no longer possible to see the events "by eye"). The uncertainty in the localization of the events at low S/N is obviously important in the context of generating alerts for subsequent observations at radio frequencies.
The next step towards using this method for the detection of impulsive events under more realistic conditions is to introduce a spectrum of intrinsic fluctuations in the simulated light curves (e.g., from the spectrum of solar irradiance fluctuations observed by the VIRGO experiment on SoHO). This would substantially modify the detection thresholds. In that case, it is important to separate the contributions to the global likelihood coming from Poisson noise and from intrinsic fluctuations associated with the signal itself. Another development of interest would be an implementation of the method suitable for real-time use.

Bibliography
Aigrain, S., and Favata, F. 2002, A&A, 395, 625
Bastian, T., Dulk, G.A., and Leblanc, Y. 2000, ApJ, 545, 1058
Defaÿ, C., Deleuil, M., and Barge, P. 2001, A&A, 365, 330
Farrel, W.M., Desch, M.D., Lazio, T.J., Bastian, T., and Zarka, P. 2003, in ASP Conf. Ser. 294, Scientific Frontiers in Research on Extrasolar Planets, ed. D. Deming and S. Seager (San Francisco: ASP), 151
Gilks, W.R., Richardson, S., and Spiegelhalter, D.J. 1996, Markov Chain Monte Carlo in Practice, Chapman & Hall/CRC
MacKay, D.J.C. 2003, Information Theory, Inference, and Learning Algorithms, Cambridge University Press
Zarka, P., Treumann, R.A., Ryabov, B.P., and Ryabov, V.B. 2001, Ap&SS, 277, 293