Learning from spectropolarimetric observations
A. Asensio Ramos, Instituto de Astrofísica de Canarias
aasensio.github.io/blog


Learning from observations is an ill-posed problem

Follow these four steps:
1. Understand your problem
2. Understand the model that 'generates' your data
3. Define a merit function
4. Compute the 'best' fit by optimizing or sampling this merit function
The solution to any model-fitting problem has to be probabilistic.

Understand your problem
- Your data have been obtained with an instrument
- Your synthetic model might not explain what you see
- You surely do not fully understand your errors
- Systematics
- …

Understand your generative model
This is the most important and complex part of the inference. Example assumptions for a generative model:
- The x_i are fixed and given with zero uncertainty
- The uncertainty in the measurement is Gaussian, with zero mean and diagonal covariance
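The assumptions above can be turned into a concrete toy generative model. The straight line and all numbers below are hypothetical stand-ins for the real spectropolarimetric forward model:

```python
import numpy as np

# Illustrative generative model: a straight line with additive Gaussian noise.
# The parameters (a, b) and noise level sigma are hypothetical choices; the
# assumptions mirror the slide: the x_i are fixed and noise-free, and the
# measurement noise is Gaussian with zero mean and diagonal covariance.
rng = np.random.default_rng(0)

a_true, b_true, sigma = 2.0, 1.0, 0.3
x = np.linspace(0.0, 1.0, 50)                  # abscissae: zero uncertainty
y_clean = a_true * x + b_true                  # deterministic part of the model
y = y_clean + rng.normal(0.0, sigma, x.size)   # observed, noisy data

print(y.shape)
```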

From the generative model to the merit function
The likelihood is the probability that the measured data have been generated by the model.

Why do we do χ² fitting?
Standard least-squares fitting comes from the maximization of a Gaussian likelihood.
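A minimal numerical check of this equivalence, on hypothetical straight-line data: for Gaussian noise the negative log-likelihood is χ²/2 plus a parameter-independent constant, so both merit functions share the same minimizer.

```python
import numpy as np

# Toy data: a one-parameter line with Gaussian noise. All numbers illustrative.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)
sigma = 0.2
y = 1.5 * x + rng.normal(0.0, sigma, x.size)

def chi2(a):
    return np.sum((y - a * x) ** 2 / sigma ** 2)

def neg_log_like(a):
    # -ln L = chi^2 / 2 + constant independent of the slope a
    return 0.5 * chi2(a) + x.size * np.log(sigma * np.sqrt(2.0 * np.pi))

grid = np.linspace(0.5, 2.5, 2001)
a_ml = grid[np.argmin([neg_log_like(a) for a in grid])]
a_ls = grid[np.argmin([chi2(a) for a in grid])]
print(a_ml, a_ls)   # identical: the two merit functions share one minimizer
```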

Some subtleties about weights:
- Rescaling the weights does not change the position of the maximum
- It does modify the curvature at the maximum
- If the noise statistics change, modify the likelihood accordingly
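The first two points can be illustrated with a toy χ²: multiplying all error bars by a constant leaves the minimum in place and only rescales the curvature (and hence the inferred uncertainties). Data and numbers are hypothetical.

```python
import numpy as np

# Toy line data with hypothetical noise level.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 30)
y = 0.8 * x + rng.normal(0.0, 0.1, x.size)

def chi2(a, sigma):
    return np.sum((y - a * x) ** 2) / sigma ** 2

grid = np.linspace(0.0, 2.0, 4001)
c1 = np.array([chi2(a, 0.1) for a in grid])   # nominal error bars
c2 = np.array([chi2(a, 0.2) for a in grid])   # all error bars doubled

same_minimum = grid[c1.argmin()] == grid[c2.argmin()]
curvature_ratio = (c1.max() - c1.min()) / (c2.max() - c2.min())  # factor 4
print(same_minimum, curvature_ratio)
```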

Be aware of the assumptions:
- Errors are Gaussian
- You know the errors (it is difficult to estimate uncertainties in the errors, because errors are already a 2nd-order statistic)
- Errors are only on the y axis (the x locations are given with infinite precision)
- The model includes the truth

What if we break the assumptions?
- Errors are not Gaussian
- We don't know the errors
- Errors are also on the x axis
- The model does not include the truth
Any of our assumptions might be broken.

Without outliers

With outliers, we get biased results

Model everything: if you model both the points from the line and the bad points (outliers), you automatically have a generative model and a merit function to optimize.
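A sketch of such a mixture likelihood, with hypothetical noise levels and outlier fraction: every datum is either a regular point from the line (narrow Gaussian) or an outlier from a much wider Gaussian. Maximizing the mixture likelihood recovers the slope where plain least squares is biased.

```python
import numpy as np

# Toy contaminated data: slope 1 line, narrow noise, a few points shifted up.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 60)
sigma, sigma_out, p = 0.1, 2.0, 0.1    # hypothetical widths and outlier fraction
y = 1.0 * x + rng.normal(0.0, sigma, x.size)
y[::10] += 3.0                         # contaminate a few points

def gauss(r, s):
    return np.exp(-0.5 * (r / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def loglike_mix(a):
    # per-point mixture: (1-p) * "point from the line" + p * "bad point"
    r = y - a * x
    return np.sum(np.log((1 - p) * gauss(r, sigma) + p * gauss(r, sigma_out)))

def chi2(a):
    return np.sum((y - a * x) ** 2)

grid = np.linspace(0.0, 2.0, 2001)
a_mix = grid[np.argmax([loglike_mix(a) for a in grid])]
a_ls = grid[np.argmin([chi2(a) for a in grid])]
print(a_mix, a_ls)   # least squares is pulled away from the true slope
```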

Fitting He I 10830 Å profiles

Hazel github.com/aasensio/hazel MIT license

Assumptions + properties:
- Multi-term atom
- Simplified but realistic radiative-transfer effects
- One or two components (along the LOS or inside the pixel)
- Magneto-optical effects
- MIT license
- MPI with a master-slave scheme; scales almost linearly with N-1 (tested with up to 500 CPUs)
- Python wrapper for synthesis

[Grotrian diagram of the He I triplet system: levels 2s ³S, 3s ³S, 2p ³P, 3p ³P and 3d ³D, with the transitions labelled in Å]

Forward modelling

Problems with inversion:
- Robustness
- Sensitivity to parameters
- Ambiguities

Robustness: two-step inversion
1. Global convergence → DIRECT algorithm (Jones et al. 1993)
2. Refinement → Levenberg-Marquardt
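The two-step strategy can be sketched with SciPy on a toy problem with many local minima. Note the hedge: `differential_evolution` stands in for the DIRECT algorithm as the global stage, followed by a local Levenberg-Marquardt refinement via `least_squares`.

```python
import numpy as np
from scipy.optimize import differential_evolution, least_squares

# Toy model: a damped sinusoid whose frequency axis has many local minima.
rng = np.random.default_rng(4)
x = np.linspace(0.0, 4.0 * np.pi, 200)
y = np.sin(2.5 * x) * np.exp(-0.3 * x) + rng.normal(0.0, 0.02, x.size)

def residuals(theta):
    freq, decay = theta
    return y - np.sin(freq * x) * np.exp(-decay * x)

# Step 1: global search over a box (stand-in for DIRECT)
glob = differential_evolution(lambda t: np.sum(residuals(t) ** 2),
                              bounds=[(0.1, 5.0), (0.0, 1.0)], seed=0)

# Step 2: local Levenberg-Marquardt refinement from the global optimum
ref = least_squares(residuals, glob.x, method="lm")
print(ref.x)
```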

Sensitivity to parameters: cycles
Modify the weights and do cycles:
- Cycle 1: invert the thermodynamical properties (v_th, v_Dopp, …) using Stokes I
- Cycle 2: invert the magnetic field vector using Stokes Q, U, V
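A minimal sketch of per-cycle Stokes weighting; the weight values and noise level below are illustrative, not Hazel's actual defaults:

```python
import numpy as np

# Hypothetical per-cycle weights: each cycle re-weights the merit function so
# that it is sensitive to the parameter subset being inverted.
cycle_weights = {
    1: {"I": 1.0, "Q": 0.0, "U": 0.0, "V": 0.0},   # thermodynamics from Stokes I
    2: {"I": 0.1, "Q": 1.0, "U": 1.0, "V": 1.0},   # field vector from Q, U, V
}

def weighted_chi2(obs, syn, noise, cycle):
    """obs, syn: dicts mapping Stokes key ('I','Q','U','V') -> profile array."""
    w = cycle_weights[cycle]
    return sum(w[k] * np.sum((obs[k] - syn[k]) ** 2 / noise ** 2)
               for k in "IQUV")

# Toy profiles on an 11-point wavelength grid
lam = np.linspace(-1.0, 1.0, 11)
obs = {k: np.zeros_like(lam) for k in "IQUV"}
syn = {k: np.full_like(lam, 0.1) for k in "IQUV"}
print(weighted_chi2(obs, syn, noise=0.1, cycle=1))
```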

Ambiguities

Ambiguities: off-limb approach
In the saturation regime (above ~40 G for He I 10830):
- Do a first inversion with Hazel
- Use the saturation regime to find the ambiguous solutions (<8)

Ambiguities: off-limb approach
- Do a first inversion with Hazel
- Use the saturation regime to find the ambiguous solutions (<8)
- For each solution, use Hazel to refine the inversion
This is now done almost automatically with Hazel.

Where to go from here?
- Do full Bayesian inversion
- Model comparison
- Inversions with constraints
- Model everything, including systematics, and integrate out nuisance parameters

Bayesian inference: PyHazel + PyMultiNest

Model comparison
H0: a single Gaussian
H1: two Gaussians of equal width but unknown amplitude ratio


Model comparison
ln R = 2.22 → weak-to-moderate evidence in favor of model 1
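How such an evidence ratio ln R can be computed in principle, on a toy problem (not the He I example above): the evidence of each hypothesis is its likelihood integrated over its free parameters, and R is the ratio of the two evidences.

```python
import numpy as np

# Toy Bayes-factor computation by direct integration (hypothetical data).
# H0: the data have zero mean, no free parameters.
# H1: unknown mean mu with a standard normal prior.
rng = np.random.default_rng(5)
sigma = 1.0
d = rng.normal(0.3, sigma, 50)          # data with a small true offset

def loglike(mu):
    return (-0.5 * np.sum((d - mu) ** 2) / sigma ** 2
            - d.size * np.log(sigma * np.sqrt(2.0 * np.pi)))

lnZ0 = loglike(0.0)                     # H0 evidence: no integration needed

# H1 evidence: integrate likelihood * prior over mu on a grid
mu = np.linspace(-3.0, 3.0, 6001)
prior = np.exp(-0.5 * mu ** 2) / np.sqrt(2.0 * np.pi)     # N(0, 1) prior
like = np.exp(np.array([loglike(m) for m in mu]) - lnZ0)  # stabilized
lnZ1 = lnZ0 + np.log(np.sum(like * prior) * (mu[1] - mu[0]))

lnR = lnZ1 - lnZ0                       # ln of the Bayes factor, H1 vs H0
print(lnR)
```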

Constraints

Central stars of planetary nebulae

Bayesian hierarchical model
[Graphical model: a common hyperparameter b0 feeds the per-star parameters (B1, μ1), (B2, μ2), (B3, μ3); each of these feeds its own model node with observables F and V]
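The structure of such a hierarchical model can be sketched as a forward simulation; the distributions and numbers below are hypothetical, chosen only to show the hyperparameter → per-object parameter → data hierarchy:

```python
import numpy as np

# Forward sketch of a hierarchical model: a population-level hyperparameter
# controls the distribution from which each object's field strength B_i is
# drawn, and each B_i in turn generates that object's noisy observations.
rng = np.random.default_rng(6)

b0 = 50.0                  # hyperparameter: hypothetical population mean (G)
n_objects, n_obs = 3, 20

B = rng.normal(b0, 10.0, n_objects)           # per-object parameters
data = [B_i + rng.normal(0.0, 5.0, n_obs)     # per-object observations
        for B_i in B]

# In inference one would put a prior on b0 and integrate out the B_i.
print([round(float(np.mean(d)), 1) for d in data])
```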

Are solar tornadoes and barbs the same?
- Full-Stokes He I line at 1083 nm (VTT + TIP II)
- Imaging at the core of the Hα line (VTT, diffraction-limited MOMFBD)
- Imaging at the core of the Ca II K line (VTT, diffraction-limited MOMFBD)
- Imaging from SDO
- Core of the He I line at 1083 nm (~0.8″)

Coincidence with tornadoes in AIA

"Vertical" solutions: field inclination

"Horizontal" solutions: field inclination

The magnetic field is robust:
- Fields are statistically below 20 G
- Some regions reach larger field strengths
- Filamentary vertical structures appear in the magnetic field strength

Conclusions
- Be aware of your assumptions
- Model everything if possible
- Hazel is freely available
- Ambiguities can be problematic
- More work is needed to bring chromospheric inversions to the level of photospheric inversions

Announcement: IAC Winter School on Bayesian Astrophysics, La Laguna, November 3–14, 2014

Inversion procedure
1. Propose a set of parameters (B, …, v_th, v_mac, a)
2. Compute the radiation field and its anisotropy
3. Solve the SEE equations
4. Solve the RT equation → emergent Stokes profiles
5. Evaluate the statistical estimator χ² against the observed Stokes profiles
6. If χ² is smaller than the previous value, save the parameters
7. If converged, exit; otherwise propose another set of parameters and repeat
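The loop in this flowchart can be sketched as a runnable toy, with a simple Gaussian absorption profile standing in for the SEE + RT synthesis step:

```python
import numpy as np

# Toy version of the inversion loop: propose parameters, synthesize, evaluate
# chi^2, keep improvements, stop on convergence. The real code solves the SEE
# and RT equations at the "synthesize" step; a Gaussian line stands in here.
rng = np.random.default_rng(7)
lam = np.linspace(-2.0, 2.0, 101)

def synthesize(depth):                     # stand-in for SEE + RT solution
    return 1.0 - depth * np.exp(-lam ** 2)

obs = synthesize(0.6) + rng.normal(0.0, 0.01, lam.size)
sigma = 0.01

def chi2(depth):
    return np.sum((obs - synthesize(depth)) ** 2 / sigma ** 2)

best, best_chi2 = 0.1, chi2(0.1)
for _ in range(2000):                      # propose another set of parameters
    trial = best + rng.normal(0.0, 0.05)
    c = chi2(trial)
    if c < best_chi2:                      # chi^2 smaller than previous? save
        best, best_chi2 = trial, c
    if best_chi2 < 1.2 * lam.size:         # crude convergence check
        break
print(round(best, 2))
```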