Coryn Bailer-Jones, ADASS XVII, September 2007, Kensington. A method for exploiting domain information in parameter estimation. Coryn Bailer-Jones, Max Planck Institute for Astronomy.


A method for exploiting domain information in parameter estimation. Coryn Bailer-Jones, Max Planck Institute for Astronomy, Heidelberg. Collaborators: Carola Tiede, Kester Smith, Christian Elting (MPIA group)

The problem ● estimate stellar astrophysical parameters (APs) from multidimensional (spectral) data, given a grid of templates ● strong and weak APs

“standard” approaches ● use a labelled grid (e.g. synthetic spectra) ● minimum distance methods (k-nearest neighbours) – what weighting/scaling of the data dimensions? – limited by grid resolution ● learn the mapping data → APs (global interpolation) – e.g. support vector machines, neural networks,... – implicitly infer the sensitivity of the data to the APs
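The minimum-distance approach can be sketched as follows. The template fluxes, AP labels, and weights here are hypothetical illustration values, not the actual grid; the open question flagged on the slide is precisely how to choose the per-band weights.

```python
import numpy as np

# Hypothetical template grid: each row is a spectrum (fluxes in 3 bands),
# labelled with astrophysical parameters (APs), here (Teff [K], logg [dex]).
templates = np.array([
    [1.0, 0.8, 0.5],
    [0.9, 0.9, 0.6],
    [0.5, 1.0, 0.9],
])
labels = np.array([
    [5000.0, 4.5],
    [5500.0, 4.0],
    [6500.0, 3.5],
])

def nearest_template(x, weights=None):
    """Minimum-distance AP estimate: weighted squared Euclidean distance
    in data space; returns the labels of the closest template."""
    if weights is None:
        weights = np.ones_like(x)          # unweighted by default
    d2 = ((templates - x) ** 2 * weights).sum(axis=1)
    return labels[np.argmin(d2)]

ap = nearest_template(np.array([0.97, 0.83, 0.53]))  # closest to template 0
```

Note that the estimate is limited to labels present on the grid (the resolution limit the slide mentions); a k > 1 variant would average the labels of the k nearest templates.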

1. It's an inverse problem [figure: flux in a band vs. log(Teff)]

1. It's an inverse problem [figure: flux in a band vs. log(Teff)] — theoretically, a global model cannot fit this in general.
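A toy example of why the inversion fails: with a non-monotonic forward model (a hypothetical quadratic, chosen only for illustration), two different temperatures produce the same flux, so the inverse mapping flux → log(Teff) is not a function and no single global regression can recover both solutions.

```python
import numpy as np

# Toy forward model: flux in one band as a non-monotonic function of
# log(Teff). The functional form is invented purely to illustrate
# the degeneracy, not taken from the talk.
def forward(log_teff):
    return (log_teff - 3.7) ** 2 + 0.1

# Two distinct APs give the same observed flux:
f_cool = forward(3.6)
f_hot = forward(3.8)
# f_cool == f_hot, so a regression trained on (flux -> log_teff) pairs
# would be asked to map one input to two different outputs.
```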

2. We can fit a forward model [figures: flux vs. log(Teff), flux vs. log g]

Local grid interpolation [figure]

General case: multiple bands and APs [figure]

Estimate sensitivities [figure]

Local linear variant ● use local linear J-dimensional interpolation as the forward model ● interpolation over neighbours as measured in AP space – chosen to “bracket” the current AP estimate ● the coefficients of the plane are the reciprocal sensitivities
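A minimal sketch of the local plane fit, with made-up grid values: four bracketing neighbours in (log Teff, log g) space and one band's flux, fitted by least squares. Fitted in the forward direction as here, the slope coefficients approximate the band's sensitivities ∂flux/∂AP (the slide's "reciprocal sensitivities" refers to how the coefficients enter when the plane is used to invert for the APs).

```python
import numpy as np

# Hypothetical bracketing neighbours in AP space (J=2: log(Teff), logg)
# around the current AP estimate, with the flux of one band for each.
aps = np.array([
    [3.70, 4.0],
    [3.75, 4.0],
    [3.70, 4.5],
    [3.75, 4.5],
])
flux = np.array([0.50, 0.62, 0.47, 0.59])

# Fit the plane flux = a0 + a1*log(Teff) + a2*logg by least squares.
A = np.column_stack([np.ones(len(aps)), aps])
coef, *_ = np.linalg.lstsq(A, flux, rcond=None)
a0, a1, a2 = coef
# a1, a2 approximate the band's sensitivity to each AP: here the flux
# rises steeply with log(Teff) (strong AP) and barely changes with
# logg (weak AP), mirroring the strong/weak distinction on slide 2.
```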

Applying the model ● J=2 APs (Teff, log g), 234 templates ● I=11 medium photometric bands, SNR=10 per band ● test set: 233 objects ● examples of iterations for two objects (blue is the truth) [figure: AP estimates vs. iteration number]
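The iterative refinement of the AP estimate can be illustrated with a one-band, one-AP Newton-style sketch. This is my own simplification, not the authors' exact scheme: the local linear forward model is inverted using its fitted sensitivity, and the update is repeated until the predicted flux matches the observed one.

```python
import numpy as np

# Toy local linear forward model for one band (hypothetical coefficients,
# matching no particular grid): flux = a1*log(Teff) + a0.
def forward(log_teff):
    return 2.4 * log_teff - 8.14

sensitivity = 2.4        # d(flux)/d(log_teff) from the local plane fit
observed = 0.56          # measured flux in the band

log_teff = 3.60          # initial AP estimate
for _ in range(20):      # Newton-style update; converges in one step
    log_teff += (observed - forward(log_teff)) / sensitivity
```

With a genuinely nonlinear forward model the plane (and hence the sensitivity) would be refitted around each new estimate, which is where the per-object iteration traces on the slide come from.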

Model performance. RMS errors: 1.68 dex in log g, and in log(Teff) an error equivalent to 480 K at 5000 K. Cf. SVM: 0.99 dex (320 K). ● systematic error in the weak AP (intrinsic to the data; also seen with other methods) ● a difficult task: – only 11 bands at SNR=10 – wide AP range, yet a sparse grid – sub-optimal local interpolation (linear)

Summary ● global fitting models ignore the inverse nature of the mapping ● new approach – a local method with local interpolation via a fitted forward model – use of sensitivity information for optimal “weighting” of the data ● competitive performance on low-SNR data (I=11, J=2) ● model under development. Next steps: – (better) nonlinear forward models (smoothness) – tuning/convergence of the local iteration – extend to higher J (fits are J-dimensional, not I-dimensional) – multiple solutions (degeneracies)