Principal Components & Neural Networks: How I finished second in the Mapping Dark Matter Challenge. Sergey Yurgenson, Harvard University. Pasadena, 2011.


Scientific view: measure the ellipticity of 60,000 simulated galaxies (Kitching, 2011).

Data mining view: the training set contains 40,000 examples with known ellipticities (e1, e2); the test set contains 60,000 examples for which e1 and e2 must be predicted. The task is to find a regression function g: P -> e that maps pixel data P to ellipticity e. Supervised learning is used to find g, and g does not need to be justified in any scientific way!

First attempt (Matlab): a neural network trained directly on the raw pixels. There are too many input parameters, many of which are nothing more than noise; training is slow and the result is not very good. Remedy: reduce the number of parameters and make them "more meaningful".

Principal components are used to reduce the number of input parameters. A neural network with PCs as inputs reaches RMSE ~ 0.0155.
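A minimal Matlab sketch of this compression step, assuming the flattened training and test images sit one per row in Xtrain and Xtest and using the Statistics Toolbox pca function (called princomp in releases of that era); the variable names are illustrative, and the cut at 38 components matches the later slide:

```matlab
% Project raw images onto their leading principal components.
[coeff, scoreTrain] = pca(Xtrain);     % PC directions and training scores
nPC = 38;                              % keep only the leading components
pcTrain = scoreTrain(:, 1:nPC);        % compressed training inputs
mu = mean(Xtrain, 1);                  % training mean used for centering
pcTest = bsxfun(@minus, Xtest, mu) * coeff(:, 1:nPC);  % project test images
```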

Implicit use of additional information about the data set:
- The 2D matrices are images of objects, and the objects have a meaningful center.
- Calculate the center of mass with a threshold.
- Center the pictures using spline interpolation.
- Recalculate the principal components.
- Fine-tune the center position using the amplitude of the antisymmetric components.
(Figure: an original image next to its centered version.)
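A sketch of the centering of a single image I, assuming a simple fixed threshold thr (both the threshold and the variable names are illustrative):

```matlab
[n, m] = size(I);
[xg, yg] = meshgrid(1:m, 1:n);
J = I .* (I > thr);                       % threshold to suppress background
xc = sum(J(:) .* xg(:)) / sum(J(:));      % center of mass, x
yc = sum(J(:) .* yg(:)) / sum(J(:));      % center of mass, y
dx = xc - (m + 1) / 2;                    % offset from the geometric center
dy = yc - (n + 1) / 2;
% Resample so the center of mass lands on the geometric center.
Ic = interp2(xg, yg, I, xg + dx, yg + dy, 'spline', 0);
```

The fine-tuning step then nudges (dx, dy) so that the amplitudes of the antisymmetric principal components are minimized.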

Principal components after center recalculation.

Principal components - stars.

Components #2 and #3: a linear regression using only components 2 and 3 already gives RMSE ~ 0.02. (Scatter plots of component 2 against component 3, colored by 2*theta and by (a-b)/(a+b).) The targets are the ellipticity components e1 = [(a-b)/(a+b)]*cos(2*theta) and e2 = [(a-b)/(a+b)]*sin(2*theta), where a and b are the semi-major and semi-minor axes and theta is the position angle.
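A sketch of this baseline, reusing the pcTrain/pcTest scores from above and assuming e1 and e2 are column vectors of training targets; ordinary least squares is fit separately to each component:

```matlab
% Intercept plus principal components 2 and 3 as predictors.
A = [ones(size(pcTrain, 1), 1), pcTrain(:, 2:3)];
w1 = A \ e1;                              % least-squares weights for e1
w2 = A \ e2;                              % least-squares weights for e2
At = [ones(size(pcTest, 1), 1), pcTest(:, 2:3)];
pred = [At * w1, At * w2];                % baseline predictions (RMSE ~ 0.02)
```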

Final neural network:
- Inputs: 38 galaxy PCs + 8 star PCs.
- 2 hidden layers: 12 neurons (linear transfer function) and 8 neurons (sigmoid transfer function).
- 2 outputs, with e1 and e2 as targets.
- 80% random training subset, 20% validation subset.
- Multiple trainings of numerous networks, achieving training RMSE < 0.015; typical test RMSE = –.
- Small score improvement by combining the predictions of many networks (simple mean): training RMSE ~, public RMSE ~, private RMSE ~; the benefit of network combination is ~.
- Best submission: the mean of 35 NN predictions.
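A sketch of one such network in the Matlab Neural Network Toolbox; the layer sizes, transfer functions, and 80/20 split come from the slide, while the variable names (pcGal, pcStar, testInputs, holding the test-set PCs in the same layout) and the loop over 35 networks are illustrative:

```matlab
inputs  = [pcGal, pcStar]';               % 46 rows: 38 galaxy + 8 star PCs
targets = [e1, e2]';                      % 2 rows: e1 and e2
nNets = 35;
preds = zeros(2, size(testInputs, 2), nNets);
for k = 1:nNets
    net = feedforwardnet([12 8]);         % two hidden layers
    net.layers{1}.transferFcn = 'purelin';  % 12 linear neurons
    net.layers{2}.transferFcn = 'logsig';   % 8 sigmoid neurons
    net.divideParam.trainRatio = 0.80;    % random 80% training subset
    net.divideParam.valRatio   = 0.20;    % 20% validation subset
    net.divideParam.testRatio  = 0.00;
    net = train(net, inputs, targets);    % backpropagation training
    preds(:, :, k) = net(testInputs);     % predict e1, e2 for the test set
end
submission = mean(preds, 3);              % simple mean over the 35 networks
```

Averaging independently trained networks reduces the variance of the prediction, which is where the small combination benefit reported on the slide comes from.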

(Figure: training-set and test-set distributions, with the standard deviation of each shown on the slide.)

(Table: training RMSE and test RMSE for the original data and for degraded versions of it - four reduced bit resolutions, a changed pixel size, and three levels of added noise.)
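A sketch of how such degraded inputs can be produced for one image I before re-running the pipeline; the bit depth and noise level below are placeholders, not the values from the slide:

```matlab
nbits = 8;                                % placeholder bit resolution
q = (2^nbits - 1) / max(I(:));
Iq = round(I * q) / q;                    % image re-quantized to nbits
sigma = 0.01;                             % placeholder noise level
In = I + sigma * randn(size(I));          % image with added Gaussian noise
```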

Questions: The method is strongly data dependent. How will it perform on a more diverse data set and on real data? Is there a place for this kind of method in cosmology?