Stabilization matrix method (Ridge regression) Morten Nielsen Department of Systems Biology, DTU.


A prediction method contains a very large set of parameters
– A matrix for predicting binding of 9-meric peptides has 9x20 = 180 weights
Overfitting is a problem for such data-driven methods
[Figure: data-driven method training, temperature plotted against years]

Model over-fitting (early stopping)
Evaluated on 600 MHC:peptide binding data points: PCC = 0.89
[Figure: performance during training, with the point marked where to stop training]
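
Below is a minimal numpy sketch of what "stop training" means in practice, on synthetic stand-in data (the real evaluation would use the 600 MHC:peptide measurements): train by gradient descent, monitor the error on held-out data after each epoch, and keep the weights from the epoch where that error was lowest.

import numpy as np

# Synthetic stand-in data: noisy linear targets, split into train and test
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
t = X @ rng.normal(size=5) + 0.3 * rng.normal(size=100)
Xtr, ttr, Xte, tte = X[:60], t[:60], X[60:], t[60:]

w = np.zeros(5)
eps = 0.01                                   # learning rate (illustrative value)
best_err, best_w = np.inf, w.copy()
for epoch in range(200):
    for x, ti in zip(Xtr, ttr):              # one gradient step per training example
        w -= eps * (w @ x - ti) * x
    err = np.mean((Xte @ w - tte) ** 2)      # error on the held-out data
    if err < best_err:                       # still improving: remember these weights
        best_err, best_w = err, w.copy()
# best_w holds the weights from just before the held-out error started to rise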

Stabilization matrix method. The mathematics
y = ax + b
2-parameter model: good description, poor fit
y = ax^6 + bx^5 + cx^4 + dx^3 + ex^2 + fx + g
7-parameter model: poor description, good fit
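
A worked numpy sketch of the two models on synthetic data (assumed for illustration): with as many parameters as data points, the degree-6 polynomial reproduces every noisy observation exactly, yet it can swing wildly between them. That is the trade-off above: good fit, poor description.

import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 7)
y = 2 * x + 1 + 0.2 * rng.normal(size=7)    # seven noisy samples of the line y = 2x + 1

line = np.polyfit(x, y, 1)                  # y = ax + b: 2 parameters
poly = np.polyfit(x, y, 6)                  # 7 parameters: interpolates every point exactly

x_new = 0.55                                # a point between the training samples
print("2-parameter model:", np.polyval(line, x_new))   # stays close to the true line
print("7-parameter model:", np.polyval(poly, x_new))   # can land far from it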


SMM training
Evaluated on 600 MHC:peptide binding data points:
λ = 0: PCC = 0.70
λ = 0.1: PCC = 0.78

Stabilization matrix method. The analytic solution
w = (H^T H + λI)^−1 H^T t
Each peptide is represented as 9*20 = 180 numbers
H is a stack of such vectors of 180 values, one row per data point
t is the vector of target values (the measured binding)
λ is a parameter introduced to suppress the effect of noise in the experimental data and lower the effect of overfitting
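
The closed-form solution in numpy, as a sketch with random stand-in data (a real H would hold the 180-dimensional encodings of the measured peptides):

import numpy as np

def smm_analytic(H, t, lam):
    """SMM/ridge weights: w = (H^T H + lambda*I)^(-1) H^T t."""
    A = H.T @ H + lam * np.eye(H.shape[1])
    return np.linalg.solve(A, H.T @ t)      # solving the system beats forming the inverse

# Stand-in data with the dimensions from the slide: 600 peptides, 9*20 = 180 features
rng = np.random.default_rng(0)
H = rng.normal(size=(600, 180))
t = rng.normal(size=600)
w = smm_analytic(H, t, lam=0.1)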

SMM - Stabilization matrix method - the numerical solution
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o]
Linear function: o = Σ_i w_i I_i
E = Σ_data 1/2 (o − t)^2 + λ Σ_i w_i^2   (first sum over data points, second over weights)

SMM - Stabilization matrix method
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o = Σ_i w_i I_i]
Per-target error: E = 1/2 (o − t)^2 + λ Σ_i w_i^2
Global error: E = Σ_data 1/2 (o − t)^2 + λ Σ_i w_i^2   (sum over data points, then over weights)
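
The two error functions written out in numpy (a sketch; the variable names are illustrative):

import numpy as np

def per_target_error(w, I, t, lam):
    """Error for a single data point: 1/2 (o - t)^2 + lambda * sum_i w_i^2."""
    o = w @ I                                # linear output o = sum_i w_i I_i
    return 0.5 * (o - t) ** 2 + lam * np.sum(w ** 2)

def global_error(w, I_all, t_all, lam):
    """Squared error summed over all data points, plus the regularization term."""
    o = I_all @ w
    return 0.5 * np.sum((o - t_all) ** 2) + lam * np.sum(w ** 2)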

SMM - Stabilization matrix method. Do it yourself
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o = Σ_i w_i I_i]
Per-target error: E = 1/2 (o − t)^2 + λ Σ_i w_i^2
Derive the gradient descent update for the weights w_i

And now you

SMM - Stabilization matrix method
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o = Σ_i w_i I_i]
Per-target error: E = 1/2 (o − t)^2 + λ Σ_i w_i^2
∂E/∂w_i = (o − t) I_i + 2λ w_i

SMM - Stabilization matrix method
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o = Σ_i w_i I_i]
Weight update: Δw_i = −ε ∂E/∂w_i = −ε ((o − t) I_i + 2λ w_i)
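
The per-target update as a numpy sketch (the learning rate ε and the epoch count are illustrative choices, not values from the slides):

import numpy as np

def smm_gradient_descent(I_all, t_all, lam, eps=0.01, epochs=100):
    """Per-target (online) gradient descent with the update derived above."""
    w = np.zeros(I_all.shape[1])
    for _ in range(epochs):
        for I, t in zip(I_all, t_all):       # one update per data point
            o = w @ I                        # linear output
            w -= eps * ((o - t) * I + 2 * lam * w)   # dE/dw_i = (o - t) I_i + 2 lam w_i
    return w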

SMM - Stabilization matrix method. Monte Carlo
[Figure: linear network with inputs I_1, I_2, weights w_1, w_2, and output o]
Global error: E = Σ_data 1/2 (o − t)^2 + λ Σ_i w_i^2
Make a random change to the weights
Calculate the change in the "global" error
Update the weights if the MC move is accepted
Note the difference between MC and GD in the use of the "global" versus the "per target" error
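
A numpy sketch of the MC loop; greedy acceptance is assumed here, since the slides do not spell out the acceptance rule (a Metropolis criterion that occasionally accepts uphill moves would work as well):

import numpy as np

def smm_monte_carlo(I_all, t_all, lam, steps=10000, scale=0.05, seed=0):
    """Minimize the global error by keeping random weight changes that lower it."""
    rng = np.random.default_rng(seed)
    w = np.zeros(I_all.shape[1])

    def global_error(w):                     # all data points plus the regularization term
        return 0.5 * np.sum((I_all @ w - t_all) ** 2) + lam * np.sum(w ** 2)

    e = global_error(w)
    for _ in range(steps):
        w_new = w.copy()
        i = rng.integers(len(w))
        w_new[i] += scale * rng.normal()     # make a random change to the weights
        e_new = global_error(w_new)          # calculate the change in the "global" error
        if e_new < e:                        # accept the move only if the error drops
            w, e = w_new, e_new
    return w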

Training/evaluation procedure
Define method
Select data
Deal with data redundancy
– In method (sequence weighting)
– In data (Hobohm)
Deal with over-fitting, either
– in method (SMM regularization term), or
– in training (stop fitting on test set performance)
Evaluate method using cross-validation (a sketch follows below)
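
A numpy skeleton of that cross-validation step, reusing the analytic SMM fit from earlier (the tcsh script below does the same job with the smm and pep2score programs):

import numpy as np

def cross_validated_error(H, t, lam, k=5, seed=0):
    """k-fold cross-validation of the analytic SMM fit; returns the mean held-out MSE."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(t)), k)
    errs = []
    for n in range(k):
        test = folds[n]
        train = np.concatenate([folds[m] for m in range(k) if m != n])
        A = H[train].T @ H[train] + lam * np.eye(H.shape[1])
        w = np.linalg.solve(A, H[train].T @ t[train])        # train on k-1 folds
        errs.append(np.mean((H[test] @ w - t[test]) ** 2))   # evaluate on the held-out fold
    return float(np.mean(errs))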

A small doit tcsh script (/home/projects/mniel/ALGO/code/SMM)

#!/bin/tcsh -f
set DATADIR = /home/projects/mniel/ALGO/data/SMM/
foreach a ( A0101 A3002 )
    mkdir -p $a
    cd $a
    # Here you can type the lambdas to test
    foreach l ( )
        mkdir -p l.$l
        cd l.$l
        # Loop over the 5 cross-validation configurations
        foreach n ( 0 1 2 3 4 )
            # Do training
            smm -l $l ../f00$n > mat.$n
            # Do evaluation
            pep2score -mat mat.$n ../c00$n > c00$n.pred
        end
        # Do concatenated evaluation: Pearson correlation (xycorr) and mean squared error
        echo $a $l `cat c00?.pred | grep -v "#" | gawk '{print $2,$3}' | xycorr` \
            `cat c00?.pred | grep -v "#" | gawk '{print $2,$3}' | gawk 'BEGIN{n=0; e=0.0}{n++; e += ($1-$2)*($1-$2)}END{print e/n}'`
        cd ..
    end
    cd ..
end