GUM*02 tutorial session, UTSA, San Antonio, Texas: Parameter Searching in Neural Models. Mike Vanier, Caltech.

The problem: you want to build a model of a neuron. You have a body of data, and you know a lot about the neuron's morphology, physiology, and ion channel kinetics, but you don't know everything!

Typical preliminary data set (anatomy): a rough idea of the morphology, or a detailed reconstruction.

Typical preliminary data set (physiology): current clamp, synaptic potentials, potentiation, modulators.

Typical preliminary data set (ion channels): identities of the major types, kinetics, modulation.

Missing data? Ion channels: identities of ALL channels, densities (µS/µm²), detailed kinetics. Anatomy: detailed reconstructions? variability? Physiology: voltage clamp, neuromodulators, etc.?

Harsh reality: most experiments are not done with models in mind; more than half of the model parameters are loosely constrained or unconstrained; experiments to collect model parameters are not very sexy.

A different approach: collect the data set the model should match; collect plausible parameters (those known to be correct, plus educated guesses); build the model; test model performance; modify parameters until you get a match.

How to modify parameters? Manually: with 10 parameters and 5 values each, there are roughly 10^7 possible simulations; at 1 sim/minute, that is about 19 years! You can use previous results to guide the searching, but what about non-linear interactions? Tedious!!!
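
As a quick sanity check on that estimate, a minimal sketch of the arithmetic (assuming 10 free parameters, 5 candidate values each, and one minute per simulation):

```python
# Exhaustive search cost, assuming 10 parameters with 5 candidate values each
# and one simulation per minute.
n_sims = 5 ** 10                       # 9,765,625 combinations
minutes_per_year = 60 * 24 * 365       # 525,600
print(n_sims / minutes_per_year)       # ~18.6 years
```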

How to modify parameters? Automatically: set ranges for each parameter, define an update algorithm, start the parameter search, go home! Check the results in a day, a week, ...
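
A minimal sketch of such an automated loop in Python (this is plain stochastic search, not the GENESIS API; run_model and match are hypothetical stand-ins for a simulator call and a match function):

```python
import random

# Hypothetical parameter ranges (min, max) for two free parameters.
ranges = {"gmax_Na": (100.0, 2000.0), "gmax_Kdr": (50.0, 1000.0)}

def run_model(params):
    """Stand-in for running the simulator with these parameters."""
    return params                           # placeholder output

def match(output):
    """Stand-in match function: 0 = perfect fit, larger = worse."""
    return abs(output["gmax_Na"] - 800.0)   # placeholder target

best_params, best_score = None, float("inf")
for _ in range(1000):                       # start the search and "go home"
    params = {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
    score = match(run_model(params))
    if score < best_score:
        best_params, best_score = params, score
print(best_params, best_score)
```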

Match function: we need to quantify goodness of fit, reducing the entire model's performance to one number (0 = perfect match). Things to match: spike rates, spike times, voltage waveform.

Simple match function. Inputs: different current levels, e.g. 0.05, 0.1, 0.15, 0.2, 0.25, 0.3 nA. Outputs: spike times.
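
A minimal Python sketch of what such a spike-time match might look like (not the actual function from the tutorial; the penalty term and example spike times are illustrative assumptions):

```python
def spike_time_match(model_spikes, data_spikes, miss_penalty=0.1):
    """Sketch: mean absolute spike-timing error (s) across current levels,
    plus a penalty for missing or extra spikes. 0 = perfect match."""
    total, n = 0.0, 0
    for level, d in data_spikes.items():          # keys: injected current in nA
        m = model_spikes.get(level, [])
        for tm, td in zip(m, d):                  # pair spikes in order
            total += abs(tm - td)
            n += 1
        total += miss_penalty * abs(len(m) - len(d))
    return total / max(n, 1)

# Example: two current levels, model spikes slightly late.
data  = {0.1: [0.020, 0.060, 0.110], 0.2: [0.015, 0.040, 0.070, 0.105]}
model = {0.1: [0.022, 0.063, 0.115], 0.2: [0.016, 0.043, 0.074, 0.110]}
print(spike_time_match(model, data))
```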

Waveform match function. Inputs: hyperpolarized current levels, e.g. -0.1 nA. Outputs: Vm(t).
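
A corresponding sketch for the waveform match, using a root-mean-square difference between the model and data Vm(t) traces (the traces here are made-up numbers):

```python
import math

def waveform_match(model_vm, data_vm):
    """Sketch: RMS difference between model and data membrane-potential
    traces sampled at the same time points. 0 = perfect match."""
    assert len(model_vm) == len(data_vm)
    return math.sqrt(sum((m - d) ** 2 for m, d in zip(model_vm, data_vm))
                     / len(data_vm))

# Example: short Vm traces (mV) from a hyperpolarizing current step.
data  = [-65.0, -66.2, -67.8, -69.1, -70.0]
model = [-65.0, -66.0, -67.5, -69.0, -70.2]
print(waveform_match(model, data))   # ~0.19 mV
```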

Other match functions: some data might be more important to match than the rest (adaptation, bursting behavior); incorporate these into more complex match functions.

Weight early spikes more. w_ij: weighting parameters; set w_i0 < w_i1 < w_i2 < ...
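
One plausible reading of that slide, as a sketch: each spike's timing error is divided by its weight w_ij, so the smaller weights on the early spikes make early errors count more. The weight values here are assumptions, not from the tutorial.

```python
def weighted_spike_error(model_spikes, data_spikes, weights):
    """Sketch: spike j's timing error divided by w_j, so spikes with
    smaller weights (the early ones) contribute more to the error."""
    return sum(abs(tm - td) / w
               for tm, td, w in zip(model_spikes, data_spikes, weights))

# Increasing weights w_0 < w_1 < w_2 < ... weight the early spikes more.
weights = [1.0, 2.0, 4.0, 8.0]
print(weighted_spike_error([0.021, 0.052, 0.093, 0.150],
                           [0.020, 0.050, 0.090, 0.140], weights))
```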

Harder match functions: bursting (Purkinje cell, pyramidal cell); transitions between complex behaviors (regular spiking vs. bursting).

The data set: you need an exceptionally clean data set (if there is noise in the data set, the model will try to replicate it!) and a wide range of inputs.

Typical data set for a neuron model: current clamp over a wide range, hyperpolarized (passive) and depolarized (spiking).

The process (1): build the model; take anatomy and channel parameters from the literature; match the passive data (hyperpolarized inputs).

The process (2): create the match function (waveform match for hyperpolarized inputs, spike match for depolarized inputs); run a couple of simulations; check that the results aren't ridiculous; get into the ballpark of the right parameters.
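
A sketch of how the two pieces might be folded into a single number, reusing the hypothetical waveform_match and spike_time_match sketches above; the relative weights are assumptions:

```python
def combined_match(model, data, w_wave=1.0, w_spike=1.0):
    """Sketch: weighted sum of the waveform match (hyperpolarized traces)
    and the spike-time match (depolarized traces). 0 = perfect match."""
    wave  = waveform_match(model["vm_hyper"], data["vm_hyper"])
    spike = spike_time_match(model["spikes"], data["spikes"])
    return w_wave * wave + w_spike * spike
```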

The process (3): choose the parameters to vary: channel densities, channel kinetics (m_inf(V) and tau(V) curves), passive parameters; choose the parameter ranges.

The process (4): select a parameter search method (conjugate gradient, genetic algorithm, simulated annealing); set the meta-parameters for that method.

The process (5): run the parameter search; periodically check the best results; marvel at your own ingenuity; curse at your stupid computer; figure out why it did or didn't work.

Results (motivation)

Parameter search methods: different methods have different attributes (local or global optima? efficiency?), which depend on the nature of the parameter space: is it smooth or ragged?

The shapes of space: smooth vs. ragged.

GENESIS parameter search methods: conjugate gradient descent (CG), genetic algorithm (GA), simulated annealing (SA), brute force (BF), stochastic search (SS).

Conjugate gradient (CG): “The conjugate gradient method is based on the idea that the convergence to the solution could be accelerated if we minimize Q over the hyperplane that contains all previous search directions, instead of minimizing Q over just the line that points down gradient. To determine x_{i+1} we minimize Q over x_0 + span(p_0, p_1, p_2, ..., p_i), where the p_k represent previous search directions.”

No, really: take a point in parameter space; find the line of steepest descent (the gradient); minimize along that line; repeat, sort of, along conjugate directions only, i.e. ignoring the subspace spanned by the previous lines.
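
In practice a library routine handles the bookkeeping; here is a minimal sketch using SciPy's conjugate-gradient minimizer on a toy quadratic (the objective is a stand-in for a smooth match function, not a neuron model):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    """Toy smooth 'match function': a quadratic bowl with one minimum."""
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

result = minimize(objective, x0=np.array([0.0, 0.0]), method="CG")
print(result.x)   # converges to the single minimum near (1, -2)
```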

CG method, good and bad. For smooth parameter spaces: guaranteed to find a local minimum. For ragged parameter spaces: guaranteed to find a local minimum ;-) which is not what we want...

Genetic algorithm: pick a bunch of random parameter sets (a “generation”); evaluate each parameter set; create a new generation by copying the most fit sets, mutating randomly, and crossing over; repeat until you get acceptable results.
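
A compact, generic sketch of that loop in Python (not the GENESIS paramtableGA object); the fitness function, population size, and mutation settings are illustrative assumptions:

```python
import random

def evaluate(params):
    """Stand-in fitness: squared distance from a hypothetical target set."""
    target = [1.0, -2.0, 0.5]
    return sum((p - t) ** 2 for p, t in zip(params, target))

def genetic_search(pop_size=40, n_params=3, generations=100,
                   mutation_rate=0.1, n_elite=4):
    # Random initial generation.
    pop = [[random.uniform(-5, 5) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)                     # evaluate each parameter set
        new_pop = pop[:n_elite]                    # copy the most fit sets
        while len(new_pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)  # fitter-half parents
            cut = random.randrange(1, n_params)           # single-point crossover
            child = a[:cut] + b[cut:]
            child = [g + random.gauss(0, 0.2)
                     if random.random() < mutation_rate else g
                     for g in child]                      # random mutation
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=evaluate)

print(genetic_search())
```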

Genetic algorithm (2): amazingly, this often works. It is a global optimization method with many variations and many meta-parameters (mutation rate; crossover type, single or double, and rate), but no guarantees.

Simulated annealing: make noise work for you! A noisy version of the “simplex algorithm”: evaluate the points on the simplex, add noise to the results based on a “temperature”, move the simplex through space accordingly, and gradually decrease the temperature to zero.
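
For illustration, a sketch of plain Metropolis-style simulated annealing on a one-dimensional ragged objective; note this is simpler than the noisy-simplex variant the slide describes, and the objective, proposal step, and cooling schedule are assumptions:

```python
import math
import random

def objective(x):
    """Stand-in for a ragged match function (many local minima)."""
    return x * x + 3.0 * math.sin(5.0 * x)

def simulated_annealing(x=4.0, temp=5.0, cooling=0.995, steps=5000):
    f = objective(x)
    best_x, best_f = x, f
    for _ in range(steps):
        x_new = x + random.gauss(0, 0.5)         # propose a nearby point
        f_new = objective(x_new)
        # Accept downhill moves always; uphill moves with prob. exp(-dF/T).
        if f_new < f or random.random() < math.exp(-(f_new - f) / temp):
            x, f = x_new, f_new
            if f < best_f:
                best_x, best_f = x, f
        temp *= cooling                          # gradually lower the temperature
    return best_x, best_f

print(simulated_annealing())
```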

Simulated annealing (2), some nice properties: guaranteed to find the global optimum, but it may take forever ;-) and when the temperature reaches 0 it finds a local minimum. The open question: how fast should the temperature decrease?

Comparing methods (1)

Comparing methods (2)

Comparing methods (3)

Recommendations. Passive models: SA, CG. Small active models: SA. Large active models: SA, GA. Network models: usually SOL.

GENESIS tutorial (1). Objects: paramtableGA, paramtableSA, paramtableCG. Task: parameterize a simple one-compartment neuron with Na, K_dr, and K_M channels.

GENESIS tutorial (2). Parameters: g_max of Na, K_dr, and K_M; K_M tau(V) scaling; K_M m_inf(V) midpoint.
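
One hypothetical way to write down those search ranges before handing them to a search method; the names, units, and numeric ranges below are illustrative assumptions, not values from the tutorial scripts:

```python
# Hypothetical search ranges for the tutorial's free parameters.
param_ranges = {
    "gmax_Na":       (100.0, 2000.0),   # S/m^2, Na channel density
    "gmax_Kdr":      (50.0, 1000.0),    # S/m^2, K_dr channel density
    "gmax_KM":       (1.0, 100.0),      # S/m^2, K_M channel density
    "KM_tau_scale":  (0.5, 2.0),        # multiplicative scaling of K_M tau(V)
    "KM_minf_shift": (-10.0, 10.0),     # mV shift of the K_M m_inf(V) midpoint
}
```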

Conclusions: parameter search algorithms are useful, but there are pitfalls and judgment calls; the modeler must help the computer; failure is not always bad! This will continue to be an active research area.