J. Elder, PSYC 6256 Principles of Neural Coding. ASSIGNMENT 2: REVERSE CORRELATION.



Slide 2: Outline
- The assignment requires you to:
  - Write code to produce graphs
  - Make observations from the graphs
  - Draw conclusions

Slide 3: Coding
- Coding is in MATLAB.
- I will provide you with templates that include:
  - A list of MATLAB functions to use
  - Comments describing the flow of operations

Slide 4: Some Coding Tips
- It is important that you know how to use the debugger.
- Use the MATLAB Help facility.
- You should generally never have a loop (or nested loop) that involves more than a few hundred iterations; vectorize instead.
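The vectorization tip can be illustrated with a short sketch. This is Python/NumPy for illustration only (the assignment itself is in MATLAB, where the same elementwise and matrix operations apply); the array name and sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
stim = rng.standard_normal(100_000)   # hypothetical stimulus samples

# Loop version: one multiply-accumulate per iteration (slow in MATLAB and Python alike)
total = 0.0
for s in stim:
    total += s * s

# Vectorized version: a single library call does the same work
total_vec = float(np.dot(stim, stim))
```

In MATLAB the equivalent replacement is `sum(stim.^2)` in place of the `for` loop.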

Slide 5: Dataset
- We will be using a portion of the Neural Prediction Challenge dataset:
  - Responses of V1 neurons to natural vision movies in awake, behaving macaque
  - Both neural responses and visual stimuli are provided
- Available at  But you can download the files you need from the course website.
- We will be analyzing a particular neuron (R0221B).

Slide 6: Submission Details
- You will submit a short lab report on your experiments.
- For each experiment, the report will include:
  - The code you developed
  - The graphs you produced
  - The observations you made
  - The conclusions you drew

Slide 7: Graphs
- The graphs you produce should be as similar as possible to mine.
- Make sure everything is intelligible!

Slide 8: Due Date
- The report is due Wed Mar 23.

Slide 9: Reverse Correlation
- Raw stimulus-response cross-correlation: h_raw(x, tau) = (1/T) sum_t r(t) s(x, t - tau)
- Now represent the kernel h as an m x T matrix, where m is the number of spatial (pixel) dimensions and T is the number of time lags.
- Correction for temporal stimulus bias: deconvolve h_raw by the temporal autocorrelation of the stimulus.
- Correction for spatial stimulus bias: h = Q_ss^-1 q_rs, where Q_ss is the full stimulus autocorrelation matrix and q_rs is the stimulus-response cross-correlation.
- But this doesn't work directly, because there are too many coefficients in Q_ss to estimate, and too little power in the high frequencies of the stimulus to estimate them reliably.
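The raw cross-correlation and the temporal-bias correction can be sketched as follows. This is Python/NumPy for illustration only (the assignment uses MATLAB); the toy sizes, white-noise stimulus, and variable names are all invented:

```python
import numpy as np

rng = np.random.default_rng(1)
m, T, n_lags = 16, 5000, 10           # pixels, frames, kernel lags (toy sizes)
S = rng.standard_normal((m, T))       # stimulus: one column per movie frame
r = rng.standard_normal(T)            # neural response per frame

# Raw stimulus-response cross-correlation:
# h_raw[x, tau] = mean over t of r(t) * s(x, t - tau)
h_raw = np.zeros((m, n_lags))
for tau in range(n_lags):
    h_raw[:, tau] = S[:, :T - tau] @ r[tau:] / (T - tau)

# Temporal autocorrelation of the stimulus (averaged over pixels)
q_tt = np.array([np.mean(S[:, :T - tau] * S[:, tau:]) for tau in range(n_lags)])

# Correct the temporal bias: divide out the (Toeplitz) autocorrelation
lags = np.arange(n_lags)
Q_t = q_tt[np.abs(lags[:, None] - lags[None, :])]   # Toeplitz matrix of q_tt
h_temporal = np.linalg.solve(Q_t, h_raw.T).T        # one solve shared by all pixels
```

For a white-noise stimulus Q_t is close to the identity, so the correction changes little; for natural movies, with strong temporal correlations, it matters.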

Slide 10: Solution: Regularized Inverse
- Use the singular value decomposition (SVD): Q_ss = U S V^T
- where U and V are orthonormal rotation matrices and S is a diagonal scaling matrix carrying the eigenvalues of Q_ss.
- The eigenvalues represent the power of the autocorrelation in each of the underlying principal directions (eigenvectors).

Slide 11: Regularized Inverse
- Once the SVD is computed, taking the inverse is easy: Q_ss^-1 = V S^-1 U^T.
- However, this inverse is unreliable, because eigenvalues in S near 0 are dominated by noise, and inverting them produces large, noisy values in S^-1.
- To avoid this, keep only the largest eigenvalues of S, and set the remaining diagonal elements of S^-1 to 0.
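A minimal sketch of this truncated pseudo-inverse, in Python/NumPy for illustration (the assignment uses MATLAB, where `svd` plays the same role); the cutoff `k` is a free parameter you would tune, and the example matrix is invented:

```python
import numpy as np

def regularized_inverse(Q, k):
    """Pseudo-inverse of a symmetric autocorrelation matrix Q, keeping
    only the k largest eigenvalues and zeroing the rest of S^-1."""
    U, s, Vt = np.linalg.svd(Q)          # Q = U @ diag(s) @ Vt, s in descending order
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]              # invert only the reliable eigenvalues
    return Vt.T @ np.diag(s_inv) @ U.T

# Toy example: a well-conditioned 3x3 Q is inverted exactly when k = 3;
# smaller k gives a rank-k regularized approximation.
Q = np.array([[2.0, 0.5, 0.1],
              [0.5, 2.0, 0.5],
              [0.1, 0.5, 2.0]])
Q_inv = regularized_inverse(Q, k=3)
```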

Slide 12: Firing Rates [figure: histogram and kernel density (KDE) estimates]
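The two estimators compared in this figure can be sketched as follows, in Python/NumPy for illustration (the assignment uses MATLAB); the spike times, bandwidth, and bin width are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
spike_times = np.sort(rng.uniform(0.0, 10.0, size=200))  # 200 invented spikes over 10 s
t = np.linspace(0.0, 10.0, 1001)                          # evaluation grid
sigma = 0.1                                               # Gaussian kernel bandwidth (s)

# Histogram estimate: spike count per bin / bin width -> spikes per second
bin_width = 0.25
counts, edges = np.histogram(spike_times, bins=int(10.0 / bin_width), range=(0.0, 10.0))
hist_rate = counts / bin_width

# KDE estimate: one normalized Gaussian bump per spike, summed over spikes
bumps = np.exp(-0.5 * ((t[:, None] - spike_times[None, :]) / sigma) ** 2)
kde_rate = bumps.sum(axis=1) / (sigma * np.sqrt(2.0 * np.pi))
```

The KDE trades the histogram's bin-edge artifacts for a smoothness controlled by the bandwidth; both integrate (approximately) to the total spike count.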

Slide 13: Stimulus-Response Cross-Correlation [figure]

Slide 14: First-Order Temporal Autocorrelation of Stimulus [figure]

Slide 15: STRF Corrected for Temporal Bias of Stimulus [figure]

Slide 16: Unregularized Correction for Spatial Bias of Stimulus [figure]

Slide 17: Regularized Correction for Spatial Bias of Stimulus [figure]