CS723 - Probability and Stochastic Processes

Lecture No. 28

In Previous Lectures
- Finished transformation of random variables
- Started discussion of mean and variance of transformed random variables
- fUV(u,v) can be used to find expected values of functions of U, V, or both
- fXY(x,y) can be used to find expected values of functions of X, Y, or both
- fXY(x,y) can also be used directly to find expected values of functions of U, V, or both, if the transforming functions u = g(x,y) and v = h(x,y) are known

E[r(U,V)] using fXY(x,y)
The expected value of any function r(U,V) can be found using the joint PDF fXY(x,y) and the transforming functions involved:
E[r(U,V)] = ∫∫ r(g(x,y), h(x,y)) fXY(x,y) dx dy
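As a concrete illustration, here is a minimal Python sketch of this computation. The joint PDF, the transforming functions g and h, and the choice of r(u,v) are assumptions made for the example, not taken from the lecture.

```python
# Minimal sketch: E[r(U,V)] computed from fXY(x,y) and the transforms.
# Assumed example: X, Y independent Uniform(0,1), so fXY(x,y) = 1 on the
# unit square; u = g(x,y) = x + y, v = h(x,y) = x - y; r(u,v) = u*v.
import numpy as np
from scipy.integrate import dblquad

f_xy = lambda x, y: 1.0      # joint PDF of X and Y on [0,1] x [0,1]
g = lambda x, y: x + y       # u = g(x, y)
h = lambda x, y: x - y       # v = h(x, y)
r = lambda u, v: u * v       # function of U and V whose mean we want

# E[r(U,V)] = double integral of r(g(x,y), h(x,y)) * fXY(x,y) over x and y
val, _ = dblquad(lambda y, x: r(g(x, y), h(x, y)) * f_xy(x, y),
                 0.0, 1.0, lambda x: 0.0, lambda x: 1.0)
print(val)  # ~0, since E[(X+Y)(X-Y)] = E[X^2] - E[Y^2] = 0

# Monte Carlo cross-check without the integral
x, y = np.random.default_rng(0).random((2, 1_000_000))
print(np.mean(r(g(x, y), h(x, y))))  # also ~0
```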

Linear g(.,.) and h(.,.)
Consider linear transforming functions, u = g(x,y) = ax + by and v = h(x,y) = cx + dy, so that U = aX + bY and V = cX + dY. The correlation and covariance of U and V are:
E[UV] = ac·E[X^2] + (ad + bc)·E[XY] + bd·E[Y^2]
Cov(U,V) = ac·Var(X) + (ad + bc)·Cov(X,Y) + bd·Var(Y)
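As a quick sanity check of the covariance formula above, the sketch below compares a sample covariance against the closed form. The coefficients a, b, c, d and the joint distribution of (X, Y) are arbitrary illustrative choices.

```python
# Monte Carlo check: Cov(U,V) = ac*Var(X) + (ad+bc)*Cov(X,Y) + bd*Var(Y).
import numpy as np

rng = np.random.default_rng(0)
a, b, c, d = 2.0, -1.0, 0.5, 3.0           # arbitrary transform coefficients

K_xy = np.array([[2.0, 0.8],               # arbitrary covariance of (X, Y):
                 [0.8, 1.0]])              # Var(X)=2, Var(Y)=1, Cov(X,Y)=0.8
x, y = rng.multivariate_normal([0.0, 0.0], K_xy, size=1_000_000).T

u = a * x + b * y                          # U = aX + bY
v = c * x + d * y                          # V = cX + dY

sample_cov = np.cov(u, v)[0, 1]
closed_form = a*c*K_xy[0, 0] + (a*d + b*c)*K_xy[0, 1] + b*d*K_xy[1, 1]
print(sample_cov, closed_form)             # the two should agree closely
```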

Covariance Matrices
For uncorrelated X and Y, Cov(X,Y) = 0, so the covariance matrix is diagonal:
KXY = [ Var(X)    0
          0     Var(Y) ]
Then, with Cov(U,V) = ac·Var(X) + bd·Var(Y) from the previous slide, the covariance matrix for U and V is:
KUV = [ Var(U)     Cov(U,V)
        Cov(U,V)   Var(V)  ]

Covariance Matrix
Writing the transform in matrix form, [U V]^T = A·[X Y]^T with A = [ a b ; c d ], the covariance matrix for U and V can be written as:
KUV = E[ (W − E[W]) (W − E[W])^T ],  where W = [U V]^T
Is there a relationship between the covariance matrices KUV and KXY?

Covariance Matrix
Even when X and Y are uncorrelated, Cov(U,V) = ac·Var(X) + bd·Var(Y) is nonzero in general: uncorrelated X & Y can give correlated U & V.

KUV from KXY
For any linear transform [U V]^T = A·[X Y]^T, the two covariance matrices are related by:
KUV = A KXY A^T
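The following numpy sketch verifies this relation and, starting from a diagonal (uncorrelated) KXY, also illustrates the earlier point that uncorrelated X and Y can give correlated U and V. The entries of A and KXY are arbitrary.

```python
# Verify K_UV = A K_XY A^T, starting from uncorrelated X and Y.
import numpy as np

A = np.array([[2.0, -1.0],
              [0.5,  3.0]])         # [U V]^T = A [X Y]^T
K_xy = np.diag([2.0, 1.0])          # diagonal: X and Y uncorrelated

K_uv = A @ K_xy @ A.T
print(K_uv)                         # nonzero off-diagonal: U, V correlated

# Cross-check against the sample covariance of transformed draws
rng = np.random.default_rng(1)
xy = rng.multivariate_normal([0.0, 0.0], K_xy, size=1_000_000)
uv = xy @ A.T                       # each row [x y] mapped to [u v]
print(np.cov(uv.T))                 # close to K_uv
```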

KUV from KXY
Now consider correlated X and Y, so that KXY has nonzero off-diagonal entries. Is it possible for U and V to be uncorrelated when X and Y are correlated?
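The answer is yes, and it motivates the next slide: if the rows of A are chosen as eigenvectors of KXY, then KUV = A KXY A^T comes out diagonal, so U and V are uncorrelated. A minimal sketch, with an arbitrary correlated KXY:

```python
# Decorrelate X and Y: use the eigenvectors of K_XY as the rows of A.
import numpy as np

K_xy = np.array([[2.0, 0.8],
                 [0.8, 1.0]])            # correlated X and Y

eigvals, eigvecs = np.linalg.eigh(K_xy)  # eigh: K_XY is symmetric
A = eigvecs.T                            # rows of A are eigenvectors

K_uv = A @ K_xy @ A.T
print(np.round(K_uv, 12))                # diagonal matrix of eigenvalues:
                                         # Cov(U,V) = 0, so U, V uncorrelated
```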

Eigenvalues & Eigenvectors
If Ta = λa for a nonzero vector a, then λ is called an eigenvalue of T and a is called an eigenvector of T. A typical (diagonalizable) 2x2 matrix has two eigenvalues λ1, λ2 with two corresponding eigenvectors a1, a2, so that Ta1 = λ1a1 and Ta2 = λ2a2.
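A short numerical check of this definition, using an arbitrary 2x2 matrix T:

```python
# Compute eigenvalues/eigenvectors of a 2x2 matrix and verify T a = lambda a.
import numpy as np

T = np.array([[4.0, 1.0],
              [2.0, 3.0]])               # arbitrary example matrix

eigvals, eigvecs = np.linalg.eig(T)      # columns of eigvecs are a1, a2
for k in range(2):
    lam, a_k = eigvals[k], eigvecs[:, k]
    print(np.allclose(T @ a_k, lam * a_k))   # True: T a_k = lambda_k a_k
```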