Adjoint Orbits, Principal Components, and Neural Nets


Adjoint Orbits, Principal Components, and Neural Nets
1. Some facts about Lie groups and examples
2. Examples of adjoint orbits and a distance measure
3. Descent equations on adjoint orbits
4. Properties of the double bracket equation
5. Smoothed versions of the double bracket equation
6. The principal component extractor
7. The performance of subspace filters
8. Variations on a theme

Where We Are
9:00 - 10:45   Part 1. Examples and Mathematical Background
10:45 - 11:15  Coffee break
11:15 - 12:30  Part 2. Principal Components, Neural Nets, and Automata
12:30 - 14:30  Lunch
14:30 - 15:45  Part 3. Precise and Approximate Representation of Numbers
15:45 - 16:15  Coffee break
16:15 - 17:30  Part 4. Quantum Computation

The Adjoint Orbit Theory and Some Applications
1. Some facts about Lie groups and examples
2. Examples of adjoint orbits and a distance measure
3. Descent equations on adjoint orbits
4. Properties of the double bracket equation
5. Smoothed versions of the double bracket equation
6. Loops and deck transformations

Some Background

More Mathematics Background

A Little More Mathematics Background

Still More Mathematics Background

The Last (for Now) Mathematics Background

Getting a Feel for the Normal Metric

Steepest Descent on an Adjoint Orbit

A Descent Equation on an Adjoint Orbit

A Descent Equation with Multiple Equilibria

A Descent Equation with Smoothing Added

The Double Bracket Flow for Analog Computation
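The double bracket equation referenced throughout this part, dH/dt = [H, [H, N]], can be simulated directly. A minimal numerical sketch (forward Euler; the matrix sizes, step size, and iteration count are illustrative choices, not values from the talk):

```python
import numpy as np

def bracket(A, B):
    """Matrix commutator [A, B] = AB - BA."""
    return A @ B - B @ A

def double_bracket_flow(H0, N, dt=1e-3, steps=100_000):
    """Integrate dH/dt = [H, [H, N]] by forward Euler.

    For symmetric H0 and diagonal N with distinct entries, the flow
    is isospectral (it preserves the eigenvalues of H) and converges
    to a diagonal matrix whose entries are ordered like those of N --
    the flow diagonalizes H, i.e. it sorts by "analog computation".
    """
    H = H0.copy()
    for _ in range(steps):
        H = H + dt * bracket(H, bracket(H, N))
    return H
```

With N = diag(4, 3, 2, 1), the limit carries the eigenvalues of H0 on its diagonal in decreasing order, illustrating how the descent equation computes principal components or performs sorting.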

Adaptive Subspace Filtering

Some Equations
Let u be a vector of inputs, and let Λ be a diagonal “editing” matrix that selects the energy levels that are desirable. An adaptive subspace filter with input u and output y can be realized by implementing the equations.
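The slide's equations did not survive transcription, so the following is only a plausible sketch, assuming the Oja-type subspace rule of the "Dynamical Systems That Learn Subspaces" reference cited below; the names `subspace_filter_step` and `Lam` (the diagonal editing matrix) are illustrative, not the slide's notation:

```python
import numpy as np

def subspace_filter_step(W, Lam, u, eta=0.002):
    """One step of an adaptive subspace filter (a sketch).

    W   : n x k matrix whose columns estimate an orthonormal basis for
          the principal subspace of the input process (adapted online).
    Lam : k x k diagonal "editing" matrix selecting which energy
          levels are passed through to the output.
    u   : current input vector.

    Output: y = W Lam W^T u (project onto the learned subspace, edit,
    reconstruct).  Adaptation: W += eta (I - W W^T) u u^T W, Oja's
    subspace rule, which drives span(W) toward the dominant invariant
    subspace of E[u u^T] -- learning without a teacher.
    """
    y = W @ (Lam @ (W.T @ u))
    v = W.T @ u                        # coordinates of u in the subspace
    W = W + eta * (np.outer(u, v) - W @ np.outer(v, v))
    return y, W
```

Run on a stream of inputs whose energy concentrates in a fixed subspace, the columns of W converge to an orthonormal basis of that subspace, and Λ then edits which components reach the output.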

Neural Nets as Flows on Grassmann Manifolds

Summary of Part 2
1. We have given some mathematical background necessary to work with flows on adjoint orbits and indicated some applications.
2. We have defined flows that will stabilize at invariant subspaces corresponding to the principal components of a vector process. These flows can be interpreted as flows that learn without a teacher.
3. We have argued that, in spite of its limitations, steepest descent is usually the first choice in algorithm design.
4. We have interpreted a basic neural network algorithm as a flow on a Grassmann manifold generated by a steepest descent tracking algorithm.

A Few References
M. W. Berry et al., “Matrices, Vector Spaces, and Information Retrieval,” SIAM Review, vol. 41, no. 2.
R. W. Brockett, “Dynamical Systems That Learn Subspaces,” in Mathematical System Theory: The Influence of R. E. Kalman (A. C. Antoulas, ed.), Springer-Verlag, Berlin.
R. W. Brockett, “An Estimation Theoretic Basis for the Design of Sorting and Classification Networks,” in Neural Networks (R. Mammone and Y. Zeevi, eds.), Academic Press, 1991.