Adjoint Orbits, Principal Components, and Neural Nets
1. Some facts about Lie groups and examples
2. Examples of adjoint orbits and a distance measure
3. Descent equations on adjoint orbits
4. Properties of the double bracket equation
5. Smoothed versions of the double bracket equation
6. The principal component extractor
7. The performance of subspace filters
8. Variations on a theme
Where We Are
9:…-10:45    Part 1. Examples and Mathematical Background
10:45-11:15  Coffee break
11:15-12:30  Part 2. Principal Components, Neural Nets, and Automata
12:30-…:30   Lunch
14:…-15:45   Part 3. Precise and Approximate Representation of Numbers
15:45-16:15  Coffee break
16:15-…:30   Part 4. Quantum Computation
The Adjoint Orbit Theory and Some Applications
1. Some facts about Lie groups and examples
2. Examples of adjoint orbits and a distance measure
3. Descent equations on adjoint orbits
4. Properties of the double bracket equation
5. Smoothed versions of the double bracket equation
6. Loops and deck transformations
Some Background
More Mathematics Background
A Little More Mathematics Background
Still More Mathematics Background
The Last (for Now) Mathematics Background
Getting a Feel for the Normal Metric
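A working definition of the normal metric, stated here for the example that recurs below (the orbit of a real symmetric matrix under orthogonal conjugation); the notation and sign conventions are mine, and the closing gradient statement is the standard one from the double bracket literature.

\[
M=\{\Theta H_0\Theta^{T}:\Theta\in SO(n)\},\qquad
T_{H}M=\{[\Omega,H]:\Omega^{T}=-\Omega\}.
\]
Writing \(\Omega=\Omega^{0}+\Omega^{\perp}\), where \(\Omega^{0}\) is the part of \(\Omega\) that commutes with \(H\), the normal metric is
\[
\big\langle[\Omega_{1},H],[\Omega_{2},H]\big\rangle_{\mathrm{normal}}
=\operatorname{tr}\!\big((\Omega_{1}^{\perp})^{T}\Omega_{2}^{\perp}\big),
\]
and with respect to this metric the steepest descent equation for \(\tfrac12\|H-N\|_{F}^{2}\) on \(M\) is the double bracket flow \(\dot H=[H,[H,N]]\).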
Steepest Descent on an Adjoint Orbit
A Descent Equation on an Adjoint Orbit
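The descent property itself is a one-line trace computation (assuming \(H\) and \(N\) real symmetric, \([A,B]=AB-BA\), and the Frobenius norm):

\[
\dot H=[H,[H,N]]
\;\Longrightarrow\;
\frac{d}{dt}\,\tfrac12\|H-N\|_{F}^{2}
=-\operatorname{tr}\!\big(N\dot H\big)
=-\operatorname{tr}\!\big(N[H,[H,N]]\big)
=-\big\|[H,N]\big\|_{F}^{2}\;\le\;0,
\]
using \(\operatorname{tr}(H\dot H)=\operatorname{tr}\!\big(H[H,[H,N]]\big)=0\), so that \(\|H\|_{F}\) is constant along the flow. The flow therefore moves \(H\) over its adjoint orbit so as to decrease its distance to \(N\), and it is at rest exactly where \([H,N]=0\). When \(N\) has distinct eigenvalues, these rest points are the finitely many matrices on the orbit that are diagonal in the eigenbasis of \(N\), which is the source of the multiple equilibria taken up next.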
A Descent Equation with Multiple Equilibria
A Descent Equation with Smoothing Added
The Double Bracket Flow for Analog Computation
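One way to get a feel for the double bracket flow as an analog computer is to integrate \(\dot H=[H,[H,N]]\) numerically and watch it sort a list. The sketch below (plain forward Euler; the step size, iteration count, and random initial basis are my choices, not from the slides) uses N = diag(1, ..., n), so the diagonal of H(t) approaches the input values in ascending order.

import numpy as np

def bracket(a, b):
    # Matrix commutator [a, b] = ab - ba.
    return a @ b - b @ a

def double_bracket_sort(values, step=1e-3, steps=100_000):
    # Integrate H' = [H, [H, N]] by forward Euler. With N = diag(1, ..., n),
    # tr(HN) increases along the flow, so H settles (approximately) at a
    # diagonal matrix whose entries are the input values in ascending order.
    n = len(values)
    N = np.diag(np.arange(1.0, n + 1.0))
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    # Hide the values inside a symmetric matrix via a random orthogonal basis.
    H = Q @ np.diag(np.asarray(values, dtype=float)) @ Q.T
    for _ in range(steps):
        H = H + step * bracket(H, bracket(H, N))
        H = (H + H.T) / 2.0  # re-symmetrize against round-off drift
    return np.diag(H)

if __name__ == "__main__":
    print(double_bracket_sort([3.0, -1.0, 7.0, 2.0]))
    # expected to be close to [-1, 2, 3, 7]

Forward Euler does not preserve the spectrum exactly, so the printed values are only close to the inputs; an isospectral or projected integrator would tighten this.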
Adaptive Subspace Filtering
Let u be a vector of inputs, and let a diagonal “editing” matrix select the energy levels that are desirable. An adaptive subspace filter with input u and output y can be realized by implementing the equations below.

Some Equations
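The slide's own equations are not reproduced here; as one concrete and deliberately generic realization, the sketch below adapts a k x n matrix W with Sanger's generalized Hebbian algorithm, so that its rows converge to the leading eigenvectors of the input covariance, and then applies a diagonal editing matrix E in that learned basis. The symbols W and E, the learning rate, and the synthetic data are all my choices, not notation from the slides.

import numpy as np

def subspace_filter_demo(samples, k, edit_diag, eta=0.005, epochs=20):
    # W (k x n) is adapted with Sanger's generalized Hebbian algorithm, so its
    # rows converge to the leading eigenvectors of E[u u^T], ordered by energy.
    # The diagonal "editing" matrix then keeps or discards individual
    # principal components; the filtered output is y = W^T diag(edit_diag) W u.
    n = samples.shape[1]
    rng = np.random.default_rng(0)
    W = 0.1 * rng.standard_normal((k, n))
    E = np.diag(np.asarray(edit_diag, dtype=float))
    for _ in range(epochs):
        for u in samples:
            c = W @ u                                      # component activations
            W += eta * (np.outer(c, u) - np.tril(np.outer(c, c)) @ W)
    def filt(u):
        return W.T @ E @ (W @ u)                           # edited reconstruction
    return W, filt

if __name__ == "__main__":
    # Synthetic input process with three dominant directions in R^5.
    rng = np.random.default_rng(1)
    basis, _ = np.linalg.qr(rng.standard_normal((5, 5)))
    scales = np.array([3.0, 2.0, 1.0, 0.1, 0.1])
    samples = (rng.standard_normal((5000, 5)) * scales) @ basis.T
    # Keep the first and third principal components, drop the second.
    W, filt = subspace_filter_demo(samples, k=3, edit_diag=[1.0, 0.0, 1.0])
    print(filt(samples[0]))

Here edit_diag = [1, 0, 1] passes the first and third principal components of the input and suppresses the second.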
Neural Nets as Flows on Grassmann Manifolds
Summary of Part 2
1. We have given some mathematical background necessary to work with flows on adjoint orbits and indicated some applications.
2. We have defined flows that will stabilize at invariant subspaces corresponding to the principal components of a vector process. These flows can be interpreted as flows that learn without a teacher.
3. We have argued that, in spite of its limitations, steepest descent is usually the first choice in algorithm design.
4. We have interpreted a basic neural network algorithm as a flow on a Grassmann manifold generated by a steepest descent tracking algorithm.
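Point 2 above can be checked numerically: the subspace learned by the earlier sketch, without a teacher, should agree with the principal subspace of the sample covariance. The helper below (the function name and the principal-angle test are mine) reports the largest principal angle between the two; a value near zero means the flow has stabilized at the invariant subspace spanned by the principal components.

import numpy as np

def principal_subspace_angle(W, samples):
    # Largest principal angle (radians) between the row space of the learned
    # matrix W (k x n) and the span of the top-k eigenvectors of the sample
    # covariance of `samples` (m x n).
    k = W.shape[0]
    cov = np.cov(samples, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    top = eigvecs[:, -k:]                       # top-k principal directions
    qw, _ = np.linalg.qr(W.T)                   # orthonormal basis of row(W)
    sigmas = np.linalg.svd(top.T @ qw, compute_uv=False)
    return np.arccos(np.clip(sigmas.min(), -1.0, 1.0))

With the W and samples from the earlier sketch, principal_subspace_angle(W, samples) should be close to zero once the adaptation has settled.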