Canonical form for matrices and transformations.

7.3. Jordan form: Canonical form for matrices and transformations.

Setup. Let N be a nilpotent operator on V with dim V = n. Apply the cyclic decomposition: V = Z(α_1; N) ⊕ … ⊕ Z(α_r; N), where the N-annihilators p_1, …, p_r satisfy p_{i+1} | p_i for i = 1, …, r-1. The minimal polynomial of N is x^k with k ≤ n: since N^s = 0 for some s, x^s lies in Ann(N), so minpoly N divides x^s and hence minpoly N = x^k for some k; moreover minpoly N divides charpoly N, which has degree n, so k ≤ n. Each p_i is therefore x^{k_i}, with k = k_1 ≥ k_2 ≥ … ≥ k_r ≥ 1.
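To make the exponent k concrete, here is a small sketch (assuming NumPy; the matrix and the helper name are my own illustration, not from the notes) that finds the smallest k with N^k = 0 for a nilpotent matrix, confirming minpoly N = x^k with k ≤ n.

```python
import numpy as np

def nilpotency_index(N, tol=1e-12):
    """Return the smallest k with N^k = 0 (assumes N really is nilpotent)."""
    n = N.shape[0]
    P = np.eye(n)
    for k in range(1, n + 1):
        P = P @ N
        if np.all(np.abs(P) < tol):
            return k
    raise ValueError("matrix does not appear to be nilpotent")

# Example: a 4x4 nilpotent matrix with cyclic blocks of sizes 3 and 1.
N = np.zeros((4, 4))
N[1, 0] = N[2, 1] = 1.0      # one 3x3 shift block, plus a 1x1 zero block
print(nilpotency_index(N))   # 3, so minpoly N = x^3 and k <= n = 4
```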

We now have a rational form for N. The companion matrix of x^{k_i} is the k_i × k_i matrix with 1's one below the diagonal and 0's elsewhere. The rational form of N is block diagonal in these companion matrices: it has 1's one below the diagonal, with the string of 1's broken by a gap after the first k_1 - 1 ones, then after the next k_2 - 1 ones, and so on.
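The description above can be turned into a small constructor. This is a hypothetical sketch (NumPy assumed, function names my own) that builds the rational form of a nilpotent operator from the exponents k_1 ≥ … ≥ k_r.

```python
import numpy as np

def companion_of_x_power(k):
    """Companion matrix of x^k: a k x k matrix with 1's one below the diagonal."""
    C = np.zeros((k, k))
    for i in range(1, k):
        C[i, i - 1] = 1.0
    return C

def nilpotent_rational_form(ks):
    """Block-diagonal sum of the companion matrices of x^{k_i}."""
    n = sum(ks)
    R = np.zeros((n, n))
    offset = 0
    for k in ks:
        R[offset:offset + k, offset:offset + k] = companion_of_x_power(k)
        offset += k
    return R

print(nilpotent_rational_form([3, 2]))
# The 1's sit one below the diagonal, with a gap at the boundary after the k_1 = 3 block.
```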

N is a direct sum of elementary nilpotent matrices. Up to similarity, a nilpotent n×n matrix corresponds exactly to a choice of positive integers r, k_1, …, k_r with k_1 + … + k_r = n and k_i ≥ k_{i+1}. (Proof omitted.) Moreover r = nullity N; in fact null N = span{N^{k_i - 1}α_i : i = 1, …, r}, where the α_i are the cyclic vectors. Proof: any α in V can be written α = f_1(N)α_1 + … + f_r(N)α_r with f_i ∈ F[x] and deg f_i < k_i.

If Nα = 0, then N(f_i(N)α_i) = 0 for each i, by the N-invariance of the direct summands; that is, (x f_i)(N)α_i = 0. So x f_i lies in the N-annihilator of α_i, hence x f_i is divisible by x^{k_i}. Since deg f_i < k_i, we get f_i = c_i x^{k_i - 1} with c_i ∈ F, and α = c_1 N^{k_1 - 1}α_1 + … + c_r N^{k_r - 1}α_r. Therefore {N^{k_1 - 1}α_1, …, N^{k_r - 1}α_r} spans null N (and is independent, one vector from each summand), so it is a basis of null N and dim null N = r.
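As a numerical sanity check on dim null N = r, the following sketch (NumPy assumed; the block sizes are a made-up example) builds the block shift matrix directly and compares its nullity with the number of blocks.

```python
import numpy as np

# Nilpotent rational form with block sizes k = (3, 2, 1), so r = 3 and n = 6.
ks = [3, 2, 1]
n = sum(ks)
N = np.zeros((n, n))
offset = 0
for k in ks:
    for i in range(1, k):
        N[offset + i, offset + i - 1] = 1.0   # 1's one below the diagonal inside each block
    offset += k

nullity = n - np.linalg.matrix_rank(N)
print(nullity == len(ks))   # True: dim null N = r, the number of cyclic blocks
```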

Jordan form construction. Let T be a linear operator whose characteristic polynomial factors completely: charpoly T = (x - c_1)^{d_1} … (x - c_k)^{d_k} (for example, this always holds when F = ℂ). Then minpoly T = (x - c_1)^{r_1} … (x - c_k)^{r_k}. Let W_i = null (T - c_i I)^{r_i}; by the primary decomposition theorem, V = W_1 ⊕ … ⊕ W_k. Let T_i = T|_{W_i} : W_i → W_i. Then N_i = T_i - c_i I is nilpotent on W_i. Choose a basis B_i of W_i such that the matrix of N_i is in rational form. Then T_i = N_i + c_i I.
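For a concrete instance of this construction, here is a sketch using SymPy's jordan_form (SymPy assumed; the 4×4 matrix is a standard illustrative example, not one from these notes). Note that SymPy places the 1's on the superdiagonal, whereas these notes put them one below the diagonal; the two conventions differ only by reversing the order of each block's basis vectors.

```python
from sympy import Matrix

# Example matrix with characteristic values 1, 2, 4, 4 and a single 2x2 block for 4.
A = Matrix([[ 5,  4,  2,  1],
            [ 0,  1, -1, -1],
            [-1, -1,  3,  0],
            [ 1,  1, -1,  2]])

P, J = A.jordan_form()   # A = P * J * P**(-1)
print(J)                 # block diagonal: [1], [2], and a 2x2 Jordan block for 4
```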

In the basis B_i, [T_i]_{B_i} is the rational form of N_i plus c_i I: c_i down the diagonal and 1's one below the diagonal, with the string of 1's broken by gaps at the block boundaries. If there are no gaps, the matrix is called the elementary Jordan matrix with characteristic value c_i.
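A minimal sketch (NumPy assumed, constructor name my own) of the elementary Jordan matrix with characteristic value c, following the convention used here of 1's one below the diagonal:

```python
import numpy as np

def elementary_jordan(c, size):
    """Elementary Jordan matrix: c on the diagonal, an unbroken string of 1's below it."""
    J = c * np.eye(size)
    for i in range(1, size):
        J[i, i - 1] = 1.0
    return J

print(elementary_jordan(2.0, 3))
# [[2. 0. 0.]
#  [1. 2. 0.]
#  [0. 1. 2.]]
```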

A = Jordan form of T: the block-diagonal matrix with blocks A_1, …, A_k, where each A_i is a direct sum of elementary Jordan matrices with characteristic value c_i, arranged so that their sizes decrease.

Uniqueness of the Jordan form. Suppose T is represented by a Jordan matrix, so V = W_1 ⊕ … ⊕ W_k with A_i a d_i × d_i matrix acting on W_i. Then charpoly T = (x - c_1)^{d_1} … (x - c_k)^{d_k}, so c_1, …, c_k and d_1, …, d_k are determined uniquely up to order. Clearly W_i = null (T - c_i I)^{d_i}, and A_i is uniquely determined as c_i I plus the rational form of (T - c_i I)|_{W_i}. (Recall that the rational form is uniquely determined.)

Properties of the Jordan matrix:
(1) c_1, …, c_k are the distinct characteristic values; c_i is repeated d_i = dim W_i times, its multiplicity in the characteristic polynomial of A.
(2) A_i is a direct sum of n_i elementary Jordan matrices, with n_i ≤ d_i.
(3) The first (largest) block J_1^{(i)} of A_i is an r_i × r_i matrix, where r_i is the multiplicity of c_i in the minimal polynomial of T. (So the minimal polynomial can be read off from the Jordan form.)
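Properties (1) and (3) mean that both the characteristic and the minimal polynomial can be read directly off the block data. The following sketch (SymPy assumed; the block-size dictionary is a made-up example) does exactly that.

```python
from sympy import symbols

x = symbols('x')

# Hypothetical example: characteristic value -> sizes of its elementary Jordan blocks.
blocks = {4: [2, 1], 2: [1]}

char_poly, min_poly = 1, 1
for c, sizes in blocks.items():
    char_poly *= (x - c)**sum(sizes)   # property (1): d_i = sum of the block sizes
    min_poly  *= (x - c)**max(sizes)   # property (3): r_i = size of the largest block

print(char_poly.expand())   # (x - 4)**3 * (x - 2), expanded
print(min_poly.expand())    # (x - 4)**2 * (x - 2), expanded
```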

Proof of (3): We show that (T - c_1 I)^{r_1} … (T - c_k I)^{r_k} = 0 when r_i is the size of the largest elementary Jordan block J_1^{(i)} of A_i. Indeed, T|_{W_i} = A_i and (A_i - c_i I)^{r_i} = 0 on W_i. Moreover (x - c_1)^{r_1} … (x - c_k)^{r_k} has the least possible degree among such annihilating polynomials: looking at the largest elementary Jordan matrix J_1^{(i)}, we have (J_1^{(i)} - c_i I)^q ≠ 0 for q < r_i. This completes the proof.
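To illustrate the annihilation claim, here is a quick symbolic check (SymPy assumed) on the same 4×4 example used after the construction slide, whose largest block sizes are r = 1, 1, 2 for c = 1, 2, 4.

```python
from sympy import Matrix, eye

A = Matrix([[ 5,  4,  2,  1],
            [ 0,  1, -1, -1],
            [-1, -1,  3,  0],
            [ 1,  1, -1,  2]])
Id = eye(4)

# (T - c_1 I)^{r_1} (T - c_2 I)^{r_2} (T - c_3 I)^{r_3} with r = (1, 1, 2):
product = (A - 1*Id) * (A - 2*Id) * (A - 4*Id)**2
print(product)   # the 4x4 zero matrix, so this polynomial annihilates A
```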