Symmetric Weighted Matching for Indefinite Systems
Iain Duff, RAL and CERFACS
John Gilbert, MIT and UC Santa Barbara
June 21, 2002

Symmetric indefinite systems: Block LDL^T
Optimization, interior eigenvalues, ...
Factor A = L * D * L'
L is unit lower triangular
D is block diagonal with small symmetric blocks

Symmetric indefinite systems: Block LDL^T
Optimization, interior eigenvalues, ...
Factor A = L * D * L'
L is unit lower triangular
D is block diagonal with small symmetric blocks
Static pivoting after preprocessing by weighted matching
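As a minimal dense illustration of the A = L*D*L' factorization with 1-by-1 and 2-by-2 pivot blocks, the sketch below assumes SciPy's scipy.linalg.ldl (LAPACK Bunch-Kaufman pivoting) as a stand-in for the sparse symmetric indefinite codes discussed in the talk:

```python
import numpy as np
from scipy.linalg import ldl

# A small symmetric indefinite matrix; the zero diagonal entries rule out a
# 1-by-1 pivot at the first elimination step, so D gets a 2-by-2 block.
A = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 4.0]])

L, D, perm = ldl(A)        # L[perm, :] is unit lower triangular, D is block diagonal
print("block-diagonal D:\n", D)
print("reconstruction error:", np.max(np.abs(L @ D @ L.T - A)))
```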

TRAJ06B
optimal control (Boeing)
condest = 1.4e6, inertia = (794, 0, 871)

TRAJ06B permuted
optimal control (Boeing)
condest = 1.4e6, inertia = (794, 0, 871)
2-by-2 pivots, 77 1-by-1 pivots

TRAJ06B: factor, solve, iterative refinement
optimal control (Boeing)
condest = 1.4e6, inertia = (794, 0, 871)
2-by-2 pivots, 77 1-by-1 pivots
factor nz = 29,842; max |L| = 550; iterations = 2; rel residual = 3.7e-14
[AGL] nz = 40,589; GESP nz = 53,512; GEPP nz = 432,202
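The factor / solve / iterative-refinement loop reported in these result slides is standard; below is a hedged dense sketch that reuses scipy.linalg.ldl from the previous example (the helper names ldlt_solve and refine are illustrative, not from any library, and a production code would compute the residual in higher precision):

```python
import numpy as np
from scipy.linalg import ldl, solve_triangular

def ldlt_solve(L, D, perm, b):
    # Solve (L D L') x = b with the permuted factors returned by scipy.linalg.ldl.
    Lp = L[perm, :]                         # unit lower triangular
    y = solve_triangular(Lp, b[perm], lower=True)
    z = np.linalg.solve(D, y)               # D is block diagonal (1x1 / 2x2 blocks)
    xp = solve_triangular(Lp.T, z, lower=False)
    x = np.empty_like(xp)
    x[perm] = xp
    return x

def refine(A, b, n_steps=2):
    L, D, perm = ldl(A)                     # factor once
    x = ldlt_solve(L, D, perm, b)           # solve
    for _ in range(n_steps):                # iterative refinement
        r = b - A @ x
        x = x + ldlt_solve(L, D, perm, r)
    return x
```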

Symmetric matrix and graph
[Figure: the matrix A and its graph G(A)]
Hollow vertex = zero diagonal element

Symmetric matrix, nonsymmetric matching
[Figure: the matrix A and its graph G(A)]
Perfect matching in A = disjoint directed cycles that cover every vertex of G
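In code, the matching can be read off as a permutation and decomposed into its cycles; a hedged sketch, assuming a dense matrix and scipy.optimize.linear_sum_assignment as an MC64-like maximum-weight matcher on log|a_ij| (the function name matching_cycles is hypothetical):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def matching_cycles(A):
    """Maximum-weight perfect matching on log|a_ij| (an MC64-like objective,
    computed densely here), returned as the disjoint directed cycles it
    induces on the graph G(A)."""
    big_neg = -1e100                        # effectively forbids matching a zero entry
    logw = np.where(A != 0, np.log(np.abs(np.where(A != 0, A, 1.0))), big_neg)
    rows, cols = linear_sum_assignment(logw, maximize=True)
    match = {int(r): int(c) for r, c in zip(rows, cols)}   # row i -> matched column

    # Follow i -> match[i] until the walk returns to its start; each orbit of the
    # permutation is a directed cycle in G(A) (a 1-cycle is a diagonal entry).
    seen, cycles = set(), []
    for start in match:
        if start in seen:
            continue
        cyc, i = [], start
        while i not in seen:
            seen.add(i)
            cyc.append(i)
            i = match[i]
        cycles.append(cyc)
    return cycles
```

The slides that follow are about turning these cycles into 2-cycles (2-by-2 pivots) and 1-cycles (1-by-1 pivots) while keeping as much of the matching weight as possible.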

Symmetric matrix, symmetric matching
[Figure: the matrix A and its graph G(A)]
Theorem (easy): Any even cycle can be converted to a set of 2-cycles without decreasing the weight of the matching
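The proof idea translates directly into a small routine: a 2-cycle on {i, j} uses both a_ij and a_ji, which are equal by symmetry, so whichever of the two alternating halves of the even cycle is heavier yields 2-cycles worth at least the whole cycle. A hedged sketch, reusing the hypothetical log-weight convention from the matching sketch above:

```python
import numpy as np

def even_cycle_to_2cycles(cycle, A):
    """Split an even matching cycle v0 -> v1 -> ... -> v_{2k-1} -> v0 into
    2-cycles (index pairs) without decreasing the matching weight."""
    n = len(cycle)
    assert n % 2 == 0
    w = lambda i, j: np.log(abs(A[i, j]))   # same weights as the matching sketch

    # The two ways of taking alternate cycle edges as pairs.
    pairs_a = [(cycle[t], cycle[t + 1]) for t in range(0, n, 2)]
    pairs_b = [(cycle[t], cycle[(t + 1) % n]) for t in range(1, n, 2)]

    # A 2-cycle uses its edge in both directions, hence the factor 2.
    weight = lambda pairs: 2 * sum(w(i, j) for i, j in pairs)
    return pairs_a if weight(pairs_a) >= weight(pairs_b) else pairs_b
```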

What about odd cycles?
[Figure: the matrix A and its graph G(A)]
Breaking up odd cycles may decrease weight

What about odd cycles?
[Figure: the matrix A and its graph G(A)]
Breaking up odd cycles may decrease weight
Can break up odd cycles of length more than k and keep weight at least (k+1)/(k+2) times max

What about odd cycles?
[Figure: the matrix A and its graph G(A)]
Breaking up odd cycles may decrease weight
Can break up odd cycles of length more than k and keep weight at least (k+1)/(k+2) times max
But, may need to defer singular pivots
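A simple per-cycle realization, as a hedged sketch: leave one vertex of the odd cycle as a 1-by-1 pivot (possibly a zero diagonal, i.e. exactly the singular pivot that may have to be deferred), pair up the rest along the cycle, and try every choice of leftover vertex to keep the heaviest split. The (k+1)/(k+2) guarantee concerns which cycles to break at all, which this sketch does not address.

```python
import numpy as np

def break_odd_cycle(cycle, A):
    """Break an odd matching cycle into 2-cycles plus a single 1-by-1 pivot,
    trying every choice of the leftover vertex and keeping the heaviest split."""
    n = len(cycle)
    assert n % 2 == 1
    w = lambda i, j: np.log(abs(A[i, j])) if A[i, j] != 0 else -np.inf

    best = None
    for s in range(n):                                 # cycle[s] becomes the 1-by-1 pivot
        rest = [cycle[(s + t) % n] for t in range(1, n)]
        pairs = [(rest[t], rest[t + 1]) for t in range(0, n - 1, 2)]
        total = 2 * sum(w(i, j) for i, j in pairs)     # each pair used in both directions
        if best is None or total > best[0]:
            best = (total, cycle[s], pairs)
    _, singleton, pairs = best
    return singleton, pairs                            # singleton may sit on a zero diagonal
```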

BlockPerm: Static block pivot permutation
1. Find a nonsymmetric maximum-weight matching (MC64)
2. Break up long cycles
3. Permute symmetrically to make pivot blocks contiguous
4. Permute blocks symmetrically for low fill (approximate minimum degree)
5. Move any singular pivots to the end (caused by odd cycles or structural rank deficiency)
6. Factor A = L*D*L' without row/column interchanges (possibly fixing up bad pivots)
7. Solve, and improve the solution by iterative refinement
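Combining the sketches above gives a hedged end-to-end outline of these steps. Here linear_sum_assignment stands in for MC64, reverse Cuthill-McKee stands in for approximate minimum degree (SciPy has no AMD), and the singularity test is a plain rank check; it shows the shape of the computation, not the authors' implementation.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Relies on the hypothetical helpers sketched earlier:
# matching_cycles, even_cycle_to_2cycles, break_odd_cycle.

def block_perm_order(A):
    # Steps 1-2: max-weight matching read off as cycles, then cycles broken
    # into 2-by-2 pivot blocks plus leftover 1-by-1 pivots.
    blocks, singletons = [], []
    for cyc in matching_cycles(A):
        if len(cyc) == 1:
            singletons.append(cyc[0])
        elif len(cyc) % 2 == 0:
            blocks.extend(even_cycle_to_2cycles(cyc, A))
        else:
            s, pairs = break_odd_cycle(cyc, A)
            singletons.append(s)
            blocks.extend(pairs)

    # Steps 3-4: order the pivot blocks on the compressed (block) graph; flattening
    # the blocks in that order keeps each pivot block contiguous. Reverse
    # Cuthill-McKee is only a stand-in for approximate minimum degree.
    units = [list(p) for p in blocks] + [[s] for s in singletons]
    m = len(units)
    B = np.zeros((m, m))
    for a in range(m):
        for b in range(m):
            B[a, b] = np.any(A[np.ix_(units[a], units[b])] != 0)
    order = reverse_cuthill_mckee(csr_matrix(B), symmetric_mode=True)

    # Step 5: move units whose pivot block is singular to the end.
    good, bad = [], []
    for u in (units[i] for i in order):
        block = A[np.ix_(u, u)]
        (good if np.linalg.matrix_rank(block) == len(u) else bad).append(u)
    return np.array([v for u in good + bad for v in u])

# Steps 6-7: factor the symmetrically permuted matrix without further
# interchanges and solve with iterative refinement; the dense ldl/refine
# sketches above show the shape of that computation, though scipy.linalg.ldl
# still does its own pivoting and so is not a true static-pivot factorization.
```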

SAWPATH
optimization (RAL)
condest = 2e17, inertia = (776, 0, 583)
2-by-2 pivots, 1-by-1 pivots
factor nz = 5,988; max |L| = 9e14; iterations = 1; rel residual = 4.4e-9
MA57 nz = 8,355 (best u); GESP nz = 34,060; GEPP nz = 331,344

LASER
optimization
inertia = (1000, 2, 2000)
2-by-2 pivots, 1-by-1 pivots (2 singular)
factor nz = 7,563; max |L| = 1.6; iterations = 0; rel residual = 6.8e-17
GESP nz = 11,001; GEPP nz = 10,565 (rank wrong)

DUFF320
optimization (RAL)
inertia = (100, 120, 100)
2-by-2 pivots (3 singular), 1-by-1 pivots (all singular)
factor nz = 685; max |L| = 3.4e8; iterations = 2; rel residual = 3.0
MA57 nz = 2,848; GESP nz = 1,583; GEPP nz = 1,500 (no solve)

DUFF320, take 2: Magic permutation
optimization (RAL)
inertia = (100, 120, 100)
2-by-2 pivots (none singular), 1-by-1 pivots (all singular)

DUFF320, take 2: Magic permutation
optimization (RAL)
inertia = (100, 120, 100)
2-by-2 pivots (none singular), 1-by-1 pivots (all singular)
factor nz = 722; max |L| = 1.0; iterations = 0; rel residual = 0.0

DUFF320, take 3: Identity block
optimization (RAL)
inertia = (100, 0, 220)
2-by-2 pivots, 1-by-1 pivots
factor nz = 1907; max |L| = 1; iterations = 1; rel residual = 1.7e-15
GESP nz = 2,929; GEPP nz = 3,554

Questions
Judicious dynamic pivoting (e.g. MA57)
Preconditioning iterative methods
Combinatorial methods to seek a good block diagonal directly (high weight, well-conditioned, sparse)
Symmetric equilibration / scaling
Better fixups for poor static pivots
"We'll hope to get some boundedness from hyperbolic sines" [Burns & Demmel 2002]

BCSSTK24
structures (Boeing)
condest = 6e11, inertia = (0, 0, 3562)
no 2-by-2 pivots, all 1-by-1 pivots
factor nz = 289,191; max |L| = 410; iterations = 2; rel residual = 2.2e-16
AGL nz = 299,000; Cholesky nz = 289,191; GEPP nz = 1,217,475

DUFF33
optimization (RAL)
inertia = (9, 15, 9)
9 2-by-2 pivots (1 singular), 15 1-by-1 pivots (all singular)
factor nz = 62; max |L| = 7.0e7; iterations = 1; rel residual = 3.0
MA57 nz = , GESP nz = , GEPP nz =

DUFF33, take 2: Magic permutation
optimization (RAL)
inertia = (9, 15, 9)
9 2-by-2 pivots (none singular), 15 1-by-1 pivots (all singular)
factor nz = 51; max |L| = 1.0; iterations = 0; rel residual = 0.0

DUFF33, take 2: L with magic permutation
optimization (RAL)
inertia = (9, 15, 9)
9 2-by-2 pivots (none singular), 15 1-by-1 pivots (all singular)
factor nz = 51; max |L| = 1.0; iterations = 0; rel residual = 0.0

DUFF33, take 2: D with magic permutation
optimization (RAL)
inertia = (9, 15, 9)
9 2-by-2 pivots (none singular), 15 1-by-1 pivots (all singular)
factor nz = 51; max |L| = 1.0; iterations = 0; rel residual = 0.0