Theory of impedance networks: A new formulation. F. Y. Wu, J. Phys. A 37 (2004) 6653-6673.

Presentation transcript:

Theory of impedance networks: A new formulation. F. Y. Wu, J. Phys. A 37 (2004) 6653-6673.

Resistor network. [Figure: a resistor R.]

Ohm's law: V = I R. [Figure.] Combination of resistors (series and parallel).

Δ-Y transformation (1899). Star-triangle relation (1944): the Ising model. [Figures: equivalent Δ and Y networks.]

Δ-Y relation (star-triangle, Yang-Baxter relation). A. E. Kennelly, Elec. World & Eng. 34, 413 (1899).
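For reference, the Δ→Y conversion can be checked numerically. A minimal sketch using the standard Kennelly formulas; the function name and the symmetric test values are illustrative, not taken from the slides:

```python
def delta_to_y(r_ab, r_bc, r_ca):
    """Convert a triangle (delta) of resistors into the equivalent star (Y).

    r_ab, r_bc, r_ca are the delta resistances between the terminal pairs;
    the return values are the star resistances attached to terminals a, b, c.
    """
    s = r_ab + r_bc + r_ca
    return (r_ab * r_ca / s,   # arm at terminal a
            r_ab * r_bc / s,   # arm at terminal b
            r_bc * r_ca / s)   # arm at terminal c

# Example: a symmetric delta of 3-ohm resistors maps to a 1-ohm star.
print(delta_to_y(3.0, 3.0, 3.0))   # (1.0, 1.0, 1.0)
```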

[Figures: an example network of four resistors r1 and one resistor r2; by symmetry an injected current I splits as I/2 between the two equivalent arms.]

[Figures: the cube of twelve resistors r with current I injected at corner 1 and extracted at the opposite corner 2; symmetry splits the current as I/3 in the first three edges and I/6 in the middle six, giving R = (1/3 + 1/6 + 1/3) r = 5r/6.]

Infinite square network. [Figure: a current I injected at one node spreads symmetrically, I/4 into each of the four bonds.]

Superpose two configurations: current I injected at node 0 (sending I/4 into each bond) and current I extracted at the adjacent node 1 (drawing I/4 from each bond). The bond 0-1 then carries I/4 + I/4, so V_01 = (I/4 + I/4) r and the resistance between adjacent nodes is r/2.

Infinite square network

Problems: finite networks; tedious to use the Y-Δ relation. [Figures (a), (b): finite networks with marked nodes 1 and 2.] For example, the resistance between (0,0,0) and (3,3,3) on a 5×5×4 network is ...

Kirchhoff's law. [Figure: a current I_0 injected at node 0, which is joined to nodes 1-4 through resistors r_01, r_02, r_03, r_04.] At node 0: I_0 = Σ_j (V_0 − V_j)/r_0j. Generally, in a network of N nodes, I_i = Σ_j (V_i − V_j)/r_ij. Then set the injected currents I_i and solve for the V_i.
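In matrix form these equations read L·V = I, with L the network Laplacian built from the branch conductances c_ij = 1/r_ij. A minimal sketch of how one could assemble L from a resistor list; the helper and the square-plus-diagonal wiring below are illustrative assumptions, not code from the paper:

```python
import numpy as np

def laplacian(n_nodes, resistors):
    """Assemble the network Laplacian from (node_i, node_j, resistance) triples."""
    L = np.zeros((n_nodes, n_nodes))
    for i, j, r in resistors:
        c = 1.0 / r                 # branch conductance
        L[i, i] += c
        L[j, j] += c
        L[i, j] -= c
        L[j, i] -= c
    return L

# Hypothetical wiring of the earlier example: four resistors r1 around a square
# (nodes 0-1-2-3) and one resistor r2 across the diagonal 1-3.
r1, r2 = 1.0, 2.0
L = laplacian(4, [(0, 1, r1), (1, 2, r1), (2, 3, r1), (3, 0, r1), (1, 3, r2)])
print(L)
```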

2D grid, all r = 1, I(0,0) = I_0 and I(m,n) = 0 otherwise. [Figure: grid nodes (0,0), (0,1), (1,0), (1,1).] Define the node potentials V(m,n). Then Kirchhoff's law reads 4V(m,n) − V(m+1,n) − V(m−1,n) − V(m,n+1) − V(m,n−1) = I(m,n): a discrete Laplacian (Poisson) equation.

Related to: harmonic functions, random walks, the lattice Green's function, first-passage times. The solution of the Laplace equation is unique. For the infinite square net one finds the answer in closed form via the lattice Green's function; for finite networks, the solution is not straightforward.

General network. [Figure: N nodes with external currents I_1, I_2, I_3, … injected at the nodes.]

Properties of the Laplacian matrix: all cofactors are equal, and equal to the spanning-tree generating function G of the lattice (Kirchhoff). Example [figure: a triangle with conductances c_1, c_2, c_3]: G = c_1 c_2 + c_2 c_3 + c_3 c_1.
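Kirchhoff's cofactor statement is easy to verify numerically: delete any one row and the matching column of L and take the determinant. A small check for the triangle example, assuming c_1, c_2, c_3 label the three edges of the triangle; the test values are arbitrary:

```python
import numpy as np

c1, c2, c3 = 2.0, 3.0, 5.0   # arbitrary test conductances

# Laplacian of a triangle whose three edges carry conductances c1, c2, c3.
L = np.array([[c1 + c3, -c1,      -c3     ],
              [-c1,      c1 + c2, -c2     ],
              [-c3,     -c2,       c2 + c3]])

# Any cofactor (delete one row and the same column) equals G.
cofactor = np.linalg.det(L[1:, 1:])
print(cofactor, c1 * c2 + c2 * c3 + c3 * c1)   # both print 31.0
```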

[Figure: a network with external currents I_1, I_2, …, I_N.] Problem: L is singular, so it cannot be inverted. The day is saved: Kirchhoff's law says Σ_i I_i = 0, hence only N−1 of the equations are independent → no need to invert L.

Solve V_i for a given I (Kirchhoff's solution). Since only N−1 equations are independent, we can set V_N = 0 and consider the first N−1 equations. The reduced (N−1)×(N−1) matrix, the tree matrix, now has an inverse and the equation can be solved.
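A sketch of this grounding trick in code: fix V_N = 0, drop the last row and column of L, and solve the nonsingular (N−1)×(N−1) system. The 4-node ring used as a test case is an illustrative choice, not an example from the slides:

```python
import numpy as np

def node_potentials(L, I):
    """Solve L V = I with the last node grounded (V[-1] = 0)."""
    V = np.zeros(L.shape[0])
    V[:-1] = np.linalg.solve(L[:-1, :-1], I[:-1])   # reduced (N-1)x(N-1) tree matrix
    return V

# Test case: a 4-node ring of unit resistors, Laplacian written out directly.
L = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])
I = np.array([1., 0., -1., 0.])      # inject 1 A at node 0, extract it at node 2
V = node_potentials(L, I)
print(V[0] - V[2])                    # 1.0 ohm: the two half-rings (2 ohm each) in parallel
```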

Inject current I at node α and extract it at node β. We find R_αβ = L_αβ / L_α, where L_α is the determinant of the Laplacian with the α-th row & column removed and L_αβ is the determinant of the Laplacian with the α-th and β-th rows & columns removed.
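A quick numerical check of the determinant formula on the same triangle (a and b are 0-based node indices here; the expected value is the elementary series/parallel answer, included only for comparison):

```python
import numpy as np

def resistance_det(L, a, b):
    """R_ab = det(L with rows/cols a,b removed) / det(L with row/col a removed)."""
    n = L.shape[0]
    keep_a = [k for k in range(n) if k != a]
    keep_ab = [k for k in range(n) if k not in (a, b)]
    L_a = np.linalg.det(L[np.ix_(keep_a, keep_a)])
    L_ab = np.linalg.det(L[np.ix_(keep_ab, keep_ab)])
    return L_ab / L_a

c1, c2, c3 = 2.0, 3.0, 5.0
L = np.array([[c1 + c3, -c1,      -c3     ],
              [-c1,      c1 + c2, -c2     ],
              [-c3,     -c2,       c2 + c3]])
# Edge 0-1 (conductance c1) in parallel with the two-step path 0-2-1 (c3, c2 in series).
expected = 1.0 / (c1 + 1.0 / (1.0 / c2 + 1.0 / c3))
print(resistance_det(L, 0, 1), expected)   # both print 8/31 ≈ 0.258
```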

Example [figure: the triangle with conductances c_1, c_2, c_3 again]. The evaluation of L_α and L_αβ in general is not straightforward!

Spanning trees. [Figure: a square lattice with edge weights x and y in the two lattice directions.] G(1,1) = number of spanning trees. Solved by Kirchhoff (1847) and by Brooks/Smith/Stone/Tutte (1940).

N = 4: [figure: the four spanning trees of a unit square] G(x,y) = 2xy² + 2x²y.

Consider instead the regularized Laplacian L(ε) = L + ε·1 (every diagonal element shifted by ε). Solve V_i(ε) for the given I_i and set ε = 0 at the end. This can be done by applying the arsenal of linear algebra, arriving at a very simple result for the two-point resistance.
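A minimal numerical illustration of the regularization, assuming the modification meant here is L(ε) = L + ε·I (add ε to every diagonal entry): the shifted matrix is invertible, and the potential difference converges to the two-point resistance as ε → 0. The 4-node ring is again an illustrative test case:

```python
import numpy as np

L = np.array([[ 2., -1.,  0., -1.],     # 4-node ring of unit resistors
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])
I = np.array([1., 0., -1., 0.])          # inject at node 0, extract at node 2

for eps in (1e-2, 1e-4, 1e-6):
    V = np.linalg.solve(L + eps * np.eye(4), I)   # L + eps*1 is nonsingular
    print(eps, V[0] - V[2])              # tends to R_02 = 1.0 as eps -> 0
```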

Eigenvalues and eigenvectors of L: 0 is an eigenvalue, with eigenvector ψ_0 = (1, 1, …, 1)/√N. L is Hermitian, so it has real eigenvalues and its eigenvectors can be chosen orthonormal.

Consider (L + ε)V = I, where L + ε has the same eigenvectors ψ_i as L, with eigenvalues λ_i + ε. Let V = Σ_i a_i ψ_i. This gives a_i = (ψ_i† I)/(λ_i + ε); the i = 0 term drops out as ε → 0 because ψ_0† I ∝ Σ_j I_j = 0.

Let ψ_i = (ψ_i1, …, ψ_iN) be the orthonormal eigenvectors of L, with eigenvalues λ_i. Theorem: R_αβ = Σ over the nonzero λ_i of |ψ_iα − ψ_iβ|² / λ_i.
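A direct transcription of the theorem into code; the eigenvalue tolerance used to discard the zero mode is an implementation choice, and the 4-node ring is an illustrative test whose answers (3/4 ohm and 1 ohm) follow from series/parallel reduction:

```python
import numpy as np

def resistance_eig(L, a, b, tol=1e-10):
    """R_ab = sum over nonzero eigenvalues of |psi_i[a] - psi_i[b]|^2 / lambda_i."""
    lam, psi = np.linalg.eigh(L)          # columns of psi are orthonormal eigenvectors
    return sum(abs(psi[a, i] - psi[b, i]) ** 2 / lam[i]
               for i in range(len(lam)) if lam[i] > tol)

L = np.array([[ 2., -1.,  0., -1.],       # 4-node ring of unit resistors
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])
print(resistance_eig(L, 0, 1))             # 0.75  (1 ohm in parallel with 3 ohm)
print(resistance_eig(L, 0, 2))             # 1.0   (2 ohm in parallel with 2 ohm)
```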

Example. [Figure: the theorem applied to the earlier network of four resistors r1 and one resistor r2.]

Example: complete graphs, N = 2, 3, 4. [Figures.]
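For the complete graph K_N with unit resistors the theorem reproduces the familiar answer R = 2/N between any pair of nodes (the Laplacian has the single nonzero eigenvalue N, with multiplicity N−1). A quick check, sketched with the same eigenvalue formula:

```python
import numpy as np

for N in (2, 3, 4):
    L = N * np.eye(N) - np.ones((N, N))       # Laplacian of K_N with unit resistors
    lam, psi = np.linalg.eigh(L)
    R = sum(abs(psi[0, i] - psi[1, i]) ** 2 / lam[i]
            for i in range(N) if lam[i] > 1e-10)
    print(N, R, 2.0 / N)                       # R between any two nodes equals 2/N
```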

[Figure: a linear chain of nodes 1, 2, 3, …, N−1, N joined by resistors r.]

If nodes 1 & N are connected with r (periodic boundary condition)
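For the ring (periodic chain) the theorem can be checked against the elementary result r·d·(N−d)/N for nodes a distance d apart along the ring; that closed form is not quoted on the slide, but it follows from combining the two arcs in parallel. A sketch with N = 8:

```python
import numpy as np

N, r = 8, 1.0
L = np.zeros((N, N))
for k in range(N):                       # ring: node k joined to node (k+1) mod N by r
    j = (k + 1) % N
    L[k, k] += 1 / r; L[j, j] += 1 / r
    L[k, j] -= 1 / r; L[j, k] -= 1 / r

lam, psi = np.linalg.eigh(L)
for d in range(1, N):
    R = sum(abs(psi[0, i] - psi[d, i]) ** 2 / lam[i]
            for i in range(N) if lam[i] > 1e-10)
    print(d, R, r * d * (N - d) / N)     # eigenvalue formula vs. series/parallel result
```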

New summation identities. New product identity.

M×N network. [Figure: N = 6 columns and M = 5 rows, with resistors r in one lattice direction and s in the other.] I_N denotes the N×N unit matrix.
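A sketch of the same computation for the rectangular network of this slide, with free boundaries, r along one direction and s along the other; the numerical values of r and s are illustrative, and M = 5, N = 6 as drawn:

```python
import numpy as np

M, N, r, s = 5, 6, 1.0, 2.0               # M rows, N columns; r horizontal, s vertical
n = M * N

def idx(row, col):
    return row * N + col                   # flatten (row, col) to a node index

L = np.zeros((n, n))
def add(i, j, res):
    L[i, i] += 1 / res; L[j, j] += 1 / res
    L[i, j] -= 1 / res; L[j, i] -= 1 / res

for row in range(M):
    for col in range(N):
        if col + 1 < N: add(idx(row, col), idx(row, col + 1), r)   # horizontal bond
        if row + 1 < M: add(idx(row, col), idx(row + 1, col), s)   # vertical bond

lam, psi = np.linalg.eigh(L)
a, b = idx(0, 0), idx(M - 1, N - 1)        # two opposite corners
R = sum(abs(psi[a, i] - psi[b, i]) ** 2 / lam[i] for i in range(n) if lam[i] > 1e-10)
print(R)                                    # corner-to-corner resistance of the 5x6 grid
```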

M, N → ∞

Finite lattices: free, cylindrical, Moebius-strip, and Klein-bottle boundary conditions.

[Figures: the Klein bottle and the Moebius strip.]

Orientable surface vs. non-orientable surface: the Moebius strip. [Figures.]

Free and cylinder boundary conditions. [Formulas for the M×N lattice.]

Klein bottle and Moebius strip boundary conditions. [Formulas for the M×N lattice.]

Klein bottle, Moebius strip, free, cylinder, and torus boundary conditions compared on a 5×4 network embedded as shown. [Table of two-point resistances.]

Resistance between (0,0,0) and (3,3,3) in a 5×5×4 network with free boundary.
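The same machinery computes this last example directly. A sketch for the 5×5×4 grid with free boundaries, assuming unit resistors (the resistor value is not given in the transcript, so it is an assumption); the script prints the resistance between (0,0,0) and (3,3,3):

```python
import numpy as np

Lx, Ly, Lz = 5, 5, 4                       # 5 x 5 x 4 grid, unit resistors, free boundary
n = Lx * Ly * Lz

def idx(x, y, z):
    return (x * Ly + y) * Lz + z

L = np.zeros((n, n))
def add(i, j):
    L[i, i] += 1; L[j, j] += 1; L[i, j] -= 1; L[j, i] -= 1

for x in range(Lx):
    for y in range(Ly):
        for z in range(Lz):
            if x + 1 < Lx: add(idx(x, y, z), idx(x + 1, y, z))
            if y + 1 < Ly: add(idx(x, y, z), idx(x, y + 1, z))
            if z + 1 < Lz: add(idx(x, y, z), idx(x, y, z + 1))

lam, psi = np.linalg.eigh(L)
a, b = idx(0, 0, 0), idx(3, 3, 3)
R = sum(abs(psi[a, i] - psi[b, i]) ** 2 / lam[i] for i in range(n) if lam[i] > 1e-10)
print(R)                                    # resistance between (0,0,0) and (3,3,3)
```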