Quantum One.
Hermitian, Anti-Hermitian, and Unitary Operators

In the last segment, we saw how the completeness relation for ONBs allows one to easily obtain representation dependent equations from representation independent ones, and vice versa. We also defined the matrix elements of an operator between different states, and extended the action of operators so that they can act either to the right on kets, or to the left on bras. In the process, we were led to the notion of Hermitian conjugation, which allows us to identify any relation in the space of kets with the corresponding relation in the space of bras, and vice versa, and we developed some rules for "taking the Hermitian adjoint" of any expression formulated in the Dirac notation. In this lecture, we use this idea to introduce some important new concepts. We start with the definition of a Hermitian operator.

Definition: An operator A is said to be Hermitian or self-adjoint if A is equal to its Hermitian adjoint, i.e., if A = A⁺. In terms of matrix elements, the property ⟨φ|A|ψ⟩* = ⟨ψ|A⁺|φ⟩, which is true for any operator, reduces for Hermitian operators to the relation ⟨φ|A|ψ⟩* = ⟨ψ|A|φ⟩. As a special case this implies that ⟨ψ|A|ψ⟩* = ⟨ψ|A|ψ⟩. Thus, the expectation values of Hermitian operators are strictly real. In a related definition, we now introduce the idea of anti-Hermitian operators.
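The reality of Hermitian expectation values can be checked numerically. The following is a minimal sketch using NumPy (not part of the lecture); the 2×2 matrix A and the random state are illustrative assumptions.

```python
import numpy as np

# An illustrative Hermitian matrix (chosen for this sketch): A = A⁺.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)  # confirm A is Hermitian

# Expectation value <psi|A|psi> for a random normalized state.
rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
expval = psi.conj() @ A @ psi

# For a Hermitian operator the expectation value is strictly real.
assert np.isclose(expval.imag, 0.0)
```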

Definition: An operator A is said to be anti-Hermitian if it is equal to the negative of its Hermitian adjoint, i.e., if A = -A⁺. In terms of matrix elements, the property ⟨φ|A|ψ⟩* = ⟨ψ|A⁺|φ⟩, which is true for any operator, reduces for anti-Hermitian operators to the relation ⟨φ|A|ψ⟩* = -⟨ψ|A|φ⟩. As a special case this implies that ⟨ψ|A|ψ⟩* = -⟨ψ|A|ψ⟩. Thus, expectation values of anti-Hermitian operators are strictly imaginary.
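The corresponding fact for anti-Hermitian operators can be sketched the same way; the matrix B below is an illustrative assumption satisfying B = -B⁺.

```python
import numpy as np

# An illustrative anti-Hermitian matrix (chosen for this sketch): B = -B⁺.
B = np.array([[2j, 1 + 1j],
              [-1 + 1j, -3j]])
assert np.allclose(B, -B.conj().T)  # confirm B is anti-Hermitian

rng = np.random.default_rng(1)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
expval = psi.conj() @ B @ psi

# Expectation values of anti-Hermitian operators are purely imaginary.
assert np.isclose(expval.real, 0.0)
```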

Note that if A is any operator, it may be written in the form A = A_{H} + A_{A}, where A_{H}=(1/2)(A+A⁺) (the Hermitian part of A) is Hermitian, since A_{H}⁺=(1/2)(A⁺+A)=A_{H}, and A_{A}=(1/2)(A-A⁺) (the anti-Hermitian part of A) is anti-Hermitian, since A_{A}⁺=(1/2)(A⁺-A)=-A_{A}.
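This decomposition is easy to verify numerically for a matrix standing in for an arbitrary operator; the random 3×3 matrix here is an illustrative assumption.

```python
import numpy as np

# Any matrix (standing in for an arbitrary operator A) splits into
# Hermitian and anti-Hermitian parts, A = A_H + A_A.
rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

A_H = 0.5 * (A + A.conj().T)   # Hermitian part:      A_H = A_H⁺
A_A = 0.5 * (A - A.conj().T)   # anti-Hermitian part: A_A = -A_A⁺

assert np.allclose(A_H, A_H.conj().T)
assert np.allclose(A_A, -A_A.conj().T)
assert np.allclose(A, A_H + A_A)
```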

Thus an arbitrary operator can be decomposed into Hermitian and anti-Hermitian parts, much as an arbitrary complex number z = x + iy can be decomposed into real and imaginary parts. The concept of the Hermitian adjoint also allows us to define a very important class of operators referred to as unitary operators.

Definition: An operator U is said to be unitary if its adjoint is equal to its inverse: U⁺ = U⁻¹, or equivalently, U⁺U = UU⁺ = 1. Unitary operators (or the transformations they induce) play the same role in quantum mechanical Hilbert spaces that orthogonal transformations play in R³. Indeed, note that if the states |n⟩ form an ONB for S, and if |n′⟩ = U|n⟩, then ⟨n′|m′⟩ = ⟨n|U⁺U|m⟩ = ⟨n|m⟩ = δ_{nm}.
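A numerical sketch of this property: a unitary matrix is obtained here via QR factorization of a random complex matrix (an assumption of this sketch, not a construction from the lecture), and the images of an ONB under it are checked to be orthonormal again.

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(M)   # QR factorization yields a unitary factor U

# U⁺ = U⁻¹, i.e. U⁺U = UU⁺ = 1.
assert np.allclose(U.conj().T @ U, np.eye(3))
assert np.allclose(U @ U.conj().T, np.eye(3))

# Columns of the identity form an ONB {|n>}; their images U|n> are again
# orthonormal: <n'|m'> = <n|U⁺U|m> = delta_{nm}.
basis = np.eye(3)
images = U @ basis
gram = images.conj().T @ images
assert np.allclose(gram, np.eye(3))
```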

Thus, unitary operators map any complete orthonormal basis of states onto another complete orthonormal basis for the same space, and generally preserve vector relationships in state space, the way that orthogonal transformations do in R³.

Ket-Bra and Matrix Representation of Operators

Matrix Representation of Operators: Let {|n⟩} be an ONB for the space S and let A be an operator acting in the space. From the trivial identity A = 𝟙A𝟙, with 𝟙 = Σ_{n} |n⟩⟨n| the completeness relation, we write A = Σ_{n,n′} |n⟩⟨n|A|n′⟩⟨n′|, or A = Σ_{n,n′} A_{nn′} |n⟩⟨n′|, where A_{nn′} = ⟨n|A|n′⟩. This gives what is called a ket-bra expansion for this operator in this representation, and completely specifies the linear operator A in terms of its matrix elements connecting the basis states of this representation.
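The ket-bra expansion can be sketched numerically: compute the matrix elements A_{nn′} of an illustrative matrix in the standard basis, then rebuild it from Σ A_{nn′} |n⟩⟨n′|. The 3-dimensional example is an assumption of this sketch.

```python
import numpy as np

# Sketch of the ket-bra expansion A = sum_{n,n'} A_{nn'} |n><n'|
# in a 3-dimensional space, using the standard basis as the ONB {|n>}.
dim = 3
rng = np.random.default_rng(4)
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))

basis = [np.eye(dim)[:, n] for n in range(dim)]

# Matrix elements A_{nn'} = <n|A|n'>.
A_elems = np.array([[basis[n].conj() @ A @ basis[m] for m in range(dim)]
                    for n in range(dim)])

# Rebuild the operator from sum_{n,n'} A_{nn'} |n><n'|.
A_rebuilt = sum(A_elems[n, m] * np.outer(basis[n], basis[m].conj())
                for n in range(dim) for m in range(dim))
assert np.allclose(A_rebuilt, A)
```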

Matrix Representation of Operators: The operator A, therefore, is completely determined by its matrix elements in any ONB. Thus, suppose that |χ⟩ = A|ψ⟩ for some states |ψ⟩ and |χ⟩. The expansion coefficients for the states |ψ⟩ and |χ⟩ are clearly related. Note that if |ψ⟩ = Σ_{n} ψ_{n} |n⟩ and |χ⟩ = Σ_{n} χ_{n} |n⟩, then χ_{n} = ⟨n|χ⟩ = ⟨n|A|ψ⟩ = Σ_{n′} ⟨n|A|n′⟩⟨n′|ψ⟩, which can be written χ_{n} = Σ_{n′} A_{nn′} ψ_{n′}.

Matrix Representation of Operators: But this is of the form of a matrix multiplying a column vector, i.e., χ_{n} = Σ_{n′} A_{nn′} ψ_{n′} is just the n-th component of the equation [χ] = [A][ψ] that one obtains through the operation of the matrix [A] on the column vector [ψ].
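The component equation and the matrix-vector form can be checked against each other directly; the random matrix and state below are illustrative assumptions.

```python
import numpy as np

# chi = A psi in components: chi_n = sum_{n'} A_{nn'} psi_{n'} --
# exactly a square matrix multiplying a column vector.
rng = np.random.default_rng(5)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

chi = A @ psi                       # matrix-vector form [chi] = [A][psi]
chi_components = np.array([sum(A[n, m] * psi[m] for m in range(3))
                           for n in range(3)])
assert np.allclose(chi, chi_components)
```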

Matrix Representation of Operators: Thus we see that in the row-vector/column-vector representation induced by any discrete ONB, an operator A is naturally represented by a square matrix [A], i.e., one whose entries A_{nn′} = ⟨n|A|n′⟩ are just the matrix elements of A connecting the different basis states of that representation. This matrix representation facilitates computing quantities related to A itself.

Matrix Representation of Operators: Consider the matrix element ⟨φ|A|ψ⟩ of A between arbitrary states |φ⟩ and |ψ⟩. Inserting our expansion for A, this becomes ⟨φ|A|ψ⟩ = Σ_{n,n′} ⟨φ|n⟩ A_{nn′} ⟨n′|ψ⟩. But ⟨φ|n⟩ = φ_{n}* and ⟨n′|ψ⟩ = ψ_{n′}, so we can write this as ⟨φ|A|ψ⟩ = Σ_{n,n′} φ_{n}* A_{nn′} ψ_{n′}, which is readily expressed in terms of the product of a row vector, a square matrix, and a column vector.

Matrix Representation of Operators: That is, the expression ⟨φ|A|ψ⟩ = Σ_{n,n′} φ_{n}* A_{nn′} ψ_{n′} is equivalent to ⟨φ|A|ψ⟩ = [φ]⁺[A][ψ], the product of the row vector [φ]⁺ = (φ₁*, φ₂*, …), the square matrix [A], and the column vector [ψ].
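The row-matrix-column form of the matrix element can be sketched numerically; the random states and matrix are illustrative assumptions.

```python
import numpy as np

# <phi|A|psi> = sum_{n,n'} phi_n* A_{nn'} psi_{n'} : a row vector times
# a square matrix times a column vector.
rng = np.random.default_rng(6)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
phi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

sandwich = phi.conj() @ A @ psi     # [phi]⁺ [A] [psi]
explicit = sum(phi[n].conj() * A[n, m] * psi[m]
               for n in range(3) for m in range(3))
assert np.isclose(sandwich, explicit)
```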

Matrix Representation of Operators: As another example, consider the operator product of A = Σ_{n,n′} A_{nn′} |n⟩⟨n′| and B = Σ_{n,n′} B_{nn′} |n⟩⟨n′|. The product operator C = AB has a similar expansion, i.e., C = Σ_{n,n′} C_{nn′} |n⟩⟨n′|, where C_{nn′} = ⟨n|AB|n′⟩, so C_{nn′} = Σ_{n″} ⟨n|A|n″⟩⟨n″|B|n′⟩ = Σ_{n″} A_{nn″} B_{n″n′}.

Matrix Representation of Operators: But this is just of the form of a matrix multiplication, i.e., C_{nn′} = Σ_{n″} A_{nn″} B_{n″n′} is equivalent to [C] = [A][B], which we can write as [AB] = [A][B], where as before, [A] stands for "the matrix representing A", etc.
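The correspondence between the operator product and the matrix product can be checked with illustrative random matrices (an assumption of this sketch).

```python
import numpy as np

# C = AB in matrix elements: C_{nn'} = sum_{n''} A_{nn''} B_{n''n'},
# i.e. the matrix representing a product of operators is the product
# of the representing matrices, [C] = [A][B].
rng = np.random.default_rng(7)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

C = A @ B
C_elems = np.array([[sum(A[n, k] * B[k, m] for k in range(3))
                     for m in range(3)] for n in range(3)])
assert np.allclose(C, C_elems)
```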

Matrix Representation of Operators: As a final example, consider the matrix representing the adjoint of an operator. If A = Σ_{n,n′} A_{nn′} |n⟩⟨n′|, then by the two-part rule we developed for taking the adjoint, it follows that A⁺ = Σ_{n,n′} A_{nn′}* |n′⟩⟨n|. We can now switch the summation indices to find that A⁺ = Σ_{n,n′} A_{n′n}* |n⟩⟨n′|, from which we deduce that (A⁺)_{nn′} = A_{n′n}*.

Thus, the matrix representing is the complex-conjugate transpose of the matrix representing . We thus define the adjoint of a matrix to be this combined operation (which applies to single column/row vectors as well, obviously). A Hermitian operator is equal to its adjoint, so that the matrix elements representing such an operator obey the relation Thus, Hermitian operators are represented by Hermitian matrices, which satisfy

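A minimal numerical illustration of the Hermiticity condition $A_{ij} = A^{*}_{ji}$, using a small hypothetical matrix (real diagonal entries, conjugate-paired off-diagonal entries), along with one standard consequence: the eigenvalues of a Hermitian matrix are real.

```python
import numpy as np

# A hypothetical Hermitian matrix: real entries on the diagonal,
# off-diagonal entries related by H_ij = H*_ji.
H = np.array([[2.0 + 0j, 1 - 1j],
              [1 + 1j, -3.0 + 0j]])

# Hermiticity: [H] equals its own conjugate transpose, [H] = [H]†.
assert np.allclose(H, H.conj().T)

# A standard consequence: Hermitian matrices have real eigenvalues.
eigvals = np.linalg.eigvalsh(H)
assert np.allclose(eigvals.imag, 0.0)
```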

It is an interesting fact that neither the transpose nor the complex conjugate of an operator is a well-defined concept; i.e., given a linear operator A, there is no operator that can be uniquely identified with the transpose of A. Although one can form the transpose of the matrix [A] representing A in any basis, the operator associated with the transposed matrix will not correspond to the operator associated with the transpose of the matrix representing A in any other basis. Similarly, complex conjugation can be performed on matrices, or on matrix elements, but it is not an operation that is defined for linear operators themselves. The Hermitian adjoint, which in a sense combines these two representation-dependent operations, yields an operator that is independent of representation.

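This basis dependence is easy to exhibit numerically. In the sketch below (the matrices are randomly generated, purely for illustration), a change of basis is implemented by a unitary matrix U, so that [A'] = U†[A]U. The adjoint transforms consistently, since (U†AU)† = U†A†U, while the bare transpose gives (U†AU)ᵀ = UᵀAᵀŪ, which generally differs from U†AᵀU for complex U:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random complex matrix standing in for [A] in some ONB,
# and a random unitary U implementing a change of basis.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

# The representation of A in the new basis: [A'] = U† [A] U.
A_prime = U.conj().T @ A @ U

# The adjoint is representation independent: conjugate-transposing [A']
# gives the same matrix as basis-transforming the conjugate transpose of [A].
assert np.allclose(A_prime.conj().T, U.conj().T @ A.conj().T @ U)

# The bare transpose is not: transposing [A'] generally differs from
# basis-transforming the transpose of [A].
assert not np.allclose(A_prime.T, U.conj().T @ A.T @ U)
```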

This again emphasizes one of the basic themes, which is that bras, kets, and operators are not row vectors, column vectors, and matrices. The former may be represented by the latter, but they are conceptually different things, and it is always important to keep that in mind. Matrix representations of this form were developed extensively by Heisenberg and gave rise to the term "matrix mechanics", in analogy to the "wave mechanics" developed by Schrödinger, which focuses on a wave function representation of the underlying space. Clearly, however, whether one has a wave mechanical or a matrix mechanical representation depends simply upon the choice of basis (i.e., continuous or discrete) in which one is working.


In this segment, we defined what we mean by Hermitian operators, anti-Hermitian operators, and unitary operators, and saw how any operator can be expressed in terms of its Hermitian and anti-Hermitian parts. We then used the completeness relation for a discrete ONB to develop ket-bra expansions and matrix representations of general linear operators, and saw how these matrix representations can be used to directly compute quantities related to the operators they represent. Finally, we saw how to construct the matrix representing the adjoint of an operator, and how Hermitian operators are represented by Hermitian matrices. In the next segment, we show how the expressions and ideas developed for discrete representations can be extended to continuously indexed basis sets, where linear operators can be represented by ket-bra expansions in integral form.
