Quantum One
Gram-Schmidt Orthogonalization
In the last segment, we extended our definitions of spanning sets, linearly independent sets, and basis sets to allow an application of these concepts to continuously indexed sets of vectors. We then introduced the idea of an inner product, which extends to complex vector spaces the familiar dot product encountered in real vector spaces. This allowed us to define the norm or length of a vector, to define unit vectors, and to introduce a limited notion of direction through the concept of orthogonality. These notions of length and orthogonality allowed us to define orthonormal sets of vectors, with either discrete or continuous indices, and to end up with the idea of an orthonormal basis, i.e., an orthonormal set of vectors that spans the space.
In this segment we begin by showing that it is always possible to construct an orthonormal basis set from any set of basis vectors of finite length. The explicit algorithm for doing so, referred to as the Gram-Schmidt orthogonalization procedure, is presented below.
Let {|χ₁〉, |χ₂〉, …, |χ_N〉} be a set of N linearly independent vectors of finite length.

1. Set |φ₁〉 = |χ₁〉 / ‖χ₁‖, where ‖χ₁‖ = √〈χ₁|χ₁〉.

This produces a unit vector |φ₁〉 pointing along the same direction as |χ₁〉. Now construct a second vector orthogonal to the first, by subtracting off that part of it which lies along the direction of the first vector, i.e.

2. Set |ψ₂〉 = |χ₂〉 − |φ₁〉〈φ₁|χ₂〉 and |φ₂〉 = |ψ₂〉 / ‖ψ₂‖.

Note that, by construction, 〈φ₁|ψ₂〉 = 〈φ₁|χ₂〉 − 〈φ₁|φ₁〉〈φ₁|χ₂〉 = 0, so |ψ₂〉 and thus |φ₂〉 are orthogonal to |φ₁〉.
We now proceed in this fashion, constructing each new vector orthogonal to each of those previously constructed. That is, we

3. Set |ψ₃〉 = |χ₃〉 − |φ₁〉〈φ₁|χ₃〉 − |φ₂〉〈φ₂|χ₃〉 and |φ₃〉 = |ψ₃〉 / ‖ψ₃‖,

and, at the nth step,

Set |ψₙ〉 = |χₙ〉 − Σₘ |φₘ〉〈φₘ|χₙ〉 (the sum running over m = 1, …, n − 1) and |φₙ〉 = |ψₙ〉 / ‖ψₙ‖,

so that 〈φₘ|ψₙ〉 = 0 for all m < n.
The only way this process could stop is if one of the resulting vectors turned out to be the null vector. A close inspection of the process reveals that this can't happen if the original set is linearly independent, as we have assumed. Thus, in this way we can construct an orthonormal set of vectors equal in number to those of the original set. It follows that, given any basis for the space, we can construct an orthonormal basis with an equal number of vectors. We now explore how orthonormal bases make our lives easier.
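The recursion above is easy to sketch numerically. Here is a minimal Python/NumPy illustration (not part of the lecture; the input vectors in C³ are arbitrary choices):

```python
import numpy as np

def gram_schmidt(chis):
    """Orthonormalize a list of linearly independent complex vectors.

    At the nth step, subtract from |chi_n> its components along the
    previously built |phi_m>, then normalize the remainder (this is the
    numerically safer "modified" variant of the same recursion).
    """
    phis = []
    for chi in chis:
        psi = np.asarray(chi, dtype=complex).copy()
        for phi in phis:
            psi -= phi * np.vdot(phi, psi)   # |psi> -= |phi><phi|psi>
        norm = np.linalg.norm(psi)
        if np.isclose(norm, 0.0):
            raise ValueError("input vectors are linearly dependent")
        phis.append(psi / norm)
    return phis

# Three linearly independent vectors in C^3 (arbitrary choices)
chis = [np.array([1, 1j, 0]), np.array([1, 0, 1]), np.array([0, 0, 1])]
phis = gram_schmidt(chis)

# Check orthonormality: <phi_m|phi_n> = delta_mn
G = np.array([[np.vdot(a, b) for b in phis] for a in phis])
print(np.allclose(G, np.eye(3)))   # True
```

The null-vector check in the code is exactly the failure mode discussed above: it can only trigger if the inputs are linearly dependent.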
Expansion of a Vector on an Orthonormal Basis
Discrete Bases - Let the set {|φᵢ〉} form an orthonormal basis (or ONB) for the space S, so that 〈φᵢ|φⱼ〉 = δᵢⱼ,

and let |χ〉 be an arbitrary element of the space. By assumption there exists an expansion of the form |χ〉 = Σᵢ cᵢ |φᵢ〉 for a unique set of expansion coefficients {cᵢ}. Question: How do we determine what these expansion coefficients are?
Consider the inner product of an arbitrary element |φᵢ〉 of this basis with the vector |χ〉:

〈φᵢ|χ〉 = Σⱼ cⱼ 〈φᵢ|φⱼ〉 = Σⱼ cⱼ δᵢⱼ = cᵢ.

This shows that the expansion coefficient is just the inner product of the unit vector along that direction in Hilbert space with the vector we are expanding. Thus, expansion coefficient = inner product with basis vector (with the basis vector on the left in the inner product):

cᵢ = 〈φᵢ|χ〉.

We can then write the expansion as |χ〉 = Σᵢ |φᵢ〉〈φᵢ|χ〉.
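A quick numerical sketch of this result (in Python/NumPy; the orthonormal basis here is a hypothetical one, built from the columns of a random unitary): the coefficients cᵢ = 〈φᵢ|χ〉 reassemble the original vector.

```python
import numpy as np

# A hypothetical ONB for C^3: the columns of a random unitary matrix
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
basis = [Q[:, i] for i in range(3)]       # {|phi_i>}, <phi_i|phi_j> = delta_ij

chi = np.array([1.0 + 2j, -1j, 0.5])      # an arbitrary |chi>

# expansion coefficient = inner product with basis vector: c_i = <phi_i|chi>
c = [np.vdot(phi, chi) for phi in basis]

# The sum  sum_i c_i |phi_i>  recovers |chi>
chi_rebuilt = sum(ci * phi for ci, phi in zip(c, basis))
print(np.allclose(chi_rebuilt, chi))      # True
```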
Extension to Continuous Bases - Let the set {|φ_α〉} form an orthonormal basis (or ONB) for the space S, so that 〈φ_α|φ_α′〉 = δ(α − α′), and let |χ〉 be an arbitrary element of the space. By assumption there exists an expansion of the form |χ〉 = ∫ dα χ(α) |φ_α〉 for some unique expansion function χ(α). Question: How do we determine what this expansion function is?
Consider the inner product of an arbitrary element |φ_α〉 of this basis with the vector |χ〉:

〈φ_α|χ〉 = ∫ dα′ χ(α′) 〈φ_α|φ_α′〉 = ∫ dα′ χ(α′) δ(α − α′) = χ(α).

This shows that the expansion function is just the inner product of the basis vector along that direction in Hilbert space with the vector we are expanding. Thus, expansion coefficient = inner product with basis vector (with the basis vector on the left of the inner product):

χ(α) = 〈φ_α|χ〉.
So |χ〉 = ∫ dα χ(α) |φ_α〉, where χ(α) = 〈φ_α|χ〉. We will refer to the function χ(α) as the "wave function" representing |χ〉 in the α representation. Note: This expansion can also be written in the suggestive form |χ〉 = ∫ dα |φ_α〉〈φ_α|χ〉.
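As an illustrative sketch (an assumption for illustration, not from the lecture), take the delta-normalized plane-wave basis φ_p(x) = e^{ipx}/√(2π); then χ(p) = 〈φ_p|χ〉 is the familiar Fourier transform of the position-space wave function, and for a Gaussian the momentum-space wave function is again a Gaussian:

```python
import numpy as np

# Position grid and a unit-norm Gaussian wave function chi(x)
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
chi_x = np.pi**-0.25 * np.exp(-x**2 / 2)

# chi(p) = <phi_p|chi> = (1/sqrt(2*pi)) * integral dx e^{-ipx} chi(x),
# approximated by simple quadrature on the grid
p = np.linspace(-4, 4, 81)
chi_p = np.array([np.sum(np.exp(-1j * pk * x) * chi_x) * dx for pk in p])
chi_p /= np.sqrt(2 * np.pi)

# For this Gaussian, the p-representation wave function is the same Gaussian
print(np.allclose(chi_p, np.pi**-0.25 * np.exp(-p**2 / 2), atol=1e-6))  # True
```

The same quadrature pattern computes 〈φ_α|χ〉 for any delta-normalized continuous basis one can evaluate on a grid.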
A Notational Simplification:
It is clear that when we talk about ONB's, such as {|φᵢ〉} or {|φ_α〉}, the important information which distinguishes the different basis vectors from one another is the label or index: i or j in the discrete case, α or α′ in the continuous case. The symbols φ just sort of come along for the ride, a historical vestige of when we were expanding in sets of functions. From this point on we will acknowledge this by using an abbreviated notation: |φᵢ〉 → |i〉 and |φ_α〉 → |α〉. In this way expansions of an arbitrary ket can be written |χ〉 = Σᵢ |i〉〈i|χ〉 and |χ〉 = ∫ dα |α〉〈α|χ〉.
Calculation of Inner Products Using an Orthonormal Basis
The Emergence of Numerical Representations
Discrete Bases - Let the set {|i〉} form an orthonormal basis for S, so that 〈i|j〉 = δᵢⱼ,

and let |χ〉 and |ψ〉 be arbitrary states of S. These states can be expanded in this basis set as |χ〉 = Σᵢ χᵢ |i〉 and |ψ〉 = Σᵢ ψᵢ |i〉. Suppose we know all the expansion coefficients for both vectors, and we want to know the inner product of these two vectors.
Well, we can express the desired inner product in the form

〈ψ|χ〉 = Σᵢ Σⱼ ψᵢ∗ χⱼ 〈i|j〉.

But 〈i|j〉 = δᵢⱼ, so we can write this inner product in the form

〈ψ|χ〉 = Σᵢ ψᵢ∗ χᵢ.

But this is exactly of the form of the inner product in Cᴺ.
Thus we have an important result:

Any discrete orthonormal basis {|i〉} generates a column-vector/row-vector representation, i.e., it gives us a natural way of associating each ket |ψ〉 in S with a complex-valued column vector with components ψᵢ, and each bra 〈ψ| in S∗ with a complex-valued row vector with components ψᵢ∗, in terms of which the inner product of any two states can be written 〈ψ|χ〉 = Σᵢ ψᵢ∗ χᵢ. It is important to note that in our formulation, the quantum state vector |ψ〉 is not a column vector, but it may in this way be represented by or associated with one. In fact, this may be done in an infinite number of ways.
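As an illustrative numerical sketch (not part of the lecture itself), the column-vector representation and its basis dependence can be checked directly: the inner product computed from the components is unchanged under any unitary change of basis, which is one way to see that |ψ〉 can be associated with a column vector in an infinite number of ways.

```python
import numpy as np

rng = np.random.default_rng(0)

# Components chi_i = <i|chi> and psi_i = <i|psi> of two kets in some
# chosen discrete orthonormal basis {|i>} (arbitrary complex numbers here).
chi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

# Inner product <chi|psi> = sum_i chi_i* psi_i: the conjugated row vector
# (the bra) times the column vector (the ket). np.vdot conjugates its
# first argument.
inner = np.vdot(chi, psi)

# Any unitary U maps these components to those in another orthonormal
# basis; the components change, but the inner product does not.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
inner_new_basis = np.vdot(U @ chi, U @ psi)

assert np.allclose(inner, inner_new_basis)
```

The invariance holds because U†U = 1, so (Uχ)†(Uψ) = χ†ψ for any unitary U.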
Continuous Bases - Let the set {|α〉} form a continuous orthonormal basis for S, so that

〈α|α′〉 = δ(α − α′),

and let |χ〉 and |ψ〉 be arbitrary states of S. These states can be expanded in this basis in the form

|χ〉 = ∫dα χ(α)|α〉,  |ψ〉 = ∫dα ψ(α)|α〉.

Suppose we know both expansion functions, and we want to know the inner product of these two vectors.
Well, we can express the desired inner product in the form

〈χ|ψ〉 = ∫dα ∫dα′ χ∗(α) ψ(α′) 〈α|α′〉.

But 〈α|α′〉 = δ(α − α′), so we can write this inner product in the form

〈χ|ψ〉 = ∫dα χ∗(α) ψ(α).

But this is exactly of the form of the inner product in functional linear vector spaces.
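A hedged numerical sketch (again, not from the lecture): on a fine grid, the continuous inner product 〈χ|ψ〉 = ∫dα χ∗(α)ψ(α) becomes a Riemann sum. The example below uses two hypothetical normalized Gaussian wave functions in the α-representation, for which the overlap integral can be evaluated in closed form and compared against.

```python
import numpy as np

# Fine grid over the continuous index alpha.
alpha = np.linspace(-10.0, 10.0, 20001)
dalpha = alpha[1] - alpha[0]

# Two normalized Gaussian wave functions chi(alpha), psi(alpha),
# the second displaced by one unit (illustrative choice).
chi = (1 / np.pi) ** 0.25 * np.exp(-alpha**2 / 2)
psi = (1 / np.pi) ** 0.25 * np.exp(-((alpha - 1.0) ** 2) / 2)

# Discretized inner product: ∫ dα χ*(α) ψ(α) ≈ Σ χ* ψ Δα.
inner = np.sum(np.conj(chi) * psi) * dalpha

# For these two Gaussians the exact overlap is exp(-1/4).
assert np.isclose(inner, np.exp(-0.25), atol=1e-6)
```

The delta-function orthonormality 〈α|α′〉 = δ(α − α′) is exactly what collapsed the double integral over α and α′ down to the single integral approximated here.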
Thus we have an important result:
Any continuous orthonormal basis {|α〉} induces a wave function representation for the space, i.e., it gives us a natural way of associating each ket |ψ〉 in S with a complex-valued wave function ψ(α) = 〈α|ψ〉, and each bra 〈ψ| in S∗ with a complex-valued wave function ψ∗(α). We refer to ψ(α) as the wave function for the state |ψ〉 in the α-representation. In this representation the inner product of two states can be computed using the wave functions as

〈χ|ψ〉 = ∫dα χ∗(α) ψ(α).

It is important to note that in our formulation, the quantum state vector |ψ〉 is not a wave function, but it may in this way be represented by or associated with one. In fact, this may be done in an infinite number of ways.
In the next segment we attempt to make some of these admittedly abstract ideas more concrete, by applying our general formulation of quantum mechanics to the quantum state space of a single quantum mechanical particle. In so doing, we will see how, in a natural and physically meaningful way, the “Schrödinger” representation, which associates the quantum state with a wave function, emerges from the formal mathematical structure developed thus far.