Friday, April 24, 2009

Homework #5, due Friday, 4:30 PM


Added notes: Problem 4 is particularly important (problem 3 is a prelude to 4). The image to the right is from a comment by Jerome about problem 4. (See comment #14 below.)

The matrix problems, 5 and 6, should be computationally easy, yet very intriguing. No one is really expected to do problem 8.

1. For the 1-D harmonic oscillator:
a) Starting from the ground state wave-function, psi(x), use a+ to generate the 1st excited state wave-function and then graph it vs x.
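
(Not required, but if you want to check your by-hand result with a computer, here is a minimal sympy sketch. It takes a = sqrt(hbar/(m omega)) as the length scale, drops normalization constants, and uses a+ proportional to (x/a - a d/dx).)

```python
# Optional check for 1a: apply a+ (up to constants) to the ground-state
# Gaussian symbolically, then plot the result vs. x.
import sympy as sp

x, a = sp.symbols('x a', positive=True)
psi0 = sp.exp(-x**2 / (2*a**2))              # ground state, normalization dropped
psi1 = (x/a)*psi0 - a*sp.diff(psi0, x)       # (x/a - a d/dx) acting on psi0
print(sp.simplify(psi1))                     # proportional to x * exp(-x^2/(2 a^2))
sp.plot(psi1.subs(a, 1), (x, -4, 4))         # graph the 1st excited-state shape
```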

2. Graph:
a) 2(x/a)^2 - 1
b) 2(x/a)^3 - 3x
[Correction added: the last term in (b) should read 3x, not 3; it is supposed to be an odd polynomial.]
In what regions is each of these functions positive? Negative? At what values of x do the sign changes take place?
c) Note that these are related to harmonic oscillator ground states.
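
(If a computer helps with the graphing, here is a quick sketch; it assumes the corrected 3x term and sets a = 1, so the horizontal axis is x in units of a.)

```python
# Plot the two polynomials from problem 2 and eyeball their sign changes.
import numpy as np
import matplotlib.pyplot as plt

xi = np.linspace(-2, 2, 400)                 # xi = x/a (with a = 1)
f_a = 2*xi**2 - 1
f_b = 2*xi**3 - 3*xi
plt.plot(xi, f_a, label='2(x/a)^2 - 1')
plt.plot(xi, f_b, label='2(x/a)^3 - 3x')
plt.axhline(0, color='gray', lw=0.5)         # zero line: sign changes are the crossings
plt.xlabel('x/a')
plt.legend()
plt.show()
```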

3. Consider an electron moving in 2 dimensions in the potential V(x, y) = (1/2) k [x^2 + y^2].
a) What is the kinetic energy operator?
b) Starting with the ground state, find the first three eigenstates* of H, for a particle of mass m in this potential. [Hint: If you can show that the differential equation is separable with regard to x and y, then the solutions (eigenstates) can be constructed via products of 1D HO eigenstates; a small symbolic check of this idea follows below.]
* That is, the 3 eigenstates which have the lowest energies.
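
(A minimal sympy sanity check of the separability hint, using the unnormalized 2D ground-state Gaussian as an example and writing omega = sqrt(k/m); it is not a substitute for showing the separation yourself.)

```python
# Check that a Gaussian product state is an eigenstate of the 2D oscillator H.
import sympy as sp

x, y, m, k, hbar = sp.symbols('x y m k hbar', positive=True)
w = sp.sqrt(k/m)                              # omega
a2 = hbar/(m*w)                               # a^2 = hbar/(m omega)
psi = sp.exp(-(x**2 + y**2)/(2*a2))           # ground state, unnormalized
H_psi = (-hbar**2/(2*m))*(sp.diff(psi, x, 2) + sp.diff(psi, y, 2)) \
        + sp.Rational(1, 2)*k*(x**2 + y**2)*psi
print(sp.simplify(H_psi/psi))                 # a constant: the eigenvalue for this state
```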

4. Using the (standard) definition of the 2D angular momentum (z component), Lz = x p_y - y p_x,
a) Show that Lz can be expressed as -i hbar [ a_x^+ a_y - a_x a_y^+ ]. (The ladder-operator conventions assumed here are written out after this problem.)
[Hint: I think that the x and y raising and lowering operators commute with each other; this is related to the separability of the differential equation, I believe. Actually, do you need to use this? I am not sure.]

b) Calculate the expectation value of Lz for each of your eigenstates from #3

c) Being careful about commutation, calculate Lz^2 (conceptually similar to p^2, except now we are dealing with angular, rather than linear, momentum.)

d) Calculate the expectation value of L^2 (== Lz^2, in 2D) for each of your 3 states from #3. What do you learn from this? Are these expectation values zero or non-zero? Feel free to interpret, speculate, or propose new directions for investigation...

e) How many states are in the next "degeneracy manifold"? (That is, if you consider a 4th state, how many states are there that have the same energy and are linearly independent?) What are they? (That is, what is a spanning set for this degeneracy manifold?)
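
(For reference, the ladder-operator conventions assumed in 4a are the standard ones, with omega = sqrt(k/m):)

$$
x = \sqrt{\frac{\hbar}{2m\omega}}\,(a_x + a_x^{+}), \qquad
p_x = i\sqrt{\frac{m\hbar\omega}{2}}\,(a_x^{+} - a_x),
$$

and likewise for y, with [a_x, a_x^+] = 1 and every x-type operator commuting with every y-type one.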

5 a) For the matrix:
| 0 1 0 |
| 1 0 1 |
| 0 1 0 |
Use intuitive methods (i.e., guess a vector and multiply it by the matrix, ask a friend or a computer (anything NOT involving determinants)) to find the eigenvectors. Show that your putative eigenvectors are indeed eigenvectors by direct multiplication. What are their respective eigenvalues?
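
(The "ask a computer" route is fair game here; a minimal numpy sketch, which still leaves you to verify the eigenvectors by direct multiplication and read off the eigenvalues:)

```python
# Numerical eigen-decomposition of the 3x3 matrix in 5a.
import numpy as np

M = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
vals, vecs = np.linalg.eig(M)
print(vals)                                   # the eigenvalues
print(vecs)                                   # eigenvectors, one per column
print(M @ vecs[:, 0], vals[0]*vecs[:, 0])     # direct check: M v = lambda v
```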

5 b) a) Multiply the matrix:
| 0 0 0 |
| 1 0 0 |
| 0 1 0 |
times the vector (1,0,0), with the matrix on the left and the column vector on the right.
b) Do it again (I mean with the vector that you got from (a), not the original one). What do you learn from this?
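
(Same idea as above if you want a quick arithmetic check, with the matrix on the left and the column vector on the right; not required.)

```python
# Apply the 5b matrix to (1,0,0), then apply it again to the result.
import numpy as np

A = np.array([[0, 0, 0],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)
v = np.array([1, 0, 0], dtype=float)
v1 = A @ v                                    # first application
v2 = A @ v1                                   # "do it again"
print(v1, v2)
```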

6. a) Find the eigenvectors, and their respective eigenvalues, for each of the following matrices:
| 1 0 | | 0 1 | | 0 i |
| 0 -1 | | 1 0 | | -i 0 |

b) Express the eigenvectors of the 1st matrix in the basis of the eigenvectors of the 2nd matrix.
Correction: this originally asked for the eigenvectors of the 2nd matrix in the basis of the eigenvectors of the 3rd; you can still do that for extra credit, if you want.

c) What do these matrices have in common? Do you think they like each other?? Why or why not!?
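
(A quick numerical cross-check of 6a, if you want one; as always, the point is to understand the answers, not just print them.)

```python
# Eigenvalues/eigenvectors for the three 2x2 matrices in problem 6.
import numpy as np

mats = [np.array([[1, 0], [0, -1]], dtype=complex),
        np.array([[0, 1], [1, 0]], dtype=complex),
        np.array([[0, 1j], [-1j, 0]], dtype=complex)]
for M in mats:
    vals, vecs = np.linalg.eig(M)
    print(vals.real)                          # eigenvalues (they come out real)
    print(vecs)                               # eigenvectors, one per column
```
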
---
7. Consider an electron in a 1D potential which is constant.
a) Show that exp{ikx} is an energy eigenstate with energy ___.
b) Show that it is also a momentum eigenstate with momentum hbar k.
c) Show that cos(kx) is also an eigenstate of one of these but not the other. Which one? Discuss briefly. (What is the relationship between exp{ikx} and cos(kx) ?)
d) For an initial state, Psi(x,0), that is of Gaussian form (like the HO ground state), calculate the time dependence of the state function, and of the expectation value of x^2. (See the note after this problem for the form of the continuous-k expansion.)
e) For this state, show that the expectation value of p is zero, and that
f) the expectation value of p^2 is independent of time.
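
(For part (d), the expansion in eigenstates is continuous in k rather than a discrete sum; schematically, with one common choice for where the 2 pi goes:)

$$
\Psi(x,0) = \int_{-\infty}^{\infty} \phi(k)\, e^{ikx}\, dk, \qquad
\Psi(x,t) = \int_{-\infty}^{\infty} \phi(k)\, e^{ikx}\, e^{-iE_k t/\hbar}\, dk, \qquad
\phi(k) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \Psi(x,0)\, e^{-ikx}\, dx,
$$

where E_k is the energy you find in part (a).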

8) If you are really ambitious*, take the same state, multiply it by exp{ik_0 x} and then recalculate everything. I think you will likely find that the expectation value of p is now finite (hbar k_0), that the expectation value of x is now time dependent, and that you have a wave-packet moving to the right at a speed proportional to k_0/m.
*This might not be fun. Please do problem 9 first. You really don't have to do this problem.

9. Consider the 1d delta-function potential, V(x) = alpha delta(x).
Suppose that you write the part of the state function to the left of the delta function as:
exp{ikx} + B exp{-ikx},
and the part of the state function to the right as
C exp{ikx},
and that you think of these terms as representing an incoming wave from the left, a reflected wave (B), and a transmitted wave (C).

a) Use the boundary conditions at the position of the delta function (written out in the note after this problem) to calculate B and C.
b) Are B and C real or complex? What do they mean? Is there something you could graph as a function of alpha that might convey some interesting information regarding them? Discuss.
c) Evaluate |B|^2 + |C|^2.
d) How are your results different for positive and negative alpha?
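
(The boundary conditions in part (a) are the standard delta-function matching conditions: psi is continuous at x = 0, and integrating the Schrodinger equation across x = 0 gives a jump in the derivative:)

$$
\psi(0^-) = \psi(0^+), \qquad
\psi'(0^+) - \psi'(0^-) = \frac{2m\alpha}{\hbar^2}\,\psi(0).
$$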

24 comments:

  1. For the p^2/2m operator in 2-D, do we write p = px+py, and upon squaring, get p^2 = px^2 + pxpy + pypx + py^2, and use this for p^2/2m, or can we ignore the cross terms and write p^2/2m = px^2/2m + py^2/2m?

  2. Never mind, p^2 is the squared length of the vector, so p^2 = p*p is a dot product, and since px and py are perpendicular, the cross terms disappear... does this argument make sense?

  3. This comment has been removed by the author.

  4. Wait, Mike, p itself is not a vector, but a vector operator. I agree that p^2=p*p, but I don't think that this is equivalent to an inner product, so I don't think the cross terms should cancel. I very well could be wrong. Anyone else have another take on it?

  5. Actually, what I said seems wrong too. Why should p^2 be p*p? It's not a vector; I think it should just be the operator squared.

  6. Maybe we could look at it like this. P is a vector, the components of which are "operators" (meaning derivatives in the representation we care about).

  7. If we look at the 3D time-independent Schrodinger eqn, we have (-hbar^2/2m * del^2 + V)Psi = E*Psi; since del^2 has no cross terms (del is a "vector" and an operator, and we dot it with itself to get del^2), it seems that we wouldn't have them for p^2 either.

  8. For #3, we're going to have an integer for both x and y. Clearly the lowest energy would have nx=0,ny=0. But would nx=1,ny=0 and nx=0,ny=1 be considered separate states? Liboff mentions something about symmetric and antisymmetric states, so maybe that's involved...anybody have any ideas?

  9. For 7d, in order to find the time-dependence of Psi, we need to break Psi(x,0) into a linear combination of eigenstates. Do we use the Exp(ikx) as our eigenstates? If that is the case, since k can be any value, would we have a continuous, rather than discrete, sum for Psi(x,0), and also for Psi(x,t)? (e.g., do we have Psi(x,0) = Integrate[C_k*Exp(ikx),{k,-inf,inf}] and Psi(x,t) = Integrate[C_k*Exp(ikx)*Exp(-iE_k*t/hbar),{k,-inf,inf}], where C_k = Integrate[Psi(x,0)*Exp(ikx),{x,-inf,inf}]?)
    Does this make any sense, or am I going in the wrong direction?

  10. Zack, can we extend the due date to Friday at 5 PM? We have to study for the E&M midterm on Thursday, as well as your quiz tomorrow.

  11. How about Friday at 4:30PM? (That 1/2 hour makes getting it to the readers on Friday easier.)

  12. That would be excellent. Thanks!

  13. Thank you!! That is very helpful!

  14. Clarifying your notation on problem 4 part a) do you mean the following?

    L_{z} = [a_{x}^{+}][a_{y}] - [a_{x}][a_{y}^{+}]

    I also posted it here to make sure I'm clear

    http://mathbin.net/10867

  15. I think so. Is that what you got? I posted an image of your equation for Lz (from your link) into the HW post.

  16. Oh. There is a mistake in problem 2. It should say 3x, not 3. (It is supposed to be an odd polynomial.)

  17. Also I have been working on solutions and I have an idea for 6b. Let's change it to "express the eigenvectors of the first matrix in the basis of the eigenvectors of the 2nd matrix." And we can make the present 6b question extra credit.

  18. For number 4 part b, I calculated using the raising and lowering operators that all of the Lz expectation values are zero. I have not done the Lz^2 expectation values yet, but I believe they are non-zero. This is sort of similar to earlier homeworks, in which the expectation value of p is zero but that of p^2 is not.

  19. Yea Ricardo, got all 0 for L_z as well. For Lz^2 the ground state is still zero, but the second two states are 1.

    For 5b)a), when it says to multiply the vector times the matrix, does it actually mean multiply the vector BY the matrix? If the matrix is on the left of the vector, it's (0,0,0), so there is nothing to "do it again" with...

  20. I am a little unclear about #6. What are we multiplying those matrices by?

  21. Do you mean repeat part (a) of #5 for those matrices, or repeat part (b)? I guess it wouldn't make sense to repeat part (b).

  22. For graphing #2, is that "a" just any constant, or is it 1/beta?

  23. I just graphed it using "a" as an arbitrary constant, but I'm not finding anything interesting about the graphs. Any ideas?

  24. Regarding 5 and 6: 6a was a complete typo/error. It should have said: find the eigenvectors... I changed it. It made no sense to say "do the same thing" for several reasons, one being that you can't multiply a 3-component vector by a 2x2 matrix. Thanks for pointing that out.

    On 5, I intended for the matrix to be on the left and the vector on the right, so the way it was written was misleading or incorrect. I changed it to make that more clear. Multiplying (1,0,0) by that matrix, I think you should then get (0,1,0), right? Then what do you get when you multiply the matrix times (0,1,0)?
