I'm working on a numerical simulation of a 1D quantum system in the density matrix formalism, and I'm trying to compute its time evolution by integrating the Liouville-von Neumann equation. I describe the system in a position basis (equally spaced Dirac deltas on a grid; the wave function is assumed to vanish outside this boundary), using a Hamiltonian matrix whose kinetic operator is built with Numerov's method, so it should have an error of order O(dx^6) in the grid spacing dx.

I integrate in time with rho(t + dt) = rho(t) + dt/(i*hbar) * [H, rho(t)] (actually I use a fourth-order Runge-Kutta scheme for the time step, so it's a bit more involved and the error should be of order O(dt^4)). At every step I normalize rho so that trace(rho) = 1 and every diagonal element is real.

Still, after a few hundred iterations (more or less, depending on dt) I get a weird behaviour: the energy suddenly increases (if I skip the normalization, it actually becomes negative), and a host of high-spatial-frequency off-diagonal elements appears. After a while the numbers blow up so badly that the program crashes altogether. Curiously, this phenomenon only gets worse if I decrease the grid spacing dx. Switching from a simple trapezoidal rule to the Runge-Kutta method for the time integration didn't improve the situation, so I suspect it's not strictly a matter of integration error.

Do you have any ideas on how I could fix this, or on smarter methods to solve the equation? Note: I'm doing it this way because my Hamiltonian is potentially time-dependent, so I can't just exponentiate it, but the problem also occurs when I use a time-independent one.
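
To make the scheme concrete, here is a minimal sketch of the kind of update I'm describing (Python/NumPy; the names, the toy 3-point kinetic stencil, and the Gaussian initial state are placeholders for illustration, not my actual code, which uses the Numerov-based kinetic operator):

```python
import numpy as np

hbar = 1.0  # natural units, placeholder

def liouville_rhs(H, rho):
    """Right-hand side of the Liouville-von Neumann equation: drho/dt = -i/hbar [H, rho]."""
    return (H @ rho - rho @ H) / (1j * hbar)

def rk4_step(H, rho, dt):
    """One explicit RK4 step rho(t) -> rho(t + dt), treating H as constant over the step."""
    k1 = liouville_rhs(H, rho)
    k2 = liouville_rhs(H, rho + 0.5 * dt * k1)
    k3 = liouville_rhs(H, rho + 0.5 * dt * k2)
    k4 = liouville_rhs(H, rho + dt * k3)
    return rho + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def renormalize(rho):
    """Force a real diagonal and trace(rho) = 1, as described above."""
    rho = rho.copy()
    np.fill_diagonal(rho, np.real(np.diag(rho)))
    return rho / np.trace(rho).real

# Toy driver: free particle on a grid with a naive 3-point kinetic stencil,
# just to make the sketch self-contained.
N, dx, dt = 200, 0.1, 1e-3
x = np.arange(N) * dx
H = (2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)) * hbar**2 / (2.0 * dx**2)
psi = np.exp(-(x - x.mean())**2)      # Gaussian initial wave function
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj()).astype(complex)
for step in range(500):
    rho = renormalize(rk4_step(H, rho, dt))
```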
