I would like to get the enthalpy as a function of temperature for BCC lithium at zero pressure.

I have performed a series of NVT simulations of 500 atoms with a Nosé-Hoover thermostat at the corresponding equilibrium volumes (obtained from the volume averages of NPT simulations) and calculated the enthalpy as H = U + pV, which at zero pressure is just the total energy of the simulation. When I compare the result with experimental values from NIST, referenced to the enthalpy at 0 K, the enthalpy I get is significantly higher.
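For concreteness, here is a minimal sketch of how I evaluate the enthalpy per atom from each run and reference it to the lowest-temperature result. The file names and column layout are placeholders, not the actual output of my simulations:

```python
import numpy as np

# Assumed thermo-log format (placeholder): columns are
# temperature (K), total energy (eV), pressure (GPa), volume (A^3).
GPA_A3_TO_EV = 1.0e9 * 1.0e-30 / 1.602176634e-19  # 1 GPa * 1 A^3 in eV (~6.24e-3)
N_ATOMS = 500

def enthalpy_per_atom(logfile):
    T, E_tot, p, V = np.loadtxt(logfile, unpack=True)
    # H = U + pV; at (nominally) zero pressure the pV term is negligible.
    H = E_tot + p * V * GPA_A3_TO_EV
    return T.mean(), H.mean() / N_ATOMS

# One NVT run per temperature; the lowest-T run plays the role of the
# (near-)0 K reference, mirroring the referencing of the NIST data.
runs = ["nvt_050K.dat", "nvt_100K.dat", "nvt_200K.dat", "nvt_300K.dat"]
data = sorted(enthalpy_per_atom(f) for f in runs)
T_ref, H_ref = data[0]
for T, H in data:
    print(f"T = {T:7.1f} K   H - H_ref = {(H - H_ref) * 1000:8.2f} meV/atom")
```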

Things I've thought about:

  • The discrepancy is not a constant offset, so it is not as if a constant contribution such as the zero-point energy is missing; besides, referencing to 0 K should remove any such offset.
  • It is not a constant multiplicative factor either, so I don't think it is a unit conversion problem.
  • The pressure is indeed zero and fluctuates by only about 0.005 GPa, which is tiny: the corresponding fluctuation of the pV term is less than 1 meV/atom (a quick back-of-the-envelope check is sketched after this list).
  • The simulation is stable: the structure remains BCC the entire time, as seen from Common Neighbor Analysis and by visual inspection.
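
The pV estimate above comes from the following rough check; the lattice constant a ≈ 3.51 Å for BCC Li is an approximate literature value, not taken from my runs:

```python
# Back-of-the-envelope check of the pV fluctuation quoted above.
a = 3.51                  # approx. lattice constant of BCC Li in Angstrom (assumed)
v_atom = a**3 / 2         # BCC: 2 atoms per cubic cell -> ~21.6 A^3/atom
dp = 0.005                # pressure fluctuation in GPa
GPA_A3_TO_EV = 6.2415e-3  # 1 GPa * 1 A^3 in eV
print(dp * v_atom * GPA_A3_TO_EV * 1000, "meV/atom")  # ~0.67 meV/atom
```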
My questions are:

  • Am I thinking about this wrong? Is there some reason why this is not a valid simulation protocol for obtaining the enthalpy of a solid? Perhaps a classical simulation near 0 K is not valid because quantum effects dominate? (See the toy model sketched after this list of questions.)
  • Am I missing some term? It would have to be a decreasing function of temperature; any additional contribution, such as the electronic enthalpy (from integrating the electronic heat capacity), would only make things worse by increasing the enthalpy.
  • Is there a paper where someone has computed the enthalpy as a function of temperature of a solid using MD/DFT, ideally near 0 K?
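
To make the quantum-effects hypothesis in the first question concrete, here is a toy comparison (not part of my protocol) of the vibrational enthalpy per atom in a classical harmonic solid (equipartition, 3 k_B T) versus a quantum Einstein model, both referenced to 0 K. The Einstein temperature of ~340 K for Li is a rough assumed value:

```python
import numpy as np

K_B = 8.617333262e-5   # Boltzmann constant in eV/K
THETA_E = 340.0        # rough Einstein temperature for Li in K (assumed)

def h_classical(T):
    """Classical harmonic solid (equipartition), referenced to 0 K: 3 k_B T per atom."""
    return 3.0 * K_B * T

def h_einstein(T):
    """Quantum Einstein model, referenced to 0 K (zero-point energy cancels)."""
    return 3.0 * K_B * THETA_E / np.expm1(THETA_E / T)

for T in (50.0, 100.0, 200.0, 300.0, 450.0):
    print(f"T = {T:5.0f} K  classical = {h_classical(T) * 1000:7.2f} meV/atom"
          f"  Einstein = {h_einstein(T) * 1000:7.2f} meV/atom")
```

In this toy model the classical curve grows linearly from 0 K while the quantum one is strongly suppressed at low temperature, which is the kind of discrepancy I am asking about.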