My question is about the accuracy of DFT-based methods in characterising hydrogen bonding. Does the accuracy increase as we go up Jacob's ladder of DFT functionals?
Thank you for your insightful answer. Coupled cluster would have given me higher accuracy, but I am dealing with a large system of about 200 atoms and I wanted to reduce the computational time.
For H-bonds, GGAs are not a good choice for a very specific reason: hydrogen bonding is prone to self-interaction error (SIE), because you usually have an electron-poor H-bond donor and an electron-rich acceptor. With BP86 and other GGAs, there will be substantial artificial charge transfer (CT).
Thus, it would be a good idea to use at least a hybrid functional. And in general, DFT should nowadays always be combined with some kind of dispersion correction. If you want to keep it simple, PBE0-D4 (or D3 if D4 isn't available) is always a solid choice. If it needs to be cheap, you could consider PBEh-3c, which uses a small def2-SV(P) basis and, thanks to its large amount of Hartree-Fock exchange (HFX, 42%), is quite resilient against SIE.
If you can afford it, good old (RI)MP2 is known to perform very well for hydrogen bonding, but then it becomes very important (and expensive) to converge the basis set.
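For illustration, a minimal input sketch for the PBE0-D3(BJ) route (assuming PySCF with its dftd3/dftd4 dispersion extension installed; the water-dimer geometry and the disp attribute are my own illustration, so adapt them to your system and code version):

from pyscf import gto, dft

# prototypical H-bonded system: water dimer (Angstrom, coordinates illustrative, not optimized)
mol = gto.M(
    atom="""
O  -1.551  -0.115   0.000
H  -1.934   0.762   0.000
H  -0.600   0.040   0.000
O   1.350   0.111   0.000
H   1.680  -0.373  -0.759
H   1.680  -0.373   0.759
""",
    basis="def2-TZVP",
)

mf = dft.RKS(mol)
mf.xc = "PBE0"
mf.disp = "d3bj"   # pairwise D3(BJ) correction; "d4" may also work if the D4 backend is installed
energy = mf.kernel()
print("PBE0-D3(BJ)/def2-TZVP total energy (Hartree):", energy)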
Thanks for your detailed answer! I must admit that my recommendation was based on general principles rather than specific experience.
I fully agree with you; however, I have a problem with your take on dispersion corrections: skipping the -D always leads to errors in the "rest" of the system, in particular for larger systems, even if there is some beneficial error compensation. DFT-D does not just describe "dispersion a bit better"; plain uncorrected DFT (in particular with repulsive functionals like BLYP and B3LYP) does not include dispersion at all. For this very reason, I think a dispersion correction should be included in any case. Even for functionals that claim to capture dispersion (the M06 family), adding D3(0) improves the performance on GMTKN55 (as well as on the H-bonding benchmark in Jan's article you cited above).
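Just to make explicit what the -D term adds (and what the bare functional lacks entirely), the D3(BJ) correction is, schematically, an additive pairwise sum:

E_\mathrm{disp}^{\mathrm{D3(BJ)}} = -\sum_{A<B}\;\sum_{n=6,8} s_n\,\frac{C_n^{AB}}{R_{AB}^{\,n} + \left(a_1 R_0^{AB} + a_2\right)^{n}}

where the C_n^{AB} are geometry-dependent pair dispersion coefficients and s_8, a_1, a_2 are fitted per functional (s_6 is usually kept at 1); the D3(0) flavour used for the M06 family simply replaces the Becke-Johnson rational damping with a zero-damping function.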
Zhaoxi Sun, I mostly agree with the hierarchy of methods you suggested. However, before going down from the QZ to the TZ level or even further, you should definitely consider using a DFT-based composite method. These are roughly 10 times faster than (m)GGA/TZ but more accurate and robust (see the attached figure, which shows relative timings for typical approaches). These composite methods are the best you can do at a given computational cost, since they include a tailor-made gCP, D3/D4 and basis set. Perhaps the best one is the very recent r2SCAN-3c, which describes non-covalent interactions better than most hybrid/QZ approaches while being about 1000 times faster, and which also beats all but one (m)GGA/QZ method on GMTKN55, but it is not yet widely available. B97-3c is also very good in many situations.
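For illustration, running one of these composite methods amounts to a single keyword, since the functional, tailored basis, gCP and D3/D4 come as one package. A rough sketch (assuming an ORCA release recent enough to ship r2SCAN-3c; "complex.xyz" and the plain "orca" call are placeholders for your own setup):

import subprocess

orca_input = """! r2SCAN-3c TightSCF
* xyzfile 0 1 complex.xyz
"""

with open("r2scan3c.inp", "w") as f:
    f.write(orca_input)

# ORCA usually wants the full path to its binary when launched like this
with open("r2scan3c.out", "w") as out:
    subprocess.run(["orca", "r2scan3c.inp"], stdout=out, check=True)

B97-3c and PBEh-3c are invoked the same way, just with a different keyword.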
It's difficult to exclude the possibility that there exists a better base functional, but I can certainly say that r2SCAN is an excellent choice. It's robust, rather simple (as in: not highly parameterized) and very well compatible with the D4 scheme. This is because it provides a very good description of mid-range dispersion/electron-correlation effects (which are difficult for any semiclassical correction), while in the long-range regime it cuts off very cleanly, letting D4 take over that part without causing any double-counting issues.
The coinage-metal adsorption examples (benzene on Cu/Ag/Au) in the paper illustrate this very nicely. A little less evident are the fantastic conformational energies, but for these, too, a good account of mid- and long-range dispersion is critical.
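If you want to see what D4 contributes on top of r2SCAN in practice, here is a minimal sketch with the dftd4 Python bindings (the API usage, the water-dimer coordinates and the "r2scan" parameter lookup are my own illustration; positions are in Bohr):

import numpy as np
from dftd4.interface import DampingParam, DispersionModel

# water dimer (O H H O H H), coordinates in Bohr, purely illustrative
numbers = np.array([8, 1, 1, 8, 1, 1])
positions = np.array([
    [-2.931, -0.217,  0.000],
    [-3.655,  1.440,  0.000],
    [-1.134,  0.076,  0.000],
    [ 2.551,  0.210,  0.000],
    [ 3.175, -0.705, -1.434],
    [ 3.175, -0.705,  1.434],
])

model = DispersionModel(numbers, positions)
# look up the published r2SCAN-D4 damping parameters by functional name
res = model.get_dispersion(DampingParam(method="r2scan"), grad=False)
print("D4 dispersion energy (Hartree):", res["energy"])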
Do you have any particular functionals in mind that we should consider?