The Gaussian 16 web page says it can use NVIDIA K40, K80, and P100 GPUs under Linux: "Earlier GPUs do not have the computational capabilities or memory size to run the algorithms in Gaussian 16."
Does anybody have experience running this program on a different GPU, such as a GTX 1070? I mean a GeForce card newer than the Tesla K40, for instance.
Thanks!