MLFF Nose-Hoover thermostat and Langevin thermostat
Posted: Wed Sep 14, 2022 11:14 am
Hello,
I read on the wiki that during MLFF training it is good to use the Langevin thermostat because of its stochastic nature. Could you comment on using the Nose-Hoover thermostat for the NVT ensemble during training instead? Does it lead to a worse model than one trained with the Langevin thermostat?
I also want to ask a question that has not yet been answered clearly on the forum: how should the LANGEVIN_GAMMA parameter be chosen? It seems that with a large LANGEVIN_GAMMA value the code samples a lot of similar local structures during training, which is obviously not desired, but with a value that is too small the model is undertrained and cannot be used for future work. Any comments on this would also be appreciated.
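For context, here is a minimal sketch of the INCAR fragment I have in mind for on-the-fly training with the Langevin thermostat. The numerical values are only illustrative placeholders for a two-species system, not recommendations; note that LANGEVIN_GAMMA takes one friction coefficient per species.

```
# MD setup with Langevin thermostat (illustrative values)
IBRION = 0                    # molecular dynamics
MDALGO = 3                    # Langevin thermostat (NVT)
ISIF   = 2                    # fixed cell shape and volume
NSW    = 10000                # number of MD steps
POTIM  = 2.0                  # time step in fs
TEBEG  = 300                  # temperature in K
LANGEVIN_GAMMA = 10.0 10.0    # friction coefficients in ps^-1, one per species (placeholder values)

# Machine-learned force field, on-the-fly training
ML_LMLFF = .TRUE.             # enable machine-learned force fields
ML_MODE  = train              # on-the-fly training mode (VASP >= 6.3)
```

My question is essentially how to choose the LANGEVIN_GAMMA values in a setup like this.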
Thanks a lot in advance.
Best regards,
Xiliang