Efficient Training of Neural Network Potentials for Chemical and Enzymatic Reactions by Continual Learning


Abstract

Machine learning (ML) methods have emerged as efficient surrogates for high-level electronic structure theory, offering both accuracy and computational efficiency. However, the vastness of conformational and chemical space remains a challenge when constructing a general force field: training data sets typically cover only a limited region of this space, leading to poor extrapolation. Traditional strategies address this problem by retraining models from scratch on the combined old and new data sets. Model transferability is also crucial for constructing a general force field. Existing ML force fields, designed for closed systems without an external environmental potential, transfer poorly to complex condensed-phase systems such as enzymatic reactions, resulting in degraded performance and high memory costs. Our ML/MM model, based on a Taylor expansion of the electrostatic operator, has shown high transferability between reactions in several simple solvents. This work extends that strategy to enzymatic reactions to probe transferability across more complex, heterogeneous environments. We also apply continual learning strategies based on memory data sets to enable autonomous, on-the-fly training on a continuous stream of new data. Combining the two methods allows the efficient construction of a force field applicable to chemical reactions in diverse environmental media.
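The memory-data-set idea described above can be illustrated with a minimal rehearsal (experience-replay) sketch: a fixed-size buffer keeps a reservoir sample of past training configurations, and each batch of new data is mixed with a few remembered samples before a training step. All class, function, and sample names here are hypothetical placeholders, not the authors' implementation.

```python
import random

class ReplayBuffer:
    """Fixed-size memory of past training samples, filled by reservoir sampling."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.samples = []
        self.seen = 0           # total samples ever offered to the buffer
        self.rng = random.Random(seed)

    def add(self, sample):
        """Offer one sample; each seen sample is kept with prob. capacity/seen."""
        self.seen += 1
        if len(self.samples) < self.capacity:
            self.samples.append(sample)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.samples[j] = sample

    def mix_batch(self, new_batch, n_replay):
        """Augment a batch of new data with up to n_replay memory samples."""
        n = min(n_replay, len(self.samples))
        return list(new_batch) + self.rng.sample(self.samples, n)

# Usage: a stream of incoming reaction configurations (placeholder strings).
buffer = ReplayBuffer(capacity=100)
for step in range(10):
    new_batch = [f"conf_{step}_{i}" for i in range(32)]
    train_batch = buffer.mix_batch(new_batch, n_replay=8)
    # train_model(train_batch)  # hypothetical training step on mixed data
    for s in new_batch:
        buffer.add(s)
```

Mixing old and new samples in every step is what lets the model train on a continuous data stream without forgetting earlier regions of chemical space, avoiding full retraining from scratch.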
