arXiv:2308.10752

Comprehensive Molecular Representation from Equivariant Transformer

Published on Aug 21, 2023
AI-generated summary

Machine-learned force fields using equivariant transformers with self-attention mechanisms demonstrate improved molecular simulation accuracy and extrapolation capabilities through optimized attention parameters and initialization methods.

Abstract

The tradeoff between precision and performance in molecular simulations can nowadays be addressed by machine-learned force fields (MLFF), which combine ab initio accuracy with the numerical efficiency of force fields. Unlike conventional force fields, however, MLFFs must incorporate the relevant electronic degrees of freedom. Here, we implement an equivariant transformer that embeds the molecular net charge and spin state without additional neural network parameters. The model, trained on a singlet/triplet non-correlated CH2 dataset, can identify different spin states and shows state-of-the-art extrapolation capability. Therein, self-attention captures non-local effects, which, as we show, can be finely tuned via the network hyper-parameters. We found that the Softmax activation function used in the self-attention mechanism of graph networks outperformed ReLU-like functions in prediction accuracy. Increasing the attention temperature from τ = d to τ = 2d further improved the extrapolation capability, indicating a significant role of non-locality. Additionally, a weight initialisation method is proposed that noticeably accelerates the training process.
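
A minimal sketch of the temperature-scaled self-attention described in the abstract, assuming a PyTorch setting; the function name, tensor shapes, and single-head layout are illustrative choices, not the paper's implementation. Dividing the attention logits by a larger temperature τ flattens the Softmax distribution, so each atom attends to more distant neighbours, which is the non-local effect the abstract reports tuning via τ.

import torch
import torch.nn.functional as F

def temperature_attention(x, w_q, w_k, w_v, tau):
    """Single-head self-attention over per-atom features.

    x: (n_atoms, d) feature matrix; w_q, w_k, w_v: (d, d) projections;
    tau: attention temperature dividing the raw scores.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    logits = (q @ k.transpose(-2, -1)) / tau  # temperature-scaled scores
    attn = F.softmax(logits, dim=-1)          # Softmax, as favoured over ReLU-like activations
    return attn @ v

torch.manual_seed(0)
d, n_atoms = 16, 8
x = torch.randn(n_atoms, d)
w_q, w_k, w_v = (torch.randn(d, d) / d ** 0.5 for _ in range(3))
for tau in (d, 2 * d):  # the two temperatures compared in the abstract
    out = temperature_attention(x, w_q, w_k, w_v, tau)
    print(f"tau={tau}: output norm {out.norm():.3f}")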
