AIP Advances (May 2024)
SA-GNN: Prediction of material properties using graph neural network based on multi-head self-attention optimization
Abstract
With advances in science and technology and the growth of hardware computing power, the application of large models in artificial intelligence (AI) has become a current research hotspot. Within deep learning, AI for science is one of the most prominent directions, applying deep learning methods to large-scale scientific data for pattern recognition, anomaly detection, predictive analysis, and more. In materials science, a crystal structure consists of atoms and bonds and is therefore readily represented as a graph of nodes and edges. Previous work, such as the MEGNet model, used graph neural networks to fit computational results based on density functional theory and predict various material properties. Building on this concept, the authors propose a novel graph neural network (GNN) model, optimized with a multi-head self-attention (MHSA) mechanism, for predicting the properties of materials with crystal structures; the model is named Self-Attention Enhanced Graph Neural Network (SA-GNN). The model segments the input data into three parts: edges, nodes, and global features. The graph convolutional layer module aggregates node, edge, and global features, learns node representations, and captures higher-order neighborhood information through multiple GNN layers. The MHSA component allows nodes to learn global dependencies, providing different representation subspaces for the nodes. Compared with other machine learning and deep learning models, SA-GNN achieves higher predictive accuracy, demonstrating the potential of graph networks combined with self-attention for modeling crystal material data.
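To illustrate the MHSA-over-nodes idea the abstract describes, the following is a minimal sketch in PyTorch; the class name, dimensions, and residual/normalization choices are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of multi-head self-attention applied across the nodes of a
# graph, so each node can attend to every other node (global dependencies), with
# each attention head providing a separate representation subspace.
import torch
import torch.nn as nn

class NodeSelfAttention(nn.Module):
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.mhsa = nn.MultiheadAttention(embed_dim=dim, num_heads=heads,
                                          batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        # x: (batch, num_nodes, dim) node features, e.g. from a graph-conv stage
        attn_out, _ = self.mhsa(x, x, x)  # self-attention: query = key = value
        return self.norm(x + attn_out)    # residual connection + layer norm

nodes = torch.randn(2, 10, 64)  # 2 graphs, 10 nodes each, 64-dim node features
out = NodeSelfAttention()(nodes)
print(out.shape)  # torch.Size([2, 10, 64])
```

The attended node features would then feed into a readout that also consumes edge and global features, as in MEGNet-style architectures.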