Journal of Magnesium and Alloys (Aug 2024)

Introducing MagBERT: A language model for magnesium textual data mining and analysis

  • Surjeet Kumar,
  • Russlan Jaafreh,
  • Nirpendra Singh,
  • Kotiba Hamad,
  • Dae Ho Yoon

Journal volume & issue
Vol. 12, no. 8
pp. 3216–3228

Abstract

Magnesium (Mg) based materials hold immense potential for various applications due to their light weight and high strength-to-weight ratio. However, to fully harness the potential of Mg alloys, structured analytics are essential to gain valuable insights from centuries of accumulated knowledge. Efficient information extraction from the vast corpus of scientific literature is crucial for this purpose. In this work, we introduce MagBERT, a BERT-based language model specifically trained for Mg-based materials. Utilizing a dataset of approximately 370,000 abstracts focused on Mg and its alloys, MagBERT is designed to understand the intricate details and specialized terminology of this domain. Through rigorous evaluation, we demonstrate the effectiveness of MagBERT for information extraction using a fine-tuned named entity recognition (NER) model, named MagNER. This NER model can extract mechanical, microstructural, and processing properties related to Mg alloys. For instance, we have created an Mg alloy dataset that includes properties such as ductility, yield strength, and ultimate tensile strength (UTS), along with standard alloy names. MagBERT is a novel advancement in the development of Mg-specific language models, marking a significant milestone in Mg alloy discovery and textual information extraction. By making the pre-trained weights of MagBERT publicly accessible, we aim to accelerate research and innovation in the field of Mg-based materials through efficient information extraction and knowledge discovery.
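The extraction step described above can be illustrated with a minimal sketch. A token-classification model such as MagNER typically emits BIO-style tags per token, which are then grouped into entity spans (alloy names, property names, values). The tag labels (`ALLOY`, `PROP`, `VALUE`) and the example sentence below are illustrative assumptions, not the paper's actual label set:

```python
# Hedged sketch: grouping BIO-style NER tags, as a fine-tuned model like
# MagNER might emit them, into (entity_text, label) spans.
# Labels ALLOY / PROP / VALUE are assumed for illustration only.
def group_entities(tokens, tags):
    """Merge consecutive B-/I- tagged tokens into entity spans."""
    spans, current, label = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:                       # close any open span
                spans.append((" ".join(current), label))
            current, label = [tok], tag[2:]   # start a new span
        elif tag.startswith("I-") and current and tag[2:] == label:
            current.append(tok)               # continue the open span
        else:                                 # "O" tag or inconsistent I- tag
            if current:
                spans.append((" ".join(current), label))
            current, label = [], None
    if current:
        spans.append((" ".join(current), label))
    return spans

tokens = ["AZ91", "alloy", "shows", "a", "yield", "strength", "of", "160", "MPa"]
tags   = ["B-ALLOY", "O", "O", "O", "B-PROP", "I-PROP", "O", "B-VALUE", "I-VALUE"]
print(group_entities(tokens, tags))
# → [('AZ91', 'ALLOY'), ('yield strength', 'PROP'), ('160 MPa', 'VALUE')]
```

Running this grouping over model predictions for each abstract is what turns free text into the structured alloy-property dataset the abstract describes.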

Keywords