Knowledge (Jul 2024)

sBERT: Parameter-Efficient Transformer-Based Deep Learning Model for Scientific Literature Classification

  • Mohammad Munzir Ahanger
  • Mohd Arif Wani
  • Vasile Palade

DOI
https://doi.org/10.3390/knowledge4030022
Journal volume & issue
Vol. 4, no. 3
pp. 397–421

Abstract

This paper introduces a parameter-efficient transformer-based model designed for scientific literature classification. By optimizing the transformer architecture, the proposed model significantly reduces memory usage, training time, inference time, and the carbon footprint associated with large language models. The proposed approach is evaluated against various deep learning models and demonstrates superior performance in classifying scientific literature. Comprehensive experiments conducted on datasets from Web of Science, ArXiv, Nature, Springer, and Wiley reveal that the proposed model’s multi-headed attention mechanism and enhanced embeddings contribute to its high accuracy and efficiency, making it a robust solution for text classification tasks.
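To make the architecture the abstract describes concrete, below is a minimal sketch of a small transformer encoder classifier of this general kind: token embeddings combined with positional embeddings, passed through a few multi-headed self-attention layers, then pooled for classification. This is not the authors' sBERT implementation; all class names, layer counts, and dimensions are illustrative assumptions chosen to keep the parameter count small.

```python
# Hypothetical sketch of a compact transformer classifier, NOT the paper's sBERT.
# All hyperparameters (d_model=256, n_heads=4, n_layers=2, ...) are assumptions.
import torch
import torch.nn as nn

class SmallTransformerClassifier(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_heads=4,
                 n_layers=2, n_classes=7, max_len=512):
        super().__init__()
        # Token + learned positional embeddings; the paper's "enhanced
        # embeddings" likely differ, this is a generic baseline.
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, ids):                 # ids: (batch, seq_len) token IDs
        pos = torch.arange(ids.size(1), device=ids.device)
        x = self.tok_emb(ids) + self.pos_emb(pos)
        x = self.encoder(x)                 # multi-headed self-attention layers
        return self.head(x.mean(dim=1))     # mean-pool tokens, then classify

# Usage example with random token IDs:
logits = SmallTransformerClassifier()(torch.randint(0, 30522, (2, 128)))
print(logits.shape)  # torch.Size([2, 7])
```

With two layers and a 256-dimensional hidden size, a model like this has orders of magnitude fewer parameters than a full BERT-base encoder, which is the kind of trade-off the abstract's claims about reduced memory, training time, and inference time refer to.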

Keywords