Discover Internet of Things (Apr 2025)
Design and analysis of antenna through machine learning for next-generation IoT system
Abstract
This paper presents a novel machine learning (ML)-driven approach for designing and optimizing multi-band patch antennas tailored for next-generation Internet of Things applications in 5G and 6G wireless communication systems. Recognizing the limitations of traditional antenna design methods, this work leverages ML to enhance antenna performance and design efficiency. We investigate various ML algorithms, including Decision Tree, Random Forest, artificial neural network (ANN), k-nearest neighbors (KNN), Extra Tree, CatBoost, Gradient Boosting, and XGBoost, for predicting antenna characteristics. Notably, the CatBoost algorithm demonstrates superior performance, achieving 77.4% accuracy in predicting antenna return loss. To validate the efficacy of this approach, a multi-band antenna operating across the 3.5–7.8 GHz, 8.5–10.2 GHz, and 11.8–15 GHz frequency bands was fabricated and evaluated. Results demonstrate good agreement between predicted and measured performance, highlighting the accuracy and efficiency of the ML-driven design methodology. This approach holds significant promise for accelerating the development of high-performance antennas for a wide range of applications, including Wi-Fi, Fixed Wireless Access, Wideband Aeronautical Intranet Communications, IoT devices, Industrial IoT, smart cities, and remote monitoring systems.
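As a rough illustration of the workflow summarized above, the sketch below trains a CatBoost regressor to predict return loss from antenna geometry and frequency features. The feature names, the synthetic data, the hyperparameters, and the use of R² as the reported accuracy metric are all assumptions for illustration only; the paper's actual dataset would come from electromagnetic simulation sweeps of the fabricated design.

```python
# Minimal sketch: predicting patch-antenna return loss (S11) with CatBoost.
# All feature names, data, and hyperparameters here are illustrative placeholders,
# not the authors' dataset or configuration.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in for simulated antenna samples:
# [patch length (mm), patch width (mm), substrate height (mm), frequency (GHz)]
X = rng.uniform([10, 10, 0.8, 3.0], [40, 40, 1.6, 15.0], size=(2000, 4))
# Placeholder target: return loss in dB (a real dataset would come from EM solver sweeps)
y = -5 - 20 * np.exp(-((X[:, 3] - 0.3 * X[:, 0] / 3.0) ** 2)) + rng.normal(0, 1.0, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Gradient-boosted trees over the geometry/frequency features
model = CatBoostRegressor(
    iterations=500,
    depth=6,
    learning_rate=0.05,
    loss_function="RMSE",
    verbose=False,
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"R^2 on held-out samples: {r2_score(y_test, pred):.3f}")
```

In practice, the same train/evaluate loop would be repeated for each candidate regressor (Random Forest, XGBoost, etc.) so that their prediction accuracy on held-out simulation data can be compared, which is how a best-performing model such as CatBoost would be selected.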
Keywords