IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2024)
Improved Transfer Learning for Detecting Upper-Limb Movement Intention Using Mechanical Sensors in an Exoskeletal Rehabilitation System
Abstract
The objective of this study was to propose a novel strategy for detecting upper-limb motion intentions from mechanical sensor signals using deep and heterogeneous transfer learning techniques. Three sensor types, surface electromyography (sEMG), force-sensitive resistors (FSRs), and inertial measurement units (IMUs), were combined to capture biometric signals during arm-up, hold, and arm-down movements. To distinguish motion intentions, deep learning models were constructed using the CIFAR-ResNet18 and CIFAR-MobileNetV2 architectures. The input features of the source models were sEMG, FSR, and IMU signals, whereas the target model was trained using only the FSR and IMU signals. Optimization techniques were applied to determine the appropriate layer structure and per-layer learning rates for effective transfer learning. The source model based on CIFAR-ResNet18 exhibited the highest performance, achieving an accuracy of 95% and an F1-score of 0.95. The target model with the optimization strategies performed comparably to the source model, achieving an accuracy of 93% and an F1-score of 0.93. These results show that mechanical sensors alone can achieve performance comparable to that of models that also use sEMG. The proposed approach can serve as a convenient and precise algorithm for human-robot collaboration in rehabilitation assistant robots.
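For illustration, the following minimal PyTorch sketch shows one way the heterogeneous transfer described above could be set up: a source backbone trained on all three sensor modalities is adapted to a target model that receives only FSR and IMU inputs, with smaller learning rates on the transferred layers and larger rates on the retrained parts. A stock torchvision ResNet-18 stands in for CIFAR-ResNet18, and the sensor-to-channel encoding, layer grouping, and learning-rate values are assumptions for illustration, not the configuration reported in the paper.

# Illustrative sketch only: torchvision ResNet-18 stands in for CIFAR-ResNet18;
# channel encoding, layer grouping, and learning rates are assumed, not the paper's.
import torch
import torch.nn as nn
from torchvision.models import resnet18

NUM_CLASSES = 3  # arm-up, hold, arm-down

# Source model: assume each modality (sEMG, FSR, IMU) is encoded as one input
# channel, so the stock 3-channel first convolution is used as-is.
source_model = resnet18(num_classes=NUM_CLASSES)
# ... source_model would be trained here on the full sEMG + FSR + IMU data ...

# Target model: only FSR and IMU are available, so the first convolution is
# replaced with a 2-channel version; the rest keeps the source topology.
target_model = resnet18(num_classes=NUM_CLASSES)
target_model.conv1 = nn.Conv2d(2, 64, kernel_size=7, stride=2, padding=3, bias=False)

# Heterogeneous transfer: copy every compatible weight from the source model,
# skipping the mismatched first convolution.
compatible = {k: v for k, v in source_model.state_dict().items()
              if not k.startswith("conv1")}
target_model.load_state_dict(compatible, strict=False)

# Per-layer learning rates: transferred layers are fine-tuned gently, while the
# new input convolution and the classifier head are trained at larger rates.
param_groups = [
    {"params": target_model.conv1.parameters(), "lr": 1e-3},
    {"params": target_model.bn1.parameters(), "lr": 1e-3},
    {"params": target_model.layer1.parameters(), "lr": 1e-5},
    {"params": target_model.layer2.parameters(), "lr": 1e-5},
    {"params": target_model.layer3.parameters(), "lr": 1e-4},
    {"params": target_model.layer4.parameters(), "lr": 1e-4},
    {"params": target_model.fc.parameters(), "lr": 1e-3},
]
optimizer = torch.optim.Adam(param_groups)
criterion = nn.CrossEntropyLoss()

In a sketch of this kind, the small rates on the early residual stages preserve the sensor-agnostic features learned from the richer source data, while the new 2-channel stem and the classifier adapt to the mechanical-sensor-only inputs.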
Keywords