IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2023)
A Wearable Computer Vision System With Gimbal Enables Position-, Speed-, and Phase-Independent Terrain Classification for Lower Limb Prostheses
Abstract
Computer vision can provide a lower-limb assistive robot with information about the upcoming walking environment, enabling more accurate and robust high-level control decisions. However, current computer vision systems on lower-extremity devices are still constrained by the disturbances that arise from the interaction between the human, the machine, and the environment, which hinder optimal performance. In this paper, we propose a gimbal-based terrain classification system that adapts to different lower-limb movements, walking speeds, and gait phases. We use a linear active disturbance rejection controller (LADRC) to achieve fast-response, disturbance-rejecting control of the gimbal, which allows the camera to remain stably focused on the desired field-of-view angle throughout lower-limb motion. We also deploy a lightweight MobileNetV2 model on an embedded vision module for real-time, highly accurate inference. The proposed system classifies and predicts terrain independent of mounting position (thigh or shank), gait phase, and walking speed. It is therefore applicable to subjects with different physical conditions (e.g., non-disabled subjects and individuals with transfemoral amputation) without parameter tuning, contributing to plug-and-play terrain classification. Finally, our approach promises to improve the adaptability of lower-limb assistive robots on complex terrain, allowing the wearer to walk more safely.
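As a rough illustration of the control strategy named in the abstract, the sketch below implements a textbook bandwidth-parameterized linear ADRC for a single second-order gimbal axis. All names and numerical values (the observer bandwidth `wo`, controller bandwidth `wc`, input gain `b0`, and the toy gait-induced disturbance) are illustrative assumptions, not the paper's reported parameters or implementation.

```python
# A minimal LADRC sketch for one gimbal axis, assuming the standard
# second-order form  theta_ddot = f(theta, theta_dot, d, t) + b0 * u.
# The extended state observer (ESO) estimates the angle, angular rate,
# and lumped disturbance; the control law cancels the estimated
# disturbance and tracks a reference camera pitch angle.
import numpy as np

class LinearADRC:
    def __init__(self, wo, wc, b0, dt):
        self.b0, self.dt = b0, dt
        # Bandwidth parameterization: ESO poles at -wo, PD poles at -wc
        self.beta = np.array([3 * wo, 3 * wo**2, wo**3])  # ESO gains
        self.kp, self.kd = wc**2, 2 * wc                  # PD gains
        self.z = np.zeros(3)  # [angle est., rate est., disturbance est.]
        self.u = 0.0          # last applied control, fed back to the ESO

    def update(self, r, y):
        """One control step: r = reference angle, y = measured angle."""
        e = y - self.z[0]
        # Extended state observer, forward-Euler discretization
        dz = np.array([
            self.z[1] + self.beta[0] * e,
            self.z[2] + self.beta[1] * e + self.b0 * self.u,
            self.beta[2] * e,
        ])
        self.z += self.dt * dz
        # PD control on estimated states, then cancel estimated disturbance
        u0 = self.kp * (r - self.z[0]) - self.kd * self.z[1]
        self.u = (u0 - self.z[2]) / self.b0
        return self.u

# Toy simulation (hypothetical plant): hold the camera at 0 rad while
# leg motion injects a sinusoidal disturbance torque.
dt, b0 = 0.001, 50.0
ctrl = LinearADRC(wo=120.0, wc=40.0, b0=b0, dt=dt)
theta, omega = 0.2, 0.0  # initial pitch offset, angular rate
for k in range(5000):
    t = k * dt
    u = ctrl.update(r=0.0, y=theta)
    dist = 2.0 * np.sin(2 * np.pi * 1.5 * t)  # gait-induced disturbance
    omega += (b0 * u + dist) * dt             # simplified gimbal dynamics
    theta += omega * dt
print(f"final pitch error: {theta:.4f} rad")
```

A practical appeal of this parameterization is that tuning reduces to two bandwidths (observer and controller), which aligns with the plug-and-play, no-retuning goal stated in the abstract.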
Keywords