Systems and Soft Computing (Dec 2024)
VR interactive input system based on INS and binocular vision fusion
Abstract
In virtual reality (VR) interactive input systems, real-time and accurate measurement of spatial position and pose is key to achieving a natural user interface. This study explores the application of inertial measurement units (IMUs) and binocular vision fusion technology in VR interactive input systems, aiming to improve tracking accuracy through optimized pose models and visual algorithms. A VR measurement technique that integrates IMUs and binocular vision is proposed: an improved Kalman filtering algorithm processes the IMU data, which is then combined with a binocular vision system optimized by the SURF algorithm. Experimental results showed that the fusion technique could reduce sensor noise and bias, improve the accuracy of pose estimation, and enable stable, robust motion tracking in VR systems. Over the angle range of −90° to 90°, the mean absolute error decreased from 3.371° for IMU-only pose calculation to 1.369° for the fusion system. In the binocular-vision distance-measurement experiment, the mean absolute error decreased to 1.532 mm, and the marker error within the 500 mm to 650 mm distance range was kept within −1.3 mm to 1.2 mm. This study provides an effective solution for high-precision VR interactive input systems and contributes to the advancement of VR technology.
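The fusion idea in the abstract can be illustrated with a minimal sketch: a one-dimensional Kalman filter that predicts an orientation angle by integrating a gyroscope rate (which drifts due to bias) and corrects it with an absolute angle measurement from the vision system. This is only a hedged toy model, not the paper's improved algorithm; the function name `fuse_imu_vision` and the noise parameters `q` and `r` are assumptions for illustration.

```python
import numpy as np

def fuse_imu_vision(gyro_rates, vision_angles, dt=0.01,
                    q=0.01, r=0.5, angle0=0.0):
    """Toy 1-D Kalman filter: predict with the gyro rate,
    correct with a vision-derived absolute angle."""
    angle, p = angle0, 1.0  # state estimate and its variance
    fused = []
    for rate, z in zip(gyro_rates, vision_angles):
        # Predict: integrate the gyro rate; process noise q inflates uncertainty
        angle += rate * dt
        p += q
        # Update: blend in the vision measurement z (measurement variance r)
        k = p / (p + r)              # Kalman gain
        angle += k * (z - angle)
        p *= (1.0 - k)
        fused.append(angle)
    return fused

# Simulated check: a biased gyro drifts, noisy vision is unbiased;
# the fused estimate suppresses both the drift and the vision noise.
rng = np.random.default_rng(0)
n, dt, true_rate = 1000, 0.01, 1.0
true_angles = true_rate * dt * np.arange(1, n + 1)
gyro = np.full(n, true_rate + 0.5)             # 0.5 deg/s gyro bias
vision = true_angles + rng.normal(0, 0.3, n)   # noisy vision angle
fused = fuse_imu_vision(gyro, vision, dt=dt)
gyro_only = np.cumsum(gyro) * dt               # pure integration drifts ~5 deg
print(abs(gyro_only[-1] - true_angles[-1]), abs(fused[-1] - true_angles[-1]))
```

Run on this synthetic data, the IMU-only integration accumulates several degrees of bias-driven drift, while the fused estimate stays within a fraction of a degree of the truth, mirroring the error reduction reported in the abstract.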