Scientific Reports (Aug 2023)
Radar sensor based machine learning approach for precise vehicle position estimation
Abstract
Estimating vehicles’ positions precisely is essential in Vehicular Ad hoc Networks (VANETs) for their safe, autonomous, and reliable operation. The conventional approaches used for vehicle position estimation, such as the Global Positioning System (GPS) and Global Navigation Satellite Systems (GNSS), suffer from significant data delays and data transmission errors, which render them ineffective in achieving precise position estimation, especially in dynamic environments. Moreover, the existing radar-based approaches proposed for position estimation utilize static values of range and azimuth, which makes them inefficient in highly dynamic environments. In this paper, we propose a radar-based relative vehicle position estimation method. In the proposed method, the dynamically updated range and azimuth of a Frequency Modulated Continuous Wave (FMCW) radar are utilized to precisely estimate a vehicle’s position. In the position estimation process, the speed of the vehicle equipped with the radar sensor, called the reference vehicle, is considered such that a change in the vehicle’s speed changes the range and azimuth of the radar sensor. For relative position estimation, the distance and relative speed between the reference vehicle and a nearby vehicle are used. To this end, only those vehicles are considered that have a higher possibility of coming into contact with the reference vehicle. The data recorded by the radar sensor is subsequently processed with You Only Look Once (YOLO) version 4 to calculate precision and Intersection over Union (IoU) values. The performance is evaluated under various real-time traffic scenarios in a MATLAB-based simulator. Results show that our proposed method achieves 80.0% precision in position estimation and obtains an IoU value of up to 87.14%, thereby outperforming the state-of-the-art.
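To make the two quantities named in the abstract concrete, the sketch below shows (a) how a single radar detection given as range and azimuth maps to a relative Cartesian position in the reference vehicle's frame, and (b) the standard Intersection-over-Union computation for two axis-aligned bounding boxes. This is an illustrative sketch only: the function names, the axis convention (x forward, y to the left), and the corner-format boxes are assumptions, not the paper's implementation.

```python
import math

def relative_position(range_m, azimuth_deg):
    """Map a radar detection (range in metres, azimuth in degrees)
    to a relative Cartesian position in the reference vehicle's frame.
    Axis convention (x forward, y left) is an assumption for illustration."""
    az = math.radians(azimuth_deg)
    return (range_m * math.cos(az), range_m * math.sin(az))

def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as
    (x1, y1, x2, y2) corner tuples; returns a value in [0, 1]."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

A detection at 10 m range and 0° azimuth lands directly ahead at (10, 0), and two unit-overlap boxes such as (0, 0, 2, 2) and (1, 1, 3, 3) give an IoU of 1/7.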