This project evaluates visual odometry accuracy by comparing estimated trajectories against OptiTrack motion-capture ground truth. We conducted experiments in multiple environments to analyse how factors such as the presence of objects, featureless areas, and Z-axis changes affect motion estimation. The error between the estimated trajectory and the OptiTrack data was computed as the Euclidean distance between corresponding trajectory points.
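The per-point Euclidean error described above can be sketched as follows. This is a minimal illustration, assuming the two trajectories are already time-synchronized and expressed in the same coordinate frame; the function name and the sample values are hypothetical, not taken from the project.

```python
import numpy as np

def trajectory_error(estimated, ground_truth):
    """Per-point Euclidean distance between an estimated trajectory
    and time-aligned ground-truth positions (e.g. from OptiTrack).

    Both inputs are (N, 2) or (N, 3) arrays of positions. Assumes the
    trajectories are already synchronized and in the same frame;
    alignment is a separate preprocessing step.
    """
    estimated = np.asarray(estimated, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    errors = np.linalg.norm(estimated - ground_truth, axis=1)
    return errors, errors.mean()

# Hypothetical 2D example: the estimate drifts 0.1 m in x everywhere.
est = np.array([[0.1, 0.0], [1.1, 0.0], [2.1, 0.0]])
gt = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
errs, mean_err = trajectory_error(est, gt)  # every error is 0.1 m
```

In practice the estimated and ground-truth streams arrive at different rates, so each odometry pose is usually matched to the nearest-in-time OptiTrack sample before this distance is computed.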
To improve the F1TENTH car’s performance, we optimized its ERPM gain by running tests with different gain values, computing the relative error for each, and fitting a curve to the data to find the gain that minimized the error. Similarly, we optimized the steering-angle gain so that the car could complete a full circular trajectory of a specified radius: we tested different steering gains and fitted an arctan function to determine the most accurate gain.
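The test-and-fit step for the ERPM gain might look like the sketch below: measure the relative error at several candidate gains, fit a simple model to the measurements, and take the model's minimizer as the optimal gain. The gain values and errors here are invented for illustration, and a quadratic is used as the fitted model as an assumption; the steering-angle gain would follow the same pattern with an arctan model instead.

```python
import numpy as np

# Hypothetical measurements: candidate ERPM gains and the relative
# distance error observed when driving a fixed test distance at each
# gain (all numbers are illustrative, not from the experiments).
gains = np.array([3500.0, 4000.0, 4500.0, 5000.0, 5500.0])
rel_errors = np.array([0.080, 0.035, 0.010, 0.030, 0.075])

# Fit error = a*g**2 + b*g + c to the test points and take the
# parabola's vertex as the gain that minimizes the relative error.
a, b, c = np.polyfit(gains, rel_errors, 2)
optimal_gain = -b / (2.0 * a)  # vertex of the fitted parabola
```

Fitting a smooth model rather than simply picking the best tested gain lets the optimum fall between the sampled values and averages out measurement noise in the individual runs.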
The results showed that objects in the scene improved accuracy, while featureless areas and reflective surfaces increased errors. Loop closure reduced drift, whereas non-overlapping paths led to higher inaccuracies. Z-axis variations, especially when the camera was pointed at the ground, caused further localization errors.
Localization works better in areas with more objects, as they provide clear reference points for accurate motion tracking and reduce errors. We visualized our findings through trajectory and error graphs, highlighting odometry performance in different conditions.

