Sensor Fusion for Improved Localization in Autonomous Underwater Vehicles
The ocean has long been a vital resource for humankind, providing essential natural resources such as renewable energy, marine minerals, and seafood. However, marine ecosystems are increasingly threatened by global warming and pollution, threats that require extensive study to monitor and mitigate. Additionally, ocean research aids scientists in understanding the Earth's ecosystem and enhances their ability to predict and respond to natural hazards such as earthquakes and tsunamis. Despite its significance, oceanographic research has received comparatively less attention than land and aerial studies because of the challenges of accessing and mapping aquatic ecosystems. Developing efficient survey methods that enable rapid and comprehensive exploration of marine environments is therefore crucial.
Traditional ocean surveys conducted by human divers are time-consuming, costly, and inherently hazardous due to unpredictable underwater conditions and the presence of venomous marine life. Autonomous Underwater Vehicles (AUVs) present a viable alternative, offering a robotic platform equipped with power systems and onboard processing units capable of navigating unknown underwater environments without human intervention. AUVs have been successfully deployed for large-scale ocean surveys, such as mapping the Great Barrier Reef to assess reef health and marine biodiversity. However, navigating complex and unstructured environments, such as coral reef systems, requires precise self-localization to avoid obstacles and ensure efficient mission execution. Unlike terrestrial autonomous systems, which rely on the Global Positioning System (GPS), AUVs must depend on onboard sensors for localization, as electromagnetic signals are heavily attenuated underwater.
Different types of sensors provide complementary information, but relying on a single sensor often results in significant localization errors due to noise and environmental disturbances. Sensor fusion techniques integrate data from multiple sources to mitigate noise and enhance localization accuracy. A well-designed fusion algorithm is essential to effectively combine sensor data and optimize the AUV’s self-localization performance. Additionally, the dynamic and unpredictable nature of underwater environments poses significant challenges in developing and training localization algorithms. Conducting field tests is often dangerous, resource-intensive, and impractical for large-scale experimentation.
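The fusion idea described above can be illustrated with a minimal sketch. The specific fusion algorithms are introduced later in the paper; here we assume, purely for illustration, a one-dimensional Kalman update that corrects a drifting dead-reckoned position (e.g., from an IMU) with a direct but noisy position fix (e.g., from LiDAR). The function name and all numeric values are hypothetical.

```python
def fuse_position(x_pred, p_pred, z_meas, r_meas):
    """One scalar Kalman update: fuse a predicted position x_pred
    (with variance p_pred) against a direct measurement z_meas
    (with variance r_meas)."""
    k = p_pred / (p_pred + r_meas)        # Kalman gain: trust split
    x_fused = x_pred + k * (z_meas - x_pred)
    p_fused = (1.0 - k) * p_pred          # fused variance always shrinks
    return x_fused, p_fused

# Uncertain IMU dead-reckoning (variance 4.0) corrected by a
# more precise LiDAR fix (variance 1.0): the fused estimate
# sits closer to the LiDAR measurement.
x, p = fuse_position(x_pred=10.0, p_pred=4.0, z_meas=12.0, r_meas=1.0)
```

Because the fused variance is smaller than either input variance, chaining such updates over a mission suppresses the noise of any single sensor, which is the effect the fusion algorithms in this work exploit.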
To address these challenges, high-fidelity underwater simulation frameworks are essential for creating realistic virtual testing environments. These frameworks accurately model AUV dynamics and sensor behavior and provide realistic rendering of underwater environments, enabling safe and cost-effective development of underwater self-localization algorithms. In this research, we introduce an underwater simulation framework designed for evaluating AUV self-localization algorithms, incorporating a noise-modeling structure that simulates various sensor uncertainties. We then investigate the impact of sensor fusion techniques on localization accuracy and systematically compare different fusion algorithms to identify the most effective approach. Our research evaluates four localization algorithms for AUVs (IMU-only, stereo-camera-only, LiDAR-only, and LiDAR-IMU) within a realistic underwater simulation across three mission types: lawnmower, loop-closure, and adaptive path. Results show that LiDAR-only localization performs best in feature-rich environments on short missions, while LiDAR-IMU fusion excels in long-duration or feature-sparse scenarios. Stereo localization offers a budget-friendly alternative with acceptable accuracy in feature-rich areas. A key limitation is the computational constraint imposed by limited GPU power, which restricted simulation realism and capped mission runtime at 120 seconds. Future research should develop more realistic noise models and refine the AUV hydrodynamics model to better capture real-world conditions.
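The noise-modeling structure mentioned above is detailed later in the paper; the following is only a minimal sketch of the general pattern such a structure follows, corrupting ground-truth values from the simulator with a constant bias, additive Gaussian noise, and occasional dropouts. The function name and parameter values are hypothetical, not the framework's actual API.

```python
import random

def noisy_reading(true_value, bias=0.0, sigma=0.05, dropout_prob=0.0):
    """Corrupt a ground-truth sensor value with a constant bias,
    additive Gaussian noise, and an optional dropout (None)."""
    if random.random() < dropout_prob:
        return None                        # simulated sensor dropout
    return true_value + bias + random.gauss(0.0, sigma)

# Averaging many corrupted samples of a constant signal recovers
# the biased mean, while individual readings scatter around it.
random.seed(0)
readings = [noisy_reading(1.0, bias=0.01, sigma=0.05) for _ in range(1000)]
mean = sum(readings) / len(readings)       # close to 1.01
```

Separating bias, white noise, and dropout in this way lets a localization algorithm be stress-tested against each failure mode independently before field deployment.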