
Deep Vision in Autonomous Underwater 3D Mapping for Biodiversity Assessment

Project: Research

Project Details

Description

This project aims to develop and implement an autonomous underwater drone system for high-resolution 3D mapping and monitoring of marine ecosystems, with a focus on habitats such as seaweed beds and coral reefs. Underwater drones are commonly used for marine monitoring and are typically equipped with a combination of sensors, including cameras, sonar, and environmental probes. However, their effectiveness is often limited by the challenging and dynamic nature of marine environments: turbidity, low water clarity, and ocean currents introduce significant obstacles to consistent data collection and accurate visual analysis. In particular, many existing vision systems suffer from limited depth penetration and reduced image quality at greater depths, which degrades the precision of species detection and habitat reconstruction.

To address these issues, this project will implement an advanced vision-sensing system on an autonomous underwater drone, strategically combining panchromatic cameras (for high spatial resolution and light sensitivity) with hyperspectral imagers (for high spectral resolution) to achieve detailed 3D mapping of marine habitats. The research will focus on developing and refining algorithms for path planning, data processing with a 3D reconstruction technique based on Gaussian splatting, and object detection using a range of vision techniques.

A key innovation is the integration of polarized hyperspectral imaging, which improves the ability to distinguish between different marine features and raises classification accuracy underwater. The central research challenge is the wavelength-dependent attenuation of light in water (e.g., blue light penetrates deeper than red light), which requires an additional spectral reconstruction step.
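The wavelength-dependent attenuation mentioned above is commonly modelled with a Beer-Lambert-style exponential decay, where each spectral band has its own attenuation coefficient. The sketch below illustrates this idea only; the coefficients are rough textbook-style values for clear ocean water, not measurements from this project, and the simple exponential inversion ignores scattering and surface effects that a full spectral reconstruction would have to handle.

```python
import math

# Illustrative diffuse attenuation coefficients (1/m) for clear ocean
# water. Red attenuates far faster than blue, which is why red tones
# vanish first as depth increases. These values are assumptions.
ATTENUATION = {"red": 0.35, "green": 0.07, "blue": 0.03}


def attenuate(surface_intensity: float, k: float, depth_m: float) -> float:
    """Exponential decay of light intensity with water depth (Beer-Lambert)."""
    return surface_intensity * math.exp(-k * depth_m)


def restore(observed: float, k: float, depth_m: float) -> float:
    """Invert the attenuation model to estimate surface-level intensity."""
    return observed * math.exp(k * depth_m)


if __name__ == "__main__":
    depth = 10.0  # metres
    for band, k in ATTENUATION.items():
        observed = attenuate(1.0, k, depth)
        restored = restore(observed, k, depth)
        print(f"{band}: observed={observed:.4f}, restored={restored:.4f}")
```

At 10 m the modelled red channel retains only a few percent of its surface intensity while blue retains most of it, which is the asymmetry the project's spectral reconstruction must compensate for per wavelength.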

Status: Active
Effective start/end date: 1/06/25 → 31/05/26