Robust and Fair Undersea Target Detection with Automated Underwater Vehicles for Biodiversity Data Collection

Ranjith Dinakaran, Li Zhang, Chang-Tsun Li, Ahmed Bouridane, Richard Jiang

Research output: Contribution to journal › Article › peer-review



Undersea/subsea data collection via automated underwater vehicles (AUVs) plays an important role in marine biodiversity research, yet it is often far more challenging than data collection above the water surface via satellites or unmanned aerial vehicles (UAVs). To enable automated undersea/subsea data collection, AUVs are expected to automatically track objects of interest through what they can “see” from their mounted underwater cameras, where videos or images can be drastically blurred and degraded by underwater lighting conditions. To address this challenge, we propose a cascaded framework, named DCGAN+SSD, that combines a deep convolutional generative adversarial network (DCGAN) with an object detector, the single-shot detector (SSD), for the detection of various underwater targets from the mounted camera of an automated underwater vehicle. Our assumption in this framework is that the DCGAN can be leveraged to alleviate the impact of underwater conditions and thereby provide the object detector with better performance for automated AUVs. To optimize the hyperparameters of our models, we applied a particle swarm optimization (PSO)-based strategy to improve the performance of the proposed model. In our experiments, we verified this assumption: the DCGAN+SSD architecture improves object detection under undersea conditions and achieves clearly better detection rates than the original SSD detector. Further experiments showed that the PSO-based optimization could further improve the model toward more robust and fair detection performance, making our work a promising solution for tackling the challenges faced by AUVs.
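The abstract does not specify the PSO configuration used for hyperparameter tuning, so the following is only a minimal, generic sketch of particle swarm optimization in pure Python. The objective here is a toy sphere function standing in for a (hypothetical) validation loss over detector hyperparameters; the inertia and acceleration coefficients (`w`, `c1`, `c2`) are common textbook defaults, not values from the paper.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimise `objective` over a dim-dimensional box with basic PSO."""
    lo, hi = bounds
    # Random initial positions; zero initial velocities.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Position update, clipped to the search box.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for "detection performance as a function of hyperparameters":
# minimise a sphere function whose optimum is at the origin.
random.seed(0)
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
print(best_val)
```

In the paper's setting, the objective would instead evaluate the DCGAN+SSD pipeline (e.g., a detection metric on a validation set) for each candidate hyperparameter vector, which makes each objective call far more expensive than this toy function.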
Original language: English
Article number: 3680
Number of pages: 17
Journal: Remote Sensing
Issue number: 15
Publication status: Published - 1 Aug 2022
