Assessing the ability of deep learning techniques to perform real-time identification of shark species in live streaming video from drones

Cormac R. Purcell*, Andrew J. Walsh, Andrew P. Colefax, Paul Butcher

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Over the last five years, remotely piloted drones have become the tool of choice to spot potentially dangerous sharks in New South Wales, Australia. They have proven to be a more effective, more accessible and cheaper solution than crewed aircraft. However, the ability to reliably detect and identify marine fauna is closely tied to pilot skill, experience and level of fatigue. Modern computer vision technology offers the possibility of improving detection reliability and even automating the surveillance process in the future. In this work we investigate the ability of commodity deep learning algorithms to detect marine objects in video footage from drones, with a focus on distinguishing between shark species. This study was enabled by the large archive of video footage gathered during the NSW Department of Primary Industries Drone Trials since 2016. We used this data to train two neural networks, based on the ResNet-50 and MobileNet V1 architectures, to detect and identify ten classes of marine object in 1080p resolution video footage. Both networks are capable of reliably detecting dangerous sharks: 80% accuracy for RetinaNet-50 and 78% for MobileNet V1 when tested on a challenging external dataset, which compares well to human observers. The object detection models correctly detect and localise most objects, produce few false-positive detections and can successfully distinguish between species of marine fauna in good conditions. We find that shallower network architectures, like MobileNet V1, tend to perform slightly worse on smaller objects, so care is needed when selecting a network to match deployment needs. We show that inherent biases in the training set have the largest effect on reliability. Some of these biases can be mitigated by pre-processing the data prior to training; however, this requires a large store of high-resolution images that supports augmentation.
A key finding is that models need to be carefully tuned for new locations and water conditions. Finally, we built an Android mobile application to run inference on real-time streaming video and demonstrated a working prototype during field trials run in partnership with Surf Life Saving NSW.

Original language: English
Article number: 981897
Pages (from-to): 1-26
Number of pages: 26
Journal: Frontiers in Marine Science
Volume: 9
DOIs
Publication status: Published - 20 Oct 2022

Bibliographical note

Copyright © 2022 Purcell, Walsh, Colefax and Butcher. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • aerial survey
  • computer vision
  • deep learning
  • drone
  • remotely piloted aerial systems
  • RPAS
  • UAV
  • unmanned aerial vehicle
