Embedded multibeam sonar feature extraction for online AUV control

Matthieu Sion, Torsten Teubler, Horst Hellbrück

Abstract

For the development of an intelligent autonomous underwater vehicle (AUV), sensor data needs to be processed online for navigation and mission planning. In this work, we suggest a complete workflow and processing chain to retrieve multibeam sonar data for AUV control. Our approach is based on the well-known image processing library OpenCV, which provides sophisticated image recognition algorithms. We implement a processing chain for feature extraction on multibeam sonar acoustic images that retrieves contours of objects and the coordinate points of those contours. Coordinate points are discrete data which can easily be processed further with additional algorithms. For example, the size of objects can be determined from the coordinate points, or an expert system can classify objects with the help of the coordinate points. Our solution will be embedded in the online control of an AUV. We evaluate the performance of our feature extraction approach using pre-recorded sonar data.

Original language: English
Title of host publication: OCEANS 2016 - Shanghai
Publisher: IEEE
Publication date: 03.06.2016
Article number: 7485466
ISBN (Electronic): 978-1-4673-9724-7
DOIs
Publication status: Published - 03.06.2016
Event: OCEANS 2016 - Shanghai - Shanghai, China
Duration: 10.04.2016 - 13.04.2016
Conference number: 122196

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being
  2. SDG 9 - Industry, Innovation, and Infrastructure
  3. SDG 11 - Sustainable Cities and Communities
  4. SDG 12 - Responsible Consumption and Production
