Real-time underwater StereoFusion

Matija Rossi, Petar Trslic, Satja Sivcev, James Riordan, Daniel Toal, Gerard Dooly

Research output: Contribution to journal › Article


Abstract

Many current and future applications of underwater robotics require real-time sensing and interpretation of the environment. As the vast majority of robots are equipped with cameras, computer vision is playing an increasingly important role in this field. This paper presents the implementation and experimental results of underwater StereoFusion, an algorithm for real-time dense 3D reconstruction and camera tracking. Unlike KinectFusion, on which it is based, StereoFusion relies on a stereo camera as its main sensor. The algorithm uses the depth map obtained from the stereo camera to incrementally build a volumetric 3D model of the environment, while simultaneously using the model for camera tracking. It has been successfully tested both in a lake and in the ocean, using two different state-of-the-art underwater Remotely Operated Vehicles (ROVs). Ongoing work focuses on applying the same algorithm to acoustic sensors, and on the implementation of a vision-based monocular system with the same capabilities.
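The incremental volumetric fusion the abstract describes is the KinectFusion recipe: each new depth map is integrated into a truncated signed distance function (TSDF) volume by a running weighted average. The sketch below illustrates that single fusion step under stated assumptions; it is not the authors' implementation, and all names, parameters, and the camera model (pinhole intrinsics, world-to-camera pose, depth in metres) are hypothetical.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, cam_pose,
                    vol_origin, voxel_size, trunc=0.1):
    """Fuse one depth map into a TSDF volume (KinectFusion-style sketch).

    tsdf, weights : (X, Y, Z) arrays holding the running signed-distance
                    average and per-voxel integration weights.
    depth         : (H, W) depth map in metres (e.g. from stereo matching).
    K             : 3x3 pinhole intrinsics; cam_pose: 4x4 world-to-camera.
    """
    X, Y, Z = tsdf.shape
    # World coordinates of every voxel centre.
    ix, iy, iz = np.meshgrid(np.arange(X), np.arange(Y), np.arange(Z),
                             indexing="ij")
    idx = np.stack([ix, iy, iz], axis=-1).reshape(-1, 3)
    pts = vol_origin + voxel_size * (idx + 0.5)
    # Transform voxels into the camera frame and project into the image.
    pts_h = np.c_[pts, np.ones(len(pts))]
    cam = (cam_pose @ pts_h.T).T[:, :3]
    z = cam[:, 2]
    u = np.round(K[0, 0] * cam[:, 0] / z + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[:, 1] / z + K[1, 2]).astype(int)
    H, W = depth.shape
    ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[ok] = depth[v[ok], u[ok]]
    ok &= d > 0  # skip pixels with no valid depth
    # Truncated signed distance: positive in front of the observed surface.
    sdf = np.clip((d - z) / trunc, -1.0, 1.0)
    ok &= sdf > -1.0  # ignore voxels far behind the surface
    flat_t = tsdf.reshape(-1)
    flat_w = weights.reshape(-1)
    # Running weighted average, as in Curless & Levoy / KinectFusion.
    flat_t[ok] = (flat_t[ok] * flat_w[ok] + sdf[ok]) / (flat_w[ok] + 1.0)
    flat_w[ok] += 1.0
    return tsdf, weights
```

In the full pipeline this step alternates with camera tracking: the pose used here would itself be estimated by aligning the incoming depth map against a surface prediction raycast from the same TSDF volume.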
Original language: English
Article number: 3936
Number of pages: 17
Journal: SENSORS
Volume: 18
Issue number: 11
Early online date: 14 Nov 2018
DOIs: https://doi.org/10.3390/s18113936
Publication status: E-pub ahead of print - 14 Nov 2018


Keywords

  • stereo
  • underwater
  • ROV
  • GPU
  • real-time
  • 3D
  • fusion
  • camera
  • tracking
  • vision

Cite this

Rossi, M., Trslic, P., Sivcev, S., Riordan, J., Toal, D., & Dooly, G. (2018). Real-time underwater StereoFusion. SENSORS, 18(11), [3936]. https://doi.org/10.3390/s18113936
Note: Gold Open Access journal article from date of publication (14/11/18) in compliance with REF 2021 Open Access Policy. APC paid by University of Limerick. HEI (UWS) will need to confirm that output was available open access immediately on publication via the gold route (para. 329, REF 2021 final guidance).