Indexed on: 04 Aug '18 · Published on: 01 Aug '18 · Published in: Journal of Intelligent & Robotic Systems
The use of robots as tools to explore underwater environments has increased over the last decade. Underwater tasks such as inspection, maintenance, and monitoring can be automated by robots. Understanding the underwater environment and recognizing the objects in it are required capabilities that are becoming critical for these systems. In this work, a method that provides semantic mapping of the underwater environment is presented. This novel system is independent of water turbidity and uses acoustic images acquired by a Forward-Looking Sonar (FLS). The proposed method efficiently segments and classifies the structures in the scene using geometric information about the recognized objects. A semantic map of the scene is then created, which allows the robot to describe its environment in terms of high-level semantic features. Finally, the proposal is evaluated on a real dataset acquired by an underwater vehicle in a marina area. Experimental results demonstrate the robustness and accuracy of the method described in this paper.