Title: Evaluation of an object detection system in the submarine environment
Authors: Rekik, Farah
Ayedi, Walid
Jallouli, Mohammed
Citation: WSCG 2017: poster papers proceedings: 25th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision in co-operation with EUROGRAPHICS Association, p. 25-30.
Issue Date: 2017
Publisher: Václav Skala - UNION Agency
Document type: conference paper (conferenceObject)
URI: wscg.zcu.cz/WSCG2017/!!_CSRN-2703.pdf
http://hdl.handle.net/11025/29608
ISBN: 978-80-86943-46-6
ISSN: 2464-4617
Keywords (Czech): detekce objektů; detekce potrubí; podvodní zobrazování; deskriptor; klasifikátor
Keywords (English): objects detection; pipe detection; underwater imaging; descriptor; classifier
Abstract: Object detection in the underwater environment requires an accurate description of the image with appropriate features in order to extract the object of interest. In this paper we adopt a novel underwater object detection algorithm based on the multi-scale covariance descriptor (MSCOV) for image description and feature extraction, and a support vector machine (SVM) classifier for data classification. This approach is evaluated on a pipe-detection application using the MARIS dataset, where the algorithm outperforms an existing detection system evaluated on the same dataset. Computer vision in the underwater environment suffers from absorption and scattering of light in water. Despite the work carried out so far, image preprocessing has been the usual remedy for this problem, but that step costs time and requires hardware and software resources. The proposed method does not require preprocessing, so it accelerates the whole process.
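The pipeline described in the abstract, a region-covariance descriptor computed at several scales and fed to an SVM, can be sketched as follows. This is a minimal illustration using a common region-covariance formulation (per-pixel features [x, y, intensity, |Ix|, |Iy|]) and scikit-learn's SVC; the paper's exact MSCOV feature set, scale grid, and SVM parameters are not given here, so all of those choices are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def covariance_descriptor(patch):
    """Covariance descriptor of a grayscale patch.

    Each pixel contributes a feature vector [x, y, I, |dI/dx|, |dI/dy|]
    (an assumed feature set, not necessarily the paper's); the patch is
    summarized by the covariance matrix of these vectors, flattened to
    its upper triangle so every patch yields a fixed-length vector.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch.astype(float))  # image gradients
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()], axis=1)
    cov = np.cov(feats, rowvar=False)          # 5x5 covariance matrix
    iu = np.triu_indices(cov.shape[0])         # keep upper triangle only
    return cov[iu]

def multiscale_descriptor(patch, scales=(1, 2)):
    """Concatenate covariance descriptors over grids of sub-regions at
    several scales -- a simple stand-in for the paper's multi-scale
    (MSCOV) scheme."""
    descs = []
    h, w = patch.shape
    for s in scales:
        for i in range(s):
            for j in range(s):
                sub = patch[i * h // s:(i + 1) * h // s,
                            j * w // s:(j + 1) * w // s]
                descs.append(covariance_descriptor(sub))
    return np.concatenate(descs)

# Toy usage: train an SVM on random "pipe" vs "background" patches.
rng = np.random.default_rng(0)
X = np.array([multiscale_descriptor(rng.random((32, 32))) for _ in range(20)])
y = np.array([0, 1] * 10)  # dummy labels for illustration only
clf = SVC(kernel="rbf").fit(X, y)
labels = clf.predict(X)
```

With scales (1, 2) the descriptor concatenates 1 + 4 = 5 regions of 15 values each (the upper triangle of a 5x5 covariance), giving a 75-dimensional vector per patch; detection would then slide such patches over the image and classify each one.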
Rights: © Václav Skala - Union Agency
Appears in Collections:WSCG 2017: Poster Papers Proceedings

Files in This Item:
File       Description  Size       Format
Rekik.pdf  Full text    781.27 kB  Adobe PDF


