HKCoral

HKCoral: Benchmark for Dense Coral Growth Form Segmentation in the Wild

Ziqiang Zheng1    Haixin Liang1    Fong Hei Wut2    Yue Him Wong3    Apple Pui Yi CHUI2    Sai-Kit Yeung1

1The Hong Kong University of Science and Technology, 2The Chinese University of Hong Kong, 3Shenzhen University

IEEE Journal of Oceanic Engineering, 2024


Illustration of the collected HKCoral dataset, which contributes toward dense coral growth form segmentation. Our dataset contains 2,515 images with dense pixel-wise coral growth form annotations. The images are collected under challenging in-the-wild conditions (e.g., low visibility, background clutter, motion blur, occlusion, dynamic illumination, color distortion, and optical artifacts).



Abstract

Underwater coral reef monitoring plays an important role in maintaining and protecting underwater ecosystems. Extracting information from collected coral reef images and videos with computer vision techniques has recently gained increasing attention. Semantic segmentation, which assigns a semantic category to each pixel of an image, has been introduced to understand coral reefs. Satisfactory semantic segmentation performance has been achieved on large-scale in-air datasets with dense annotations. However, underwater coral reef understanding remains less explored: existing underwater coral reef datasets are mainly captured under ideal, benign conditions and lack variance, so they cannot fully reflect the diversity and properties of coral reefs. As a result, coral reef segmentation models trained on them perform poorly when deployed under practical, challenging, and adverse conditions. To address these issues, we propose an in-the-wild coral reef dataset named HKCoral to close the gap for in-situ coral reef monitoring. The collected dataset, with dense pixel-wise annotations, exhibits greater diversity and larger appearance, viewpoint, and visibility variations. Moreover, we adopt the fundamental coral growth form as the basis of our semantic coral reef segmentation, which enables strong generalization to unseen coral reef images from different sites. We benchmark the coral reef segmentation performance of 17 state-of-the-art semantic segmentation algorithms (including the recent generalist Segment Anything Model) and further introduce a complementary architecture that better exploits underwater image enhancement to improve segmentation performance.
We conduct extensive experiments with various up-to-date segmentation models on our benchmark, and the results demonstrate that ample room remains to improve coral segmentation performance. Ablation studies and discussions are also included. The proposed benchmark could significantly enhance the efficiency and accuracy of real-world underwater coral reef surveying.
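Benchmarks of this kind typically score dense predictions with per-class intersection-over-union (IoU) and its mean over classes (mIoU). As a minimal sketch of that standard metric (not the authors' evaluation code; the class count of 6 growth forms is taken from the figure captions, and the helper names are illustrative):

```python
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """IoU for each class between two dense integer label maps of equal shape."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        # Classes absent from both maps are excluded from the mean (NaN).
        ious.append(inter / union if union > 0 else float("nan"))
    return ious

def mean_iou(pred, gt, num_classes=6):
    """Mean IoU over the classes present in prediction or ground truth."""
    return float(np.nanmean(per_class_iou(pred, gt, num_classes)))

# Toy example on 2x2 label maps with two classes.
pred = np.array([[0, 0], [1, 1]])
gt = np.array([[0, 1], [1, 1]])
print(mean_iou(pred, gt, num_classes=2))  # (0.5 + 2/3) / 2
```

In practice the per-class intersections and unions are accumulated over the whole test set before dividing, rather than averaging per-image scores.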


Illustration of coral growth form

Illustration of underwater coral growth form segmentation. Different colors indicate different growth forms.


The cropped feature patterns of the 6 coral growth forms. Best viewed in color; zoom in for more details.


Qualitative Results
Benchmarking various semantic segmentation algorithms

Performance comparison between different semantic segmentation algorithms.


Comparison with SAM

Coral segmentation comparisons between the generalist segmentation model SAM and our proposed method. We observe that SAM cannot yield accurate coral masks.


The zero-shot generalization ability

Coral segmentation results on unseen coral reef images. a), b), and c) are from YouTube videos collected in the Red Sea. d) and e) are from YouTube videos collected in Hong Kong. f) is from the Great Barrier Reef dataset. g) is from CoralNet. h) is from Google image search, and i) is from the Mosaics UCSD dataset.


Citation
    @article{ziqiang2024hkcoral,
        title={HKCoral: Benchmark for Dense Coral Growth Form Segmentation in the Wild},
        author={Zheng, Ziqiang and Liang, Haixin and Wut, Fong Hei and Wong, Yue Him and Chui, Apple Pui Yi and Yeung, Sai-Kit},
        journal={IEEE Journal of Oceanic Engineering (JOE)},
        year={2024},
    }