
MAS-SAM (IJCAI 2024)

MAS-SAM: Segment Any Marine Animal with Aggregated Features

Tianyu Yan1, Zifu Wan2, Xinhao Deng1, *Pingping Zhang✉️1, Yang Liu1, Huchuan Lu1

Dalian University of Technology, IIAU-Lab1

Robotics Institute, Carnegie Mellon University2

Abstract

Recently, the Segment Anything Model (SAM) has shown exceptional performance in generating high-quality object masks and achieving zero-shot image segmentation. However, as a versatile vision model, SAM is primarily trained on large-scale natural light images. In underwater scenes, it exhibits substantial performance degradation due to light scattering and absorption. Meanwhile, the simplicity of SAM's decoder may lead to the loss of fine-grained object details. To address these issues, we propose a novel feature learning framework named MAS-SAM for marine animal segmentation, which integrates effective adapters into SAM's encoder and constructs a pyramidal decoder. More specifically, we first build a new SAM encoder with effective adapters for underwater scenes. Then, we introduce a Hypermap Extraction Module (HEM) to generate multi-scale features for comprehensive guidance. Finally, we propose a Progressive Prediction Decoder (PPD) to aggregate the multi-scale features and predict the final segmentation results. When grafted with the Fusion Attention Module (FAM), our method can extract richer marine information, from global contextual cues to fine-grained local details. Extensive experiments on four public MAS datasets demonstrate that MAS-SAM obtains better results than other typical segmentation methods.
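For intuition, adapters of this kind are usually small bottleneck modules attached to a frozen transformer encoder such as SAM's ViT. The sketch below shows a generic bottleneck adapter, not the exact module from the paper; the bottleneck width, activation, and placement are illustrative assumptions.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Generic bottleneck adapter (illustrative, not the MAS-SAM module)."""

    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)  # project features to a small bottleneck
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)    # project back to the model width

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual connection preserves the frozen backbone's features
        return x + self.up(self.act(self.down(x)))

# Hypothetical usage: refine the output of a frozen ViT block
# feats = adapter(vit_block(feats))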

Getting Started

Installation

Step 1: Clone the MAS-SAM repository

To get started, first clone the MAS-SAM repository and navigate to the project directory:

git clone https://github.com/Drchip61/MAS_SAM.git
cd MAS_SAM

Step 2: Environment Setup

We recommend setting up a conda environment and installing dependencies via pip. Use the following commands to set up your environment:

Create and activate a new conda environment

conda create -n MAS_SAM
conda activate MAS_SAM
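Note that the repository does not pin a Python version. If pip is missing from the bare environment, creating it with an explicit Python version avoids that (3.8 below is an assumption, not a documented requirement):

conda create -n MAS_SAM python=3.8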

Install Dependencies.

pip install -r requirements.txt

Download the pretrained model.

Please put the pretrained SAM model in the MAS_SAM directory.
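The repository does not state which SAM backbone it expects. The official SAM checkpoints from Meta AI can be fetched as follows (ViT-B shown; choosing this variant is an assumption, so check the training script for the checkpoint it actually loads):

wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth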

Model Training and Testing

Training

# Change the hyperparameters in train_y.py
python train_y.py

Testing

# Change the hyperparameters in test_y.py
python test_y.py

Analysis Tools

# First, threshold the prediction masks
python bimap.py
# Then evaluate the prediction masks
python test_score.py
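For reference, the thresholding step presumably binarizes grayscale prediction maps before scoring. A minimal sketch of that idea follows; the folder names and the 0.5 threshold are assumptions, so see bimap.py for the actual behavior.

import os
import numpy as np
from PIL import Image

pred_dir = "predictions"   # hypothetical folder of grayscale prediction maps
out_dir = "binary_masks"   # hypothetical output folder for thresholded masks
os.makedirs(out_dir, exist_ok=True)

for name in os.listdir(pred_dir):
    pred = np.array(Image.open(os.path.join(pred_dir, name)).convert("L"))
    # Map [0, 255] grayscale to a {0, 255} binary mask with a fixed 0.5 threshold
    binary = ((pred / 255.0) > 0.5).astype(np.uint8) * 255
    Image.fromarray(binary).save(os.path.join(out_dir, name))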
