Repository for the ODS Summer of Code event with preprocessing and segmentation pipelines.
The preprocessing pipeline uses the GDAL and OpenCV libraries; the segmentation pipeline uses PyTorch and the Segmentation Models PyTorch (smp) framework.
The notebooks are intended to be run in Google Colaboratory (File → Open notebook → GitHub).
Kaggle Kernels is a cloud-based development environment similar to Google Colab. Kaggle provides free GPU access for 37 hours per week.
-
Download and upload the notebook
wget https://raw.githubusercontent.com/MaritimeAI/ODS-SoC/master/segmentation.kaggle.ipynb
Open Kaggle → Create → New Notebook → File → Upload Notebook → segmentation.kaggle.ipynb
-
Add dataset
File → Add or upload data
-
Use Kaggle secrets to store your Weights & Biases (wandb) API key for experiment tracking and model management
Add-ons → Secrets → Add a new secret → Label ('wandb') → Value (your API key)
-
Configuration settings
config = {
    'classes': ['nodata', 'water', 'ice'],
    'batch_size_train': 1,
    'batch_size_valid': 1,
    'num_workers_train': 1,
    'num_workers_valid': 1,
    'model_encoder': 'ResNet34',
    'model_pretrain': 'ImageNet',
    'model_channels': 3,
    'data_split': 1,
    'expand': True,
    'debug': True,
    'flat': True,
}
- data_split - enables 5-fold cross-validation; comment this line out to disable it
- debug - test run on a single image
- model_encoder - choose any encoder from the list of encoders supported by Segmentation Models PyTorch
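The effect of `data_split` can be sketched as a plain 5-fold index split: each sample is assigned to one of 5 folds, one fold is held out for validation, and the rest are used for training. This is an illustration of the idea only; the pipeline's actual split logic may differ:

```python
def five_fold_split(n_samples, fold, n_folds=5):
    """Return (train_indices, valid_indices) holding out every n_folds-th sample."""
    valid = [i for i in range(n_samples) if i % n_folds == fold]
    train = [i for i in range(n_samples) if i % n_folds != fold]
    return train, valid

# With 10 samples, fold 0 holds out samples 0 and 5 for validation.
train_idx, valid_idx = five_fold_split(10, fold=0)
```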
-
Choose the model architecture
Notebook → section Model
model = smp.MAnet(
    encoder_name=config['model_encoder'].lower(),
    encoder_weights=config['model_pretrain'].lower(),
    in_channels=config['model_channels'],
    classes=len(CLASSES),
)
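The lowercasing matters because smp identifies encoders and pretrained weights by lowercase strings such as 'resnet34' and 'imagenet'. A small sketch of how the config values map onto the model arguments (the `model_kwargs` helper is illustrative, assuming the ResNet-34 encoder):

```python
config = {'model_encoder': 'ResNet34', 'model_pretrain': 'ImageNet', 'model_channels': 3}
CLASSES = ['nodata', 'water', 'ice']

def model_kwargs(config, classes):
    # smp expects lowercase identifiers, e.g. 'resnet34' / 'imagenet'.
    return dict(
        encoder_name=config['model_encoder'].lower(),
        encoder_weights=config['model_pretrain'].lower(),
        in_channels=config['model_channels'],
        classes=len(classes),
    )

kwargs = model_kwargs(config, CLASSES)
# Then: model = smp.MAnet(**kwargs)
```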
-
Save the model weights
Save version → Advanced settings → Always save output → Quick save → Save
-
Upload the weights of a trained model
File → Add or upload data → Notebook Output Files → Your work → Choose previous version
In the Paths section, set the notebook name to copy the weights into the working directory
NOTEBOOK_NAME = ''
In the Train loop section, set the last saved model
NAME_PRELOAD = ''
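Output files of a previous notebook version are mounted read-only under /kaggle/input/&lt;notebook-name&gt;/, so the two names above are enough to locate the checkpoint. A sketch with hypothetical example values (both names below are illustrative; in the notebook you fill them in by hand):

```python
import os

NOTEBOOK_NAME = 'segmentation-kaggle'  # hypothetical: name of the previous notebook version
NAME_PRELOAD = 'model_best.pth'        # hypothetical: file name of the last saved checkpoint

# Previous-version output is attached under /kaggle/input/<notebook-name>/.
checkpoint = os.path.join('/kaggle/input', NOTEBOOK_NAME, NAME_PRELOAD)

# A typical preload step (requires PyTorch, not run here):
# model.load_state_dict(torch.load(checkpoint))
```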