Pretrained Backbones with UNet
Overview
This is a simple package for semantic segmentation with UNet and pretrained backbones. It uses timm models as the pre-trained encoders.
When working with relatively small datasets, initializing a model with weights pre-trained on a large dataset is often an excellent way to ensure successful training. With a state-of-the-art encoder such as ConvNeXt, you can solve the problem at hand with little effort while still achieving strong performance.
The primary characteristics of this library are as follows:
- 430 pre-trained backbone networks are available for the UNet semantic segmentation model.
- Supports popular, state-of-the-art backbone families such as ConvNeXt, ResNet, EfficientNet, DenseNet, RegNet, and VGG.
- Which layers of the backbone are trainable can be adjusted parametrically.
- Includes a Dataset class for binary and multi-class semantic segmentation.
- Comes with a pre-built, fast custom training (Trainer) class.
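To illustrate the trainable-layers idea, here is how freezing part of an encoder works in plain PyTorch. This is a generic sketch of the underlying mechanism, not the package's own API; the package exposes this behaviour through a constructor parameter instead.

```python
import torch.nn as nn

# Generic PyTorch sketch: a tiny two-layer stand-in for a pretrained encoder.
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),   # early layer: keep frozen
    nn.Conv2d(16, 32, 3, padding=1),  # later layer: fine-tune
)

# Freeze the first layer so its pretrained weights are not updated.
for p in encoder[0].parameters():
    p.requires_grad = False

# Only the unfrozen parameters are passed to the optimizer.
trainable = [p for p in encoder.parameters() if p.requires_grad]
```

This is exactly why the usage example below builds the optimizer from `p for p in model.parameters() if p.requires_grad`: frozen layers are skipped automatically.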
Installation
PyPI version:
pip install pretrained-backbones-unet
Source code version:
pip install git+https://github.com/mberkay0/pretrained-backbones-unet
Usage
import torch
from torch.utils.data import DataLoader

from backbones_unet.model.unet import Unet
from backbones_unet.utils.dataset import SemanticSegmentationDataset
from backbones_unet.model.losses import DiceLoss
from backbones_unet.utils.trainer import Trainer
train_img_path = 'example_data/train/images'
train_mask_path = 'example_data/train/masks'
val_img_path = 'example_data/val/images'
val_mask_path = 'example_data/val/masks'
train_dataset = SemanticSegmentationDataset(train_img_path, train_mask_path)
val_dataset = SemanticSegmentationDataset(val_img_path, val_mask_path)
train_loader = DataLoader(train_dataset, batch_size=2)
val_loader = DataLoader(val_dataset, batch_size=2)
model = Unet(
backbone='convnext_base',
in_channels=3,
num_classes=1,
)
params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(params, lr=1e-4)
trainer = Trainer(
    model,                 # UNet model with a pretrained backbone
    criterion=DiceLoss(),  # loss function for model convergence
    optimizer=optimizer,   # optimizer for regularization
    epochs=10              # number of epochs for model training
)
trainer.fit(train_loader, val_loader)
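After training, inference is plain PyTorch: switch to eval mode, run a forward pass without gradients, and threshold the sigmoid of the logits for a binary mask. A minimal sketch, using a 1x1 convolution as a hypothetical stand-in for the trained Unet so the snippet is self-contained:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained Unet (in practice, use the model
# returned by the training step above).
model = nn.Conv2d(3, 1, kernel_size=1)
model.eval()

with torch.no_grad():
    image = torch.randn(1, 3, 256, 256)    # one RGB image, NCHW
    logits = model(image)                  # raw per-pixel scores, (1, 1, 256, 256)
    mask = torch.sigmoid(logits) > 0.5     # binary segmentation mask
```

For `num_classes > 1`, you would take an `argmax` over the class dimension instead of thresholding a sigmoid.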
Available Pretrained Backbones
import backbones_unet
print(backbones_unet.__available_models__)
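With 430 backbones, filtering the list by substring is handy. A sketch using a hypothetical subset of names in place of the real `backbones_unet.__available_models__` collection:

```python
# Hypothetical subset of backbone names, for illustration only; in practice
# iterate over backbones_unet.__available_models__ instead.
available_models = ["convnext_base", "convnext_tiny", "resnet50", "vgg16"]

# Keep only the ConvNeXt variants.
convnext_variants = [name for name in available_models if name.startswith("convnext")]
print(convnext_variants)  # ['convnext_base', 'convnext_tiny']
```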