We do not yet have full checkpoint-conversion validation. If you encounter pipeline loading failures or undesired output, please contact me at bili_sakura@zju.edu.cn.
# CUT-OpenEarthMap-SAR
CUT (Contrastive Unpaired Translation) models for SAR ↔ optical image translation. Trained on OpenEarthMap data with anti-aliased ResNet generators.
## Model variants
| Model | Direction | Epoch |
|---|---|---|
| `opt2sar` | Optical → SAR | 20 |
| `sar2opt` | SAR → Optical | 15 |
| `seman2opt` | Semantic → Optical | 25 |
| `seman2opt_pesudo` | Semantic (pseudo) → Optical | 195 |
| `seman2sar` | Semantic → SAR | 25 |
| `seman2sar_pesudo` | Semantic (pseudo) → SAR | 200 |
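Because a typo in the variant name (note the checkpoints' `pesudo` spelling) would otherwise surface as a confusing loading failure, it can help to validate the name up front. A minimal sketch in plain Python; the `MODEL_DIRECTIONS` mapping is taken from the table above, and `select_model` is an illustrative helper, not part of the pipeline API:

```python
# Checkpoint directory names and their translation directions,
# copied from the model-variants table above.
MODEL_DIRECTIONS = {
    "opt2sar": "Optical -> SAR",
    "sar2opt": "SAR -> Optical",
    "seman2opt": "Semantic -> Optical",
    "seman2opt_pesudo": "Semantic (pseudo) -> Optical",
    "seman2sar": "Semantic -> SAR",
    "seman2sar_pesudo": "Semantic (pseudo) -> SAR",
}


def select_model(name: str) -> str:
    """Return the direction for a variant, raising early on a typo."""
    if name not in MODEL_DIRECTIONS:
        raise ValueError(
            f"Unknown model {name!r}; choose one of {sorted(MODEL_DIRECTIONS)}"
        )
    return MODEL_DIRECTIONS[name]
```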
## Usage

Use with `pytorch-image-translation-models` and the `openearthmap_sar` community pipeline:
```python
from PIL import Image

from examples.community.openearthmap_sar import load_openearthmap_sar_pipeline

pipeline = load_openearthmap_sar_pipeline(
    checkpoint_dir="/path/to/CUT-OpenEarthMap-SAR",
    model_name="sar2opt",  # choose one: opt2sar, sar2opt, seman2opt, seman2opt_pesudo, seman2sar, seman2sar_pesudo
    device="cuda",
)

source = Image.open("/path/to/sar.png").convert("RGB")
output = pipeline(source_image=source, output_type="pil")
output.images[0].save("cut_sar2opt.png")
```
CLI:

```bash
python -m examples.community.openearthmap_sar \
  --checkpoint-dir BiliSakura/CUT-OpenEarthMap-SAR \
  --model sar2opt \
  --input sar.png \
  --output out.png
```
Pass `source_image` as a `PIL.Image`. The generator uses anti-aliased down/upsampling to match the original CUT training.
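The anti-aliasing mentioned above follows the blur-then-subsample idea (as in Zhang's "Making Convolutional Networks Shift-Invariant Again", which the original CUT generator adopts): a low-pass filter is applied before striding so high-frequency content does not alias. A minimal 1-D NumPy sketch for illustration only, not the pipeline's actual implementation; note how an alternating (Nyquist-frequency) signal is smoothed to a constant instead of aliasing:

```python
import numpy as np


def blurpool_1d(x: np.ndarray) -> np.ndarray:
    """Anti-aliased stride-2 downsampling: binomial blur, then subsample."""
    kernel = np.array([1.0, 2.0, 1.0]) / 4.0  # normalized binomial low-pass filter
    padded = np.pad(x, 1, mode="reflect")     # reflect-pad so length is preserved
    blurred = np.convolve(padded, kernel, mode="valid")
    return blurred[::2]                       # stride-2 subsampling after the blur


signal = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])
out = blurpool_1d(signal)  # the alternating signal collapses to a constant 0.5
```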
## Repository layout
```
{model_name}/
  generator/
    config.json
    diffusion_pytorch_model.safetensors
```
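Given that layout, the generator files for a variant can be located with a small standard-library helper; `generator_files` is an illustrative name, not part of the pipeline:

```python
from pathlib import Path


def generator_files(checkpoint_dir: str, model_name: str) -> dict:
    """Resolve config and weight paths for one variant, following
    the {model_name}/generator/ layout described above."""
    generator_dir = Path(checkpoint_dir) / model_name / "generator"
    return {
        "config": generator_dir / "config.json",
        "weights": generator_dir / "diffusion_pytorch_model.safetensors",
    }


paths = generator_files("/path/to/CUT-OpenEarthMap-SAR", "sar2opt")
```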
## Architecture
| Parameter | Value |
|---|---|
| `in_channels` | 3 |
| `out_channels` | 3 |
| `base_filters` | 64 |
| `n_blocks` | 9 |
| `norm` | InstanceNorm |
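These values correspond to the generator's `config.json`. A hypothetical example of what such a file might contain, assuming the field names match the table (they have not been verified against the actual checkpoints):

```json
{
  "in_channels": 3,
  "out_channels": 3,
  "base_filters": 64,
  "n_blocks": 9,
  "norm": "InstanceNorm"
}
```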
## Citation
CUT (architecture):
```bibtex
@inproceedings{park2020cut,
  title={Contrastive Learning for Unpaired Image-to-Image Translation},
  author={Park, Taesung and Efros, Alexei A and Zhang, Richard and Zhu, Jun-Yan},
  booktitle={ECCV},
  year={2020}
}
```
OpenEarthMap-SAR (dataset & baseline):
```bibtex
@ARTICLE{11303033,
  author={Xia, Junshi and Chen, Hongruixuan and Broni-Bediako, Clifford and Wei, Yimin and Song, Jian and Yokoya, Naoto},
  journal={IEEE Geoscience and Remote Sensing Magazine},
  title={OpenEarthMap-SAR: A benchmark synthetic aperture radar dataset for global high-resolution land cover mapping [Software and Data Sets]},
  year={2025},
  volume={13},
  number={4},
  pages={476-487},
  keywords={Translation;Semantic segmentation;Source coding;Urban planning;Land surface;Geoscience and remote sensing;Benchmark testing;Software;Sustainable development;Synthetic aperture radar},
  doi={10.1109/MGRS.2025.3599512}
}
```
## Credits
Models trained on the OpenEarthMap-SAR benchmark dataset. Thanks to the authors for the dataset and CUT baseline.