# Segment Anything 2 (SAM 2) ONNX Models

ONNX-exported versions of Meta's Segment Anything Model 2 (SAM 2), ready for CPU/GPU inference with ONNX Runtime. No PyTorch is required at runtime.

These models are used by AnyLabeling for AI-assisted image annotation and were exported with samexporter.

Looking for SAM 2.1? See vietanhdev/segment-anything-2.1-onnx-models, an improved version with better accuracy.
## Available Models

| File | Variant | Notes |
|---|---|---|
| `sam2_hiera_tiny.zip` | SAM 2 Hiera-Tiny | Smallest, fastest |
| `sam2_hiera_small.zip` | SAM 2 Hiera-Small | Good balance |
| `sam2_hiera_base_plus.zip` | SAM 2 Hiera-Base+ | Higher accuracy |
| `sam2_hiera_large.zip` | SAM 2 Hiera-Large | Most accurate |
Each zip contains two ONNX files: an encoder (runs once per image) and a decoder (runs interactively for each prompt).
## Prompt Types

- Point (`+point`/`-point`): click to include or exclude regions
- Rectangle: draw a bounding box around the target object
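These prompts can be written to a small JSON file for the command-line inference shown below. A hypothetical example (the `type`/`data`/`label` field names follow samexporter's examples but should be verified against its documentation):

```python
import json

# Hypothetical prompt file: one positive point, one negative point, and a
# rectangle. Field names follow samexporter's examples; verify against its docs.
prompts = [
    {"type": "point", "data": [512, 360], "label": 1},    # +point: include
    {"type": "point", "data": [80, 60], "label": 0},      # -point: exclude
    {"type": "rectangle", "data": [100, 100, 700, 500]},  # box: x1, y1, x2, y2
]

with open("prompt.json", "w") as f:
    json.dump(prompts, f, indent=2)
```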
## Use with AnyLabeling (Recommended)

AnyLabeling is a desktop annotation tool with a built-in model manager that downloads, caches, and runs these models automatically. No coding is required.

- Install: `pip install anylabeling`
- Launch: `anylabeling`
- Click the Brain button and select a Segment Anything 2 model from the dropdown
- Use point or rectangle prompts to segment objects
## Use Programmatically with ONNX Runtime

Download and extract a model:

```python
import urllib.request
import zipfile

url = "https://huggingface.co/vietanhdev/segment-anything-2-onnx-models/resolve/main/sam2_hiera_tiny.zip"
urllib.request.urlretrieve(url, "sam2_hiera_tiny.zip")
with zipfile.ZipFile("sam2_hiera_tiny.zip") as z:
    z.extractall("sam2_hiera_tiny")
```
Then use samexporter's inference module:

```bash
pip install samexporter
python -m samexporter.inference \
    --encoder_model sam2_hiera_tiny/sam2_hiera_tiny.encoder.onnx \
    --decoder_model sam2_hiera_tiny/sam2_hiera_tiny.decoder.onnx \
    --image photo.jpg \
    --prompt prompt.json \
    --output result.png \
    --sam_variant sam2
```
## Re-export from Source

To re-export or customize the models using samexporter:

```bash
pip install samexporter
pip install git+https://github.com/facebookresearch/segment-anything-2.git

# Download SAM 2 checkpoints
cd original_models && bash download_sam2.sh && cd ..

# Export Tiny variant
python -m samexporter.export_sam2 \
    --checkpoint original_models/sam2_hiera_tiny.pt \
    --output_encoder output_models/sam2_hiera_tiny.encoder.onnx \
    --output_decoder output_models/sam2_hiera_tiny.decoder.onnx \
    --model_type sam2_hiera_tiny

# Or convert all SAM 2 variants at once:
bash convert_all_meta_sam2.sh
```
## Related Repositories
| Repo | Description |
|---|---|
| vietanhdev/samexporter | Export scripts, inference code, conversion tools |
| vietanhdev/anylabeling | Desktop annotation app powered by these models |
| vietanhdev/segment-anything-2.1-onnx-models | Improved SAM 2.1 ONNX models |
| facebookresearch/segment-anything-2 | Original SAM 2 by Meta |
## License
The ONNX models are derived from Meta's SAM 2, released under the Apache 2.0 license. The export code is part of samexporter, released under the MIT license.
