# GrapHist: Graph Self-Supervised Learning for Histopathology
Paper: [arXiv:2603.00143](https://arxiv.org/abs/2603.00143)
This repository contains the pre-trained model from the paper GrapHist: Graph Self-Supervised Learning for Histopathology.
Pre-trained on the graph-tcga-brca dataset, it employs an ACM-GIN (Adaptive Channel Mixing Graph Isomorphism Network) encoder-decoder architecture with a masked node attribute prediction objective.
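The reconstruction loss behind this masked-prediction objective is a scaled cosine error (`loss_fn = "sce"`, `alpha_l = 3` in the configuration below). A minimal sketch of that loss, assuming the GraphMAE-style formulation `mean((1 - cos(x, y)) ** alpha)`:

```python
import torch
import torch.nn.functional as F

def sce_loss(x, y, alpha=3):
    """Scaled cosine error between original and reconstructed
    node features: mean((1 - cos(x, y)) ** alpha)."""
    x = F.normalize(x, p=2, dim=-1)
    y = F.normalize(y, p=2, dim=-1)
    return ((1 - (x * y).sum(dim=-1)) ** alpha).mean()

# Identical features give zero loss; orthogonal features give 1.0.
a = torch.tensor([[1.0, 0.0], [0.0, 1.0]])
print(float(sce_loss(a, a)))  # 0.0
```

Raising the cosine error to `alpha > 1` down-weights easy (already well-reconstructed) nodes, focusing the gradient on hard ones.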
```
graphist/
├── graphist.pt        # Pre-trained model checkpoint
├── graphist.png       # Architecture overview
├── models/
│   ├── __init__.py    # build_model(args) factory
│   ├── edcoder.py     # PreModel encoder-decoder wrapper
│   ├── acm_gin.py     # ACM-GIN backbone (encoder/decoder)
│   └── utils.py       # Activation and normalization helpers
└── README.md
```
```shell
pip install torch torch-geometric huggingface_hub
```
The model expects graphs in PyTorch Geometric format with `x`, `edge_index`, `edge_attr`, and `batch`.
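For a quick shape check, a stand-in batch with the expected dimensions (46 node features, 1 edge feature) can be built from plain tensors; the field names mirror a PyTorch Geometric `Batch`, but this is only a random shape sketch, not a real histopathology graph:

```python
import torch
from types import SimpleNamespace

num_nodes, num_edges = 8, 12
batch = SimpleNamespace(
    x=torch.randn(num_nodes, 46),                            # node features (num_features = 46)
    edge_index=torch.randint(0, num_nodes, (2, num_edges)),  # COO connectivity
    edge_attr=torch.randn(num_edges, 1),                     # edge features (num_edge_features = 1)
    batch=torch.zeros(num_nodes, dtype=torch.long),          # all nodes assigned to graph 0
)
print(batch.x.shape, batch.edge_attr.shape)
```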
```python
from huggingface_hub import snapshot_download

repo_path = snapshot_download(repo_id="ogutsevda/graphist")
```
```python
import sys, torch

sys.path.insert(0, repo_path)
from models import build_model

class Args:
    encoder = "acm_gin"
    decoder = "acm_gin"
    drop_edge_rate = 0.0
    mask_rate = 0.5
    replace_rate = 0.1
    num_hidden = 512
    num_layers = 5
    num_heads = 4
    num_out_heads = 1
    residual = None
    attn_drop = 0.1
    in_drop = 0.2
    norm = None
    negative_slope = 0.2
    batchnorm = False
    activation = "prelu"
    loss_fn = "sce"
    alpha_l = 3
    concat_hidden = True
    num_features = 46
    num_edge_features = 1

args = Args()
model = build_model(args)
checkpoint = torch.load(f"{repo_path}/graphist.pt", weights_only=False)
model.load_state_dict(checkpoint["model_state_dict"])
model.eval()
```
```python
with torch.no_grad():
    embeddings = model.embed(
        batch.x, batch.edge_index, batch.edge_attr, batch.batch
    )
```
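A common downstream use of self-supervised embeddings is a linear probe on the frozen representations. A minimal sketch, assuming `embeddings` is an `(N, d)` float tensor (random tensors stand in for real outputs and labels here):

```python
import torch

d, num_classes = 512, 2
embeddings = torch.randn(16, d)                   # stand-in for frozen model embeddings
labels = torch.randint(0, num_classes, (16,))     # stand-in for downstream labels

probe = torch.nn.Linear(d, num_classes)           # linear probe on frozen features
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
for _ in range(5):
    opt.zero_grad()
    loss = torch.nn.functional.cross_entropy(probe(embeddings), labels)
    loss.backward()
    opt.step()
```

Only the probe is trained; the encoder stays in `eval()` mode with gradients disabled, which is the standard way to evaluate representation quality.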
The model architecture adapts code from GraphMAE and ACM-GNN.
```bibtex
@misc{ogut2026graphist,
      title={GrapHist: Graph Self-Supervised Learning for Histopathology},
      author={Sevda Öğüt and Cédric Vincent-Cuaz and Natalia Dubljevic and Carlos Hurtado and Vaishnavi Subramanian and Pascal Frossard and Dorina Thanou},
      year={2026},
      eprint={2603.00143},
      url={https://arxiv.org/abs/2603.00143},
}
```