GrapHist: Graph Self-Supervised Learning for Histopathology

This repository contains the pre-trained model from the paper GrapHist: Graph Self-Supervised Learning for Histopathology.

The model is pre-trained on the graph-tcga-brca dataset and uses an ACM-GIN (Adaptive Channel Mixing Graph Isomorphism Network) encoder-decoder architecture with a masked node attribute prediction objective.

Figure: GrapHist architecture overview (graphist.png).

Repository Structure

graphist/
β”œβ”€β”€ graphist.pt            # Pre-trained model checkpoint
β”œβ”€β”€ graphist.png           # Architecture overview
β”œβ”€β”€ models/
β”‚   β”œβ”€β”€ __init__.py        # build_model(args) factory
β”‚   β”œβ”€β”€ edcoder.py         # PreModel encoder-decoder wrapper
β”‚   β”œβ”€β”€ acm_gin.py         # ACM-GIN backbone (encoder/decoder)
β”‚   └── utils.py           # Activation and normalization helpers
└── README.md

Requirements

pip install torch torch-geometric huggingface_hub

The model expects graphs in PyTorch Geometric format with x, edge_index, edge_attr, and batch.
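As a minimal sketch of that input format, the snippet below builds the four tensors that a torch_geometric Data/Batch object exposes as x, edge_index, edge_attr, and batch. The graph itself is random toy data; the feature dimensions (46 node features, 1 edge feature) are taken from the Args class in the usage section below.

```python
import torch

# Toy inputs in the shape the model expects (a PyG Batch exposes exactly
# these tensors as .x, .edge_index, .edge_attr, .batch).
num_nodes = 4
x = torch.randn(num_nodes, 46)                    # node feature matrix
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],    # source nodes
                           [1, 0, 2, 1, 3, 2]])   # target nodes
edge_attr = torch.randn(edge_index.size(1), 1)    # one attribute per edge
batch = torch.zeros(num_nodes, dtype=torch.long)  # all nodes belong to graph 0

print(x.shape, edge_index.shape, edge_attr.shape, batch.shape)
```

In practice these tensors come from your own graph construction pipeline and are wrapped in torch_geometric.data.Data objects, then collated with Batch.from_data_list.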

Usage

1. Download the repository

from huggingface_hub import snapshot_download

repo_path = snapshot_download(repo_id="ogutsevda/graphist")

2. Build and load the model

import sys, torch
sys.path.insert(0, repo_path)

from models import build_model

class Args:
    encoder = "acm_gin"
    decoder = "acm_gin"
    drop_edge_rate = 0.0
    mask_rate = 0.5
    replace_rate = 0.1
    num_hidden = 512
    num_layers = 5
    num_heads = 4
    num_out_heads = 1
    residual = None
    attn_drop = 0.1
    in_drop = 0.2
    norm = None
    negative_slope = 0.2
    batchnorm = False
    activation = "prelu"
    loss_fn = "sce"
    alpha_l = 3
    concat_hidden = True
    num_features = 46
    num_edge_features = 1

args = Args()
model = build_model(args)
checkpoint = torch.load(f"{repo_path}/graphist.pt", weights_only=False)
model.load_state_dict(checkpoint["model_state_dict"])
model.eval()

3. Generate embeddings

with torch.no_grad():
    # `batch` is a torch_geometric.data.Batch of input graphs
    embeddings = model.embed(
        batch.x, batch.edge_index, batch.edge_attr, batch.batch
    )
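If embed returns node-level embeddings (an assumption here; check the PreModel implementation in models/edcoder.py), one common way to obtain a single vector per graph is to mean-pool over the batch index vector. A plain-torch sketch with toy stand-in tensors:

```python
import torch

# Mean-pool node embeddings into one vector per graph using the batch
# index vector. `node_emb` and `batch` are toy stand-ins here.
node_emb = torch.randn(5, 8)           # 5 nodes, 8-dim embeddings
batch = torch.tensor([0, 0, 0, 1, 1])  # nodes 0-2 -> graph 0, nodes 3-4 -> graph 1

num_graphs = int(batch.max()) + 1
sums = torch.zeros(num_graphs, node_emb.size(1)).index_add_(0, batch, node_emb)
counts = torch.bincount(batch, minlength=num_graphs).unsqueeze(1).float()
graph_emb = sums / counts              # shape: (num_graphs, 8)

print(graph_emb.shape)
```

The same pooling is available in torch_geometric as global_mean_pool(node_emb, batch).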

Acknowledgements

The model architecture adapts code from GraphMAE and ACM-GNN.

Citation

@misc{ogut2026graphist,
    title={GrapHist: Graph Self-Supervised Learning for Histopathology}, 
    author={Sevda Γ–ΔŸΓΌt and CΓ©dric Vincent-Cuaz and Natalia Dubljevic and Carlos Hurtado and Vaishnavi Subramanian and Pascal Frossard and Dorina Thanou},
    year={2026},
    eprint={2603.00143},
    url={https://arxiv.org/abs/2603.00143}, 
}