# Brain-Harmony Pretrained Weights (SafeTensors)
Pretrained weights for the Brain-Harmony multimodal brain foundation model, converted to SafeTensors format for use with the brainharmony-rs Rust inference crate.
## Models

| File | Model | Description | Params | Size |
|---|---|---|---|---|
| `harmonizer.safetensors` | OneTokRegViT | Stage 1 pretrained encoder-decoder (fMRI + T1) | 90M | 466 MB |
| `harmonix-f.safetensors` | FlexVisionTransformer | fMRI encoder + JEPA predictor | ~150M | 723 MB |
| `harmonix-s.safetensors` | OneTokRegViT | T1 structural encoder-decoder | 85M | 448 MB |
## Position Embedding Files

| File | Description | Shape |
|---|---|---|
| `gradient_mapping_400.csv` | Brain gradient coordinates (30 axes) | 400 × 30 |
| `schaefer400_roi_eigenmodes.csv` | Geometric harmonics (Schaefer 400 parcellation) | 400 × 200 |
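The two CSVs supply per-ROI positional features of different widths (30 and 200 columns). A minimal sketch of how such features could be combined into transformer-width positional embeddings: the concatenate-then-project scheme, the random stand-in matrices, and the projection weights below are illustrative assumptions, not the crate's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the CSV contents (shapes from the table above):
gradients = rng.standard_normal((400, 30))    # gradient_mapping_400.csv
eigenmodes = rng.standard_normal((400, 200))  # schaefer400_roi_eigenmodes.csv

# One plausible scheme: concatenate per-ROI features, then project
# to the model width (768) with a learned linear map (random here).
features = np.concatenate([gradients, eigenmodes], axis=1)  # (400, 230)
proj = rng.standard_normal((230, 768)) / np.sqrt(230)       # stand-in for learned weights
pos_embed = features @ proj                                 # (400, 768), one row per ROI
print(pos_embed.shape)
```

Each of the 400 ROIs ends up with a 768-dim positional vector, matching the encoder's embedding dimension.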
## Architecture
Brain-Harmony is a ViT-Base encoder (12 layers, 768-dim, 12 heads) that processes parcellated brain signals through:
- FlexiPatchEmbed: Conv2d with dynamic patch size (default 48)
- Brain gradient + geometric harmonics positional embeddings: combines spatial gradient mapping with cortical eigenmode projections
- JEPA framework: self-supervised pretraining with masked prediction
- Input: `[B, 1, 400, 864]` (400 cortical ROIs × 18 patches × 48 timepoints)
- Output: `[B, 7200, 768]` latent embeddings
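The input and output shapes follow directly from the patching arithmetic, assuming the default patch size of 48:

```python
# Shape arithmetic for the default configuration described above.
n_rois = 400             # Schaefer 400 parcellation
patch_size = 48          # FlexiPatchEmbed default
n_patches_per_roi = 18

# Time axis of the input: 18 patches of 48 timepoints each.
timepoints = n_patches_per_roi * patch_size   # -> 864, giving input [B, 1, 400, 864]

# One token per (ROI, patch) pair after patch embedding.
n_tokens = n_rois * n_patches_per_roi         # -> 7200, giving output [B, 7200, 768]

print(timepoints, n_tokens)  # 864 7200
```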
## Usage with Rust (brainharmony-rs)

```rust
use brainharmony::{BrainHarmonyEncoder, DataConfig, ModelConfig};
use burn::backend::NdArray;

type B = NdArray;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let device = burn::backend::ndarray::NdArrayDevice::Cpu;

    let (encoder, _ms) = BrainHarmonyEncoder::<B>::from_weights(
        "harmonizer.safetensors",
        "gradient_mapping_400.csv",
        "schaefer400_roi_eigenmodes.csv",
        &ModelConfig::default(),
        &DataConfig::default(),
        &device,
    )?;

    let result = encoder.encode_safetensors("input_signal.safetensors")?;
    result.save_safetensors("embeddings.safetensors")?;
    Ok(())
}
```
## Usage with Python (Brain-Harmony)

```python
from safetensors.torch import load_file

weights = load_file("harmonizer.safetensors")
# Load into your Brain-Harmony model
model.load_state_dict(weights)
```
## Conversion

These weights were converted from PyTorch `.pth` checkpoints using:

```bash
python scripts/convert_weights.py \
    --input checkpoints/harmonizer/model.pth \
    --output data/harmonizer.safetensors
```
## License
MIT
## Citation

```bibtex
@software{brainharmony_rs,
  title  = {brainharmony-rs: Brain-Harmony inference in Rust},
  author = {Eugene Hauptmann},
  url    = {https://github.com/eugenehp/brainharmony-rs},
  year   = {2025}
}
```