# aurekai/model-memory

Public model memory archive for the Aurekai platform. Stores compiled model representations, SAE dictionaries, and semantic embeddings for zero-shot operational orchestration.
## Overview
Model memory serves as the central knowledge base for Aurekai's runtime, enabling semantic querying and model-based decision making. This repository hosts:
- **Compiled Model Binaries**: Pre-compiled Aurekai model representations (`.akmodel`, `.bfmodel`)
- **SAE Dictionaries**: Sparse autoencoder dictionaries for model interpretability (`.aksae`, `.bfsae`)
- **Semantic Embeddings**: Cached embeddings for fast semantic search across operators
- **Manifest Metadata**: Aurekai and legacy Bonfyre format manifests
## Quick Start
```bash
# Download latest model memory archive
curl -L https://huggingface.co/aurekai/model-memory/resolve/main/aurekai-model-memory-qwen3-8b-20260502.tar.gz -o model-memory.tar.gz

# Extract
tar -xzf model-memory.tar.gz

# Use with Aurekai runtime
export AUREKAI_MODEL_MEMORY=$(pwd)/model-memory
akai run <recipe> --model-cache --semantic-search
```
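Each release also ships a `SHA256SUMS` file (referenced under Available Models below). A minimal sketch of the verification step, shown on a local stand-in file because the exact download path for `SHA256SUMS` is not stated in this card:

```shell
# Stand-in archive so the flow is runnable anywhere; in practice this is
# the .tar.gz downloaded above, checked against the SHA256SUMS shipped
# with the release rather than one generated locally.
echo "demo contents" > model-memory.tar.gz
sha256sum model-memory.tar.gz > SHA256SUMS   # the release publishes this file
sha256sum -c SHA256SUMS                      # prints "model-memory.tar.gz: OK"
```

A non-zero exit status from `sha256sum -c` means the archive is corrupt or incomplete and should be re-downloaded.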
## Format Specifications
### Aurekai-First Formats (.ak*)
**`.akmodel`**: Aurekai-native compiled model format
- Used by the Aurekai runtime for direct inference
- Optimized for semantic routing and operator selection

**`.aksae`**: Aurekai-native SAE dictionary format
- Sparse autoencoder coefficients in Aurekai serialization
- Default SAE for model interpretability

**`.akfpqx`**: Aurekai-native FPQx alignment format
- Feature-to-proxy quantization alignments
- Model-to-model alignment data
### Legacy Bonfyre Formats (.bf*)
For backward compatibility, this repository includes legacy Bonfyre format equivalents:
- `.bfmodel`: Bonfyre binary model representation
- `.bfsae`: Bonfyre SAE dictionary format
- `.bffpqx`: Bonfyre FPQx alignment format
## Available Models
### Qwen3 8B (qwen3-8b)
- Release: 2026-05-02
- Archive: `aurekai-model-memory-qwen3-8b-20260502.tar.gz`
- Size: see `SHA256SUMS`
- Formats:
  - `qwen3-8b.akmodel` + `qwen3-8b.bfmodel`
  - `default.aksae` + `default.bfsae`
  - `qwen3-to-llama3.akfpqx` + `qwen3-to-llama3.bffpqx`
## Integration with Aurekai
### Environment Variables
```bash
export AUREKAI_MODEL_MEMORY=/path/to/model-memory
export AUREKAI_SAE_DEFAULT=model-memory/default.aksae
export AUREKAI_EMBEDDINGS_CACHE=/tmp/aurekai-embeddings
```
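A wrapper script can resolve these variables with shell parameter-expansion defaults. The `AUREKAI_*` names come from this card; the fallback paths below are illustrative assumptions, not documented Aurekai behavior:

```shell
# Resolve Aurekai paths, falling back to assumed defaults when unset.
MODEL_MEMORY="${AUREKAI_MODEL_MEMORY:-$HOME/.cache/aurekai/model-memory}"
SAE_DEFAULT="${AUREKAI_SAE_DEFAULT:-$MODEL_MEMORY/default.aksae}"
EMBED_CACHE="${AUREKAI_EMBEDDINGS_CACHE:-/tmp/aurekai-embeddings}"

echo "model memory:    $MODEL_MEMORY"
echo "default SAE:     $SAE_DEFAULT"
echo "embedding cache: $EMBED_CACHE"
```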
### In Aurekai Config
```json
{
  "model_memory": {
    "path": "./model-memory",
    "formats": ["akmodel", "bfmodel"],
    "sae_dicts": ["default.aksae", "default.bfsae"],
    "fpqx_alignments": ["qwen3-to-llama3.akfpqx"]
  }
}
```
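Before handing the config to the runtime, it can be sanity-checked with any JSON parser. A sketch using Python's stdlib `json` module (`jq` works equally well); the file name `aurekai-config.json` is assumed, as this card does not state where the config lives:

```shell
# Write the config shown above and confirm it parses with the expected keys.
cat > aurekai-config.json <<'EOF'
{
  "model_memory": {
    "path": "./model-memory",
    "formats": ["akmodel", "bfmodel"],
    "sae_dicts": ["default.aksae", "default.bfsae"],
    "fpqx_alignments": ["qwen3-to-llama3.akfpqx"]
  }
}
EOF

python3 -c 'import json; cfg = json.load(open("aurekai-config.json")); print(",".join(cfg["model_memory"]["formats"]))'
```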
## Manifests
- `aurekai.manifest.json`: Aurekai public manifest with SAE and FPQx inventory
- `bonfyre.manifest.json`: Legacy Bonfyre manifest for backward compatibility
Both manifests are included in each release and describe:
- Available models and their paths
- SAE dictionary mappings (Aurekai → legacy)
- FPQx alignment pairs for cross-model translation
- Operator compatibility and runtime requirements
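The manifest schema itself is not reproduced in this card. As a purely hypothetical sketch (the `models`, `name`, and `path` fields below are invented for illustration; the real schema ships with each release), querying a manifest follows the usual JSON pattern:

```shell
# Hypothetical manifest shape, for illustration only.
cat > aurekai.manifest.json <<'EOF'
{"models": [{"name": "qwen3-8b", "path": "model-memory/qwen3-8b.akmodel"}]}
EOF

# List the model names the manifest advertises.
python3 -c 'import json; m = json.load(open("aurekai.manifest.json")); print(" ".join(e["name"] for e in m["models"]))'
```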
## Performance Notes
- Model memory archives are compressed with both gzip and zstd
- Use `.tar.zst` for faster decompression on supported systems
- Extract to an SSD for optimal semantic search performance
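A download script can prefer the zstd variant when the tool is present. The `.tar.zst` file name below mirrors the gzip naming above and is an assumption, since only the `.tar.gz` archive is named explicitly in this card:

```shell
# Prefer the zstd archive when zstd is installed, else fall back to gzip.
if command -v zstd >/dev/null 2>&1; then
  archive="aurekai-model-memory-qwen3-8b-20260502.tar.zst"
else
  archive="aurekai-model-memory-qwen3-8b-20260502.tar.gz"
fi
echo "selected: $archive"

# Recent GNU tar auto-detects either compression: tar -xf "$archive"
```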
## License
Licensed under the Aurekai Open Source License. See LICENSE in the main Aurekai repository.
## Related
- Main Aurekai Repo: https://github.com/aurekai/aurekai
- SAE Dictionaries: https://huggingface.co/aurekai/sae-dictionaries
- FPQx Alignments: https://huggingface.co/aurekai/fpqx-alignments
- Semantic Cache Benchmarks: https://huggingface.co/aurekai/semantic-cache-bench
## Support
For issues or questions:
- GitHub Discussions: https://github.com/aurekai/aurekai/discussions
- GitHub Issues: https://github.com/aurekai/aurekai/issues