---
license: other
license_name: bsl-1.1
license_link: https://github.com/tenfingerseddy/resonance-lattice/blob/main/LICENSE.md
tags:
- resonance-lattice
- rlat
- knowledge-model
- retrieval
language: en
---
# python-stdlib — rlat knowledge model (v2.0)
A [Resonance Lattice](https://github.com/tenfingerseddy/resonance-lattice)
knowledge model of [`python/cpython`](https://github.com/python/cpython) at commit
[`d2f506ae`](https://github.com/python/cpython/commit/d2f506ae07e0bc097039634a28cf85b5d804ef72), scope `Doc`.
## Quick start
```bash
pip install rlat
huggingface-cli download tenfingers/python-stdlib-rlat python-stdlib.rlat --local-dir .
rlat search python-stdlib.rlat "your question" --top-k 5
```
The model uses **remote storage mode** — passages reference source files at
`raw.githubusercontent.com`, pinned to the commit SHA above. The first query
fetches each cited source once and caches it locally; subsequent queries that
hit the same passages return warm in under 20 ms.
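The caching behaviour can be pictured with a minimal sketch. Note that `fetch_verified`, `expected_sha`, and `fetch_fn` are hypothetical names for illustration, not the rlat API: remote mode keeps a (URL, hash) manifest, fetches each file once, verifies it against the pinned hash, and serves it from a local cache afterwards.

```python
import hashlib

# Illustrative sketch only (hypothetical names, not the rlat API):
# bytes are fetched once per URL, verified against the pinned hash,
# and served from a local cache on every later query.
_cache: dict[str, bytes] = {}

def fetch_verified(url: str, expected_sha: str, fetch_fn) -> bytes:
    if url in _cache:                        # warm path: no network hit
        return _cache[url]
    data = fetch_fn(url)                     # cold path: one fetch per file
    if hashlib.sha256(data).hexdigest() != expected_sha:
        raise ValueError(f"content at {url} does not match pinned SHA")
    _cache[url] = data
    return data
```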
## Build details
| Field | Value |
|---|---|
| Encoder | `Alibaba-NLP/gte-modernbert-base` 768d, CLS-pooled, L2-normalised |
| Encoder revision | `e7f32e3c00f91d699e8c43b53106206bcc72bb22` (pinned) |
| Format | rlat knowledge-model v4 (ZIP + JSON + NPZ) |
| Storage mode | `remote` (source pinned at SHA, fetched on demand, SHA-verified) |
| Source repo | [`python/cpython`](https://github.com/python/cpython) |
| Source scope | `Doc` |
| Source commit | `d2f506ae07e0bc097039634a28cf85b5d804ef72` |
| Source branch | `main` (commit SHA-pinned; reproducible regardless of branch movement) |
| Files indexed | 617 |
| Passages | 49,179 |
| Build date | 2026-04-28 |
| Built on | Kaggle T4 (GPU encoding, batch_size=64, runtime=torch) |
| File size | 292.5 MB |
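The retrieval recipe implied by the table (dense cosine over L2-normalised vectors) can be sketched with NumPy. This is a generic illustration under that assumption, not the rlat internals: with unit-norm rows, cosine similarity reduces to a dot product, so top-k search is a matrix-vector product plus a partial sort.

```python
import numpy as np

# Sketch of dense-cosine retrieval over L2-normalised embeddings:
# cosine == dot product once rows are unit-norm, so top-k is a
# matrix-vector product followed by argpartition.
rng = np.random.default_rng(0)
passages = rng.normal(size=(1000, 768)).astype(np.float32)
passages /= np.linalg.norm(passages, axis=1, keepdims=True)

def top_k(query: np.ndarray, k: int = 5) -> np.ndarray:
    q = query / np.linalg.norm(query)
    scores = passages @ q                    # cosine similarity per passage
    idx = np.argpartition(-scores, k)[:k]    # unordered top-k candidates
    return idx[np.argsort(-scores[idx])]     # k indices, best first
```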
## Usage
### Single-hop search
```bash
rlat search python-stdlib.rlat "what does X do?" --top-k 5
```
### Skill-context (Anthropic skill `!command` block)
```markdown
!`rlat skill-context python-stdlib.rlat --query "$user_query" --top-k 5`
```
The output is markdown with citation anchors, drift status, and
ConfidenceMetrics — ready for an LLM to ground on.
### Multi-hop deep-search
```bash
rlat deep-search python-stdlib.rlat "harder cross-file question" --max-hops 3
```
Requires an Anthropic API key. See the
[deep-search docs](https://github.com/tenfingerseddy/resonance-lattice/blob/main/docs/user/CLI.md#rlat-deep-search).
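One way to picture multi-hop retrieval is the loop below. This is a generic sketch, not rlat's actual deep-search implementation: each hop searches, collected evidence is handed to a model that proposes a follow-up query (in rlat's case via the Anthropic API), and the loop stops at `--max-hops` or when no follow-up is needed.

```python
# Generic multi-hop retrieval loop (a sketch, not rlat's deep-search):
# each hop searches, then asks a model for a follow-up query; an empty
# follow-up ends the loop before max_hops is reached.
def deep_search(query, search_fn, followup_fn, max_hops=3):
    evidence, q = [], query
    for _ in range(max_hops):
        evidence.extend(search_fn(q))      # retrieve for the current hop
        q = followup_fn(q, evidence)       # e.g. an LLM call in practice
        if not q:
            break
    return evidence
```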
## Refreshing against upstream
This model pins to the source commit `d2f506ae`. To re-index against the
current upstream tip:
```bash
# Option A: rebuild on Kaggle's free T4 (recommended for big corpora)
# See the rlat-build-on-kaggle skill at:
# https://github.com/tenfingerseddy/resonance-lattice/tree/main/.claude/skills/rlat-build-on-kaggle
# Option B: rebuild locally
pip install rlat[build,ann]
rlat install-encoder
git clone --depth 1 -b main https://github.com/python/cpython.git src/
git -C src rev-parse HEAD   # use this value as <NEW_SHA> below
rlat build src/Doc -o python-stdlib.rlat \
--store-mode remote \
--remote-url-base https://raw.githubusercontent.com/python/cpython/<NEW_SHA>/Doc/ \
--runtime torch
```
## Honest limits
- The encoder is `gte-modernbert-base` 768d with no per-corpus optimisation.
Default retrieval is dense cosine over the base band — single recipe, no
rerankers, no lexical sidecar.
- For per-corpus retrieval lift, you can run `rlat optimise` locally to add
a 512d MRL-trained band on top of this archive (opt-in, costs API + GPU
time). See [docs/user/OPTIMISE.md](https://github.com/tenfingerseddy/resonance-lattice/blob/main/docs/user/OPTIMISE.md).
- Drift detection is automatic: if the source files at GitHub change, query
results show a `drifted` status until the model is rebuilt against the
new commit.
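The drift check in the last bullet amounts to a hash comparison; a minimal sketch with hypothetical names (not the rlat API) looks like this:

```python
import hashlib

# Sketch of drift labelling: compare the hash of the bytes fetched at
# query time against the hash pinned at build time.
def drift_status(fetched: bytes, pinned_sha: str) -> str:
    current = hashlib.sha256(fetched).hexdigest()
    return "fresh" if current == pinned_sha else "drifted"
```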
## License
The `rlat` software is licensed under
[BSL 1.1](https://github.com/tenfingerseddy/resonance-lattice/blob/main/LICENSE.md)
(Business Source License — source-available; production use of the
licensed work is permitted up to the parameters in LICENSE.md).
This `.rlat` archive contains embeddings + metadata + a SHA-pinned URL
manifest; source bytes are NOT bundled and are fetched from upstream
GitHub at query time, where the upstream repository's license applies.