# powershell-docs — rlat knowledge model (v2.0)

A Resonance Lattice knowledge model of MicrosoftDocs/PowerShell-Docs at commit `58137d73`, scope `reference`.
## Quick start

```shell
pip install rlat
huggingface-cli download tenfingers/powershell-docs-rlat powershell-docs.rlat --local-dir .
rlat search powershell-docs.rlat "your question" --top-k 5
```
The model uses remote storage mode: passages reference source files at
raw.githubusercontent.com, pinned to the commit SHA above. The first query
fetches each cited source once and caches it locally; subsequent queries over
the same passages run warm in under 20 ms.
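The pinned URL for any cited passage can be reconstructed from the commit SHA and the passage's relative path. A minimal sketch — the `REL_PATH` value here is a hypothetical example, not an actual manifest entry:

```shell
BASE="https://raw.githubusercontent.com/MicrosoftDocs/PowerShell-Docs"
SHA="58137d73023e752c811cd327a91e03065844dc75"
# Hypothetical relative path; real paths come from the archive's URL manifest.
REL_PATH="reference/7.4/Microsoft.PowerShell.Core/About/about_Aliases.md"
URL="$BASE/$SHA/$REL_PATH"
echo "$URL"
```

Because the SHA segment names an immutable commit, the fetched bytes are stable even if the upstream branch moves.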
## Build details
| Field | Value |
|---|---|
| Encoder | Alibaba-NLP/gte-modernbert-base 768d, CLS-pooled, L2-normalised |
| Encoder revision | e7f32e3c00f91d699e8c43b53106206bcc72bb22 (pinned) |
| Format | rlat knowledge-model v4 (ZIP + JSON + NPZ) |
| Storage mode | remote (source pinned at SHA, fetched on demand, SHA-verified) |
| Source repo | MicrosoftDocs/PowerShell-Docs |
| Source scope | reference |
| Source commit | 58137d73023e752c811cd327a91e03065844dc75 |
| Source branch | main (commit SHA-pinned; reproducible regardless of branch movement) |
| Files indexed | 2,647 |
| Passages | 107,033 |
| Build date | 2026-04-28 |
| Built on | Kaggle T4 (GPU encoding, batch_size=64, runtime=torch) |
| File size | 639.7 MB |
## Usage

### Single-hop search

```shell
rlat search powershell-docs.rlat "what does X do?" --top-k 5
```

### Skill-context (Anthropic skill `!command` block)

```
!`rlat skill-context powershell-docs.rlat --query "$user_query" --top-k 5`
```
The output is markdown with citation anchors, drift status, and ConfidenceMetrics — ready for an LLM to ground on.
### Multi-hop deep-search

```shell
rlat deep-search powershell-docs.rlat "harder cross-file question" --max-hops 3
```
Requires an Anthropic API key. See the deep-search docs.
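A sketch of supplying the key, assuming rlat follows the Anthropic SDK convention of reading the `ANTHROPIC_API_KEY` environment variable (confirm the exact mechanism in the deep-search docs):

```shell
# Placeholder value — substitute your real key. Assumes rlat reads the
# standard ANTHROPIC_API_KEY variable used by Anthropic SDK tooling.
export ANTHROPIC_API_KEY="sk-ant-<your-key>"
```

With the key exported, the `rlat deep-search` command above can make its multi-hop planning calls.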
## Refreshing against upstream
This model pins to the source commit 58137d73. To re-index against the
current upstream tip:
```shell
# Option A: rebuild on Kaggle's free T4 (recommended for big corpora)
# See the rlat-build-on-kaggle skill at:
# https://github.com/tenfingerseddy/resonance-lattice/tree/main/.claude/skills/rlat-build-on-kaggle

# Option B: rebuild locally
pip install "rlat[build,ann]"
rlat install-encoder
git clone --depth 1 -b main https://github.com/MicrosoftDocs/PowerShell-Docs.git src/
rlat build src/reference -o powershell-docs.rlat \
  --store-mode remote \
  --remote-url-base https://raw.githubusercontent.com/MicrosoftDocs/PowerShell-Docs/<NEW_SHA>/reference/ \
  --runtime torch
```
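One way to fill in `<NEW_SHA>` is to resolve the tip you actually checked out. A sketch assuming the clone from Option B landed in `src/`:

```shell
# Resolve the checked-out commit of the clone in src/ and derive the
# matching remote URL base for the rebuild.
NEW_SHA=$(git -C src/ rev-parse HEAD)
REMOTE_BASE="https://raw.githubusercontent.com/MicrosoftDocs/PowerShell-Docs/$NEW_SHA/reference/"
echo "$REMOTE_BASE"
```

Pinning to the resolved SHA rather than `main` keeps the rebuilt archive reproducible even after the branch moves.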
## Honest limits
- The encoder is `gte-modernbert-base` (768d) with no per-corpus optimisation. Default retrieval is dense cosine over the base band — single recipe, no rerankers, no lexical sidecar.
- For per-corpus retrieval lift, you can run `rlat optimise` locally to add a 512d MRL-trained band on top of this archive (opt-in, costs API + GPU time). See docs/user/OPTIMISE.md.
- Drift detection is automatic: if the source files at GitHub change, query results show a `drifted` status until the model is rebuilt against the new commit.
## License

The rlat software is licensed under BSL 1.1 (Business Source License —
source-available; production use of the licensed work is permitted up to the
parameters in LICENSE.md).
This .rlat archive contains embeddings + metadata + a SHA-pinned URL
manifest; source bytes are NOT bundled and are fetched from upstream
GitHub at query time, where the upstream repository's license applies.