---
title: Coherent Compute Engine
emoji: 🌖
colorFrom: pink
colorTo: red
sdk: gradio
sdk_version: 6.2.0
app_file: app.py
pinned: false
license: other
short_description: Live coherence + throughput benchmark (no precomputed results)
thumbnail: >-
  https://cdn-uploads.huggingface.co/production/uploads/685edcb04796127b024b4805/
---
# Coherent Compute Engine

**Coherent Compute Engine** is a live benchmark Space that measures **real** compute throughput and stability for a coherent state update rule, with **no precomputed results** and **no estimates**.

It’s designed to be understandable, verifiable, and brutally honest:

- Everything is computed **now**, on the Space machine.
- Baselines (Python loop + vectorised NumPy) are measured **live** on the **same machine**.
- Results can be downloaded as a **receipt** (JSON) with a SHA-256 hash.
## What an “item” is

**One item = one coherent update of `[Ψ, E, L]` per oscillator per step.**

So:

`items/sec = (N oscillators × steps) / elapsed_seconds`

We report throughput in **billions of items/sec** (“B/s”).
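As a worked example of this arithmetic (the run figures below are hypothetical, chosen only for illustration):

```python
# Hypothetical run figures, for illustration only.
n_oscillators = 1_000_000   # N
steps = 2_000
elapsed_seconds = 0.5

# One [Psi, E, L] update per oscillator per step.
items = n_oscillators * steps
items_per_sec = items / elapsed_seconds

# Reported in billions of items/sec ("B/s").
throughput_b_per_s = items_per_sec / 1e9
print(throughput_b_per_s)  # 4.0
```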
## What this Space measures

For the chosen oscillator count and step count, it reports:

- **Throughput (B/s)**: billions of coherent updates per second
- **Coherence (|C|)**: a stability proxy computed from a normalised dot product of sampled `Ψ` before/after
- **Mean Energy**: bounded mean proxy from `E` in `[0, 1.5]`
- **Elapsed Time (s)**
- **Engine**: `numba` when available; otherwise `numpy`
- **Verification baselines (optional)**:
  - **Baseline (Vectorised NumPy)**
  - **Baseline (Python loop, capped)**: safety-capped and subset-based to keep the Space responsive
  - **Speedup factors** vs those baselines
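The coherence proxy can be sketched as follows. This is a generic normalised-dot-product formula consistent with the description above, not the Space’s actual code; the function name and sample vectors are illustrative:

```python
import numpy as np

def coherence_proxy(psi_before: np.ndarray, psi_after: np.ndarray) -> float:
    """Stability proxy |C|: magnitude of the normalised dot product
    between sampled Psi vectors before and after the run."""
    num = np.abs(np.dot(psi_before, psi_after))
    den = np.linalg.norm(psi_before) * np.linalg.norm(psi_after)
    return float(num / den) if den > 0 else 0.0

# An unchanged state is maximally coherent: |C| is close to 1.
psi = np.array([0.5, -1.0, 2.0])
print(coherence_proxy(psi, psi))
```

A value near 1 means the sampled `Ψ` kept its direction through the run; values near 0 indicate the state decorrelated.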
## Receipts: verification you can download

Each run emits a small JSON “receipt” containing:

- input settings (N, steps)
- engine name
- measured metrics
- runtime info
- a **SHA-256 hash** of the canonical JSON

This supports the “don’t trust it, verify it” approach.
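A minimal sketch of how such a receipt can be hashed reproducibly. The field names here are hypothetical (the Space’s actual schema may differ); the key idea is that hashing a *canonical* serialisation (sorted keys, fixed separators) lets anyone re-derive the digest:

```python
import hashlib
import json

# Hypothetical receipt fields, for illustration only.
receipt = {
    "settings": {"n_oscillators": 1_000_000, "steps": 2_000},
    "engine": "numpy",
    "metrics": {"throughput_b_per_s": 4.0, "coherence": 0.999},
    "runtime": {"python": "3.11"},
}

# Canonical JSON: sorted keys, no extra whitespace, so the hash is stable
# across machines and runs for identical content.
canonical = json.dumps(receipt, sort_keys=True, separators=(",", ":"))
receipt["sha256"] = hashlib.sha256(canonical.encode("utf-8")).hexdigest()

print(receipt["sha256"])
```

To verify a downloaded receipt, strip the `sha256` field, re-serialise the rest the same canonical way, and compare digests.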
## Why baselines exist (and why they’re not a contest)

Baselines are **verification anchors**:

- They show what “normal” Python looks like (slow floor)
- They show what vectorised NumPy looks like (standard reference)
- They show what the engine path achieved under the same rules

No claims about beating GPUs or other systems. Just measured, reproducible data.
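The two baselines can be sketched like this. The update rule below (a damped increment) is a stand-in, not the Space’s actual kernel; the sizes and the subset cap are also illustrative:

```python
import time
import numpy as np

n, steps = 50_000, 20
x = np.random.default_rng(0).random(n)

# Python-loop baseline: run on a small subset, as the Space does,
# so the slow path stays time-bounded and the UI stays responsive.
subset = x[:5_000].copy()
t0 = time.perf_counter()
for _ in range(steps):
    for i in range(subset.size):
        subset[i] = 0.99 * subset[i] + 0.01  # stand-in update rule
loop_elapsed = time.perf_counter() - t0

# Vectorised NumPy baseline: same rule over the full array.
y = x.copy()
t0 = time.perf_counter()
for _ in range(steps):
    y = 0.99 * y + 0.01
vec_elapsed = time.perf_counter() - t0

print(f"loop: {loop_elapsed:.4f}s  vectorised: {vec_elapsed:.4f}s")
```

Both timings come from the same machine under the same rule, which is what makes the reported speedup factors comparable.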
## Running locally

```bash
pip install -r requirements.txt
python app.py
```

## Safety rails

To keep the Space stable:

- oscillator count is clamped to a safe max
- steps are clamped
- Python loop baseline is time-capped and subset-based

That ensures the Space stays responsive while still measuring real throughput.
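The clamping can be sketched in one line; the limit below is hypothetical, not the Space’s actual cap:

```python
def clamp(value: int, lo: int, hi: int) -> int:
    """Clamp a requested setting into the safe range [lo, hi]."""
    return max(lo, min(hi, value))

MAX_OSCILLATORS = 50_000_000  # hypothetical cap, for illustration only
print(clamp(10**9, 1, MAX_OSCILLATORS))  # 50000000
```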
---
Built by RFTSystems.

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference