---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- text-generation-inference
- code
- Hitesh_V_Founder
---
<p align="center">
<img src="https://i.imgur.com/ePJMLNp.png" alt="Hyze Logo" width="405"/>
</p>
|
|
<h1 align="center">HyzeMini</h1>
|
|
<p align="center">
A lightweight text-generation model by <b>Hyze AI</b>
</p>
|
|
<p align="center">
<a href="https://hyzebot.vercel.app">hyzebot.vercel.app</a> •
<a href="https://hyzedocs.vercel.app">hyzedocs.vercel.app</a> •
<a href="https://hyzecode.vercel.app">hyzecode.vercel.app</a>
</p>
|
|
---


## Overview


**HyzeMini** is a compact and efficient **text-generation transformer model** optimized for **Space & Astronomy knowledge** and **General Chat**.


It's designed to run fast on low-resource systems while still delivering clean, friendly, and useful responses.


- **Model type:** Transformer-based LLM
- **Parameters:** ~0.1B
- **Precision:** BF16
- **Language:** English
- **License:** Apache-2.0
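The specs above map directly onto a standard `transformers` loading call; a minimal sketch, assuming the Hub id `HyzeAI/HyzeMini` used later in this card, with `torch_dtype` set to match the BF16 precision listed above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "HyzeAI/HyzeMini"  # Hub id taken from the Usage section of this card


def load_hyzemini():
    """Load HyzeMini in bfloat16, matching the precision stated above."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # BF16, per the model card
    )
    return tokenizer, model
```

Loading in BF16 keeps memory use low, which suits the low-resource systems this model targets.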
|
|
---


## Training Focus


HyzeMini was trained on a curated mixture of **publicly available English datasets**, with emphasis on:
|
|
- **Space & Astronomy**
  - Planets, stars, galaxies
  - Rockets, missions, and space science
  - Beginner to intermediate explanations


- **General Chat**
  - Casual conversation
  - Q&A-style prompts
  - Friendly assistant tone
|
|
---

## About the Founder


Hyze AI was created by Hitesh Vinothkumar, who is 12 years old.
Hyze focuses on learning, experimentation, and open access, blending software engineering with curiosity about the universe.
|
|
---


## Benchmarks (Qualitative Comparison)


HyzeMini focuses on **speed, coherence, and domain knowledge** rather than raw reasoning power.
|
|
| Model | Size | Strengths | Tradeoffs |
|-------|------|-----------|-----------|
| **HyzeMini** | ~0.1B | Space-focused knowledge, fast, chat-friendly | Limited deep reasoning |
| **TinyLlama** | ~1.1B | Solid general generation | More generic responses |
| **GPT-Neo 125M** | ~0.125B | Better general reasoning | Slower, higher memory |
| **GPT-1** | ~0.117B | Historical baseline | Less coherent by modern standards |
|
|
### Summary
- **Coherence:** HyzeMini ≈ TinyLlama > GPT-1
- **Space knowledge:** HyzeMini > TinyLlama / GPT-Neo (in-domain prompts)
- **Efficiency:** HyzeMini ≈ TinyLlama > GPT-Neo


> Benchmarks are based on internal qualitative testing and comparisons.
|
|
---


## Usage


### Transformers (Python)
|
|
```python
from transformers import pipeline

# Load HyzeMini as a text-generation pipeline
generator = pipeline(
    "text-generation",
    model="HyzeAI/HyzeMini",
)

print(generator("Tell me a cool space fact:"))
```
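The pipeline call above relies on the library's default generation settings; for a small ~0.1B model, explicit sampling parameters often produce friendlier output. A sketch of standard `generate()` keyword arguments (the values are illustrative, not official HyzeMini defaults):

```python
# Illustrative decoding settings for a small chat model — these are standard
# transformers generate() arguments; the specific values are assumptions,
# not defaults shipped with HyzeMini.
gen_kwargs = {
    "max_new_tokens": 64,       # keep completions short and snappy
    "do_sample": True,          # sample instead of greedy decoding
    "temperature": 0.7,         # soften the token distribution
    "top_p": 0.9,               # nucleus sampling cutoff
    "repetition_penalty": 1.1,  # discourage loops, common in small models
}

# Passed through the pipeline like so:
# print(generator("Tell me a cool space fact:", **gen_kwargs))
```

Sampling with a moderate temperature and a repetition penalty tends to matter more for very small models, which are prone to repetitive or abrupt greedy completions.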