---
license: apache-2.0
base_model: ibm-granite/granite-4.0-1b
language:
- en
tags:
- granite
- ibm
- full-finetune
- dual-gpu
- code
- reasoning
- text-generation
pipeline_tag: text-generation
datasets:
- Roman1111111/gpt-5.4-step-by-step-reasoning
- TeichAI/gpt-5.1-codex-max-1000x
- TeichAI/gpt-5.1-high-reasoning-1000x
---
# IBM-GPT-5.4-Coder-1B

This model is a full fine-tune of `ibm-granite/granite-4.0-1b`; all base-model weights were updated during training.

Training setup:
- Full model fine-tuning (all parameters updated)
- No adapters: no LoRA, no QLoRA
- Dual-GPU DDP (distributed data parallel) training
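The actual training script is not included with this card. As an illustration only, the setup above can be sketched as follows; this is a single-process, CPU-only stand-in (the function name, toy network, dummy objective, and hyperparameters are all assumptions, not the real recipe). The point it demonstrates is the combination described in the list: every parameter stays trainable (no adapters, LoRA, or QLoRA) and the model is wrapped in PyTorch `DistributedDataParallel`.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def full_finetune_step() -> float:
    """Run one DDP training step with every parameter trainable."""
    # Single-process CPU stand-in so the sketch runs anywhere; the real
    # dual-GPU run would be launched with `torchrun --nproc_per_node=2`
    # using the NCCL backend, one CUDA device per rank.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29501")
    dist.init_process_group("gloo", rank=0, world_size=1)

    # Tiny stand-in network; the real script would wrap granite-4.0-1b here.
    model = torch.nn.Sequential(
        torch.nn.Linear(16, 32), torch.nn.ReLU(), torch.nn.Linear(32, 16)
    )

    # Full fine-tuning: no parameters are frozen, no adapter modules added.
    assert all(p.requires_grad for p in model.parameters())

    ddp = DDP(model)  # synchronises (all-reduces) gradients across ranks
    opt = torch.optim.AdamW(ddp.parameters(), lr=1e-5)

    x = torch.randn(4, 16)
    loss = ddp(x).pow(2).mean()  # dummy objective, just for the sketch
    opt.zero_grad()
    loss.backward()
    opt.step()

    dist.destroy_process_group()
    return loss.item()
```

In the real dual-GPU run, `torchrun` would spawn one process per GPU and DDP's gradient all-reduce is what keeps the two full replicas of the model in sync after each optimizer step.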