---
base_model: Pinkstack/PGAM-WIT-Conversational-3B-vLLM
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---

# This is a base/testing model. It is recommended for further fine-tuning or training.

This model is odd. It was trained on both Grok and Hugging Face's ultrachat_200k datasets; it acts oddly but is interesting to mess around with.
WIT - weird & interesting transformer

# Uploaded model

- **Developed by:** Pinkstack
- **License:** apache-2.0
- **Finetuned from model:** Pinkstack/PGAM-WIT-Conversational-3B-vLLM (og version)

This model was trained with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.