---
title: FrontEndasPromptEngineeringTest
emoji: 💻
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
models:
  - stabilityai/stablelm-2-zephyr-1_6b
---
An example of running llama.cpp (and, by extension, any simple C++ program) from Python without pip package dependency issues.
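One way to sketch this approach: call a locally compiled llama.cpp binary via `subprocess`, so the only runtime requirement is the Python standard library. The binary name, model path, and flag set below are assumptions for illustration; adjust them to match your build of llama.cpp.

```python
import subprocess

def build_llama_cmd(binary, model_path, prompt, n_tokens=64):
    # Assemble the CLI argument list. Flag names mirror the common
    # llama.cpp CLI (-m model, -p prompt, -n tokens); verify against
    # your build, as they may differ between versions.
    return [binary, "-m", model_path, "-p", prompt, "-n", str(n_tokens)]

def run_llama(binary, model_path, prompt, n_tokens=64):
    # Invoke the compiled binary directly -- no pip packages involved,
    # which sidesteps Python dependency conflicts entirely.
    result = subprocess.run(
        build_llama_cmd(binary, model_path, prompt, n_tokens),
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```

Because the heavy lifting happens in the external process, the Docker image only needs the compiled binary and a model file on disk; Python acts as a thin orchestration layer.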
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference