Duplicated from mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile
How to use Nhines/WizardCoder-Python-34B-V1.0-llamafile with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "Nhines/WizardCoder-Python-34B-V1.0-llamafile",
    dtype="auto",
)
```
```json
{
  "model_type": "llama"
}
```
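As a minimal sketch of how this configuration is consumed: the `model_type` field in `config.json` is what the Transformers `Auto*` classes read to decide which architecture class to dispatch to (here, the LLaMA family). The snippet below only parses the JSON locally to illustrate that lookup key; it does not download or load the model.

```python
import json

# The configuration shown above; "model_type" is the key
# Transformers' Auto* classes use for architecture dispatch.
config_text = '{ "model_type": "llama" }'
config = json.loads(config_text)
print(config["model_type"])  # llama
```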