Instructions for using mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile",
    dtype="auto",
)
```
- Notebooks
- Google Colab
- Kaggle
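Since this repository ships a llamafile, the model can also be run as a local app without any Python environment: a llamafile is a self-contained executable that bundles llama.cpp together with the weights. A minimal sketch, assuming the download was saved under the hypothetical name `WizardCoder-Python-34B-V1.0.llamafile`:

```shell
# Mark the downloaded llamafile as executable (Linux/macOS)
chmod +x WizardCoder-Python-34B-V1.0.llamafile

# Launch it; by default llamafile serves a local chat UI and an
# OpenAI-compatible HTTP endpoint on http://localhost:8080
./WizardCoder-Python-34B-V1.0.llamafile
```

On Windows the same file can be renamed with an `.exe` extension and run directly.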
- Xet hash: `dbe64f3d0e5de99654095d639bced23ab20d0a11491b7f03d2840334311b1850`
- Size of remote file: 23.3 GB
- SHA256: `d256d490c1c96daa65f502f1e8980834234132b99e0f3c7b3469bdaa3dfdb8b3`
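After downloading, the SHA256 listed above can be used to verify the file's integrity. A minimal sketch using only the standard library; the local filename is a hypothetical placeholder, so adjust it to wherever you saved the download:

```python
import hashlib

# Expected digest taken from the model page (SHA256 of the .llamafile)
EXPECTED_SHA256 = "d256d490c1c96daa65f502f1e8980834234132b99e0f3c7b3469bdaa3dfdb8b3"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so a 23.3 GB download never sits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical local filename; replace with your actual download path.
# if sha256_of("WizardCoder-Python-34B-V1.0.llamafile") != EXPECTED_SHA256:
#     raise ValueError("Downloaded file is corrupt or incomplete")
```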
Xet efficiently stores large files inside Git by intelligently splitting files into unique chunks, accelerating uploads and downloads.