Instructions for using mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "mozilla-ai/WizardCoder-Python-34B-V1.0-llamafile", dtype="auto"
)
```

- Notebooks
- Google Colab
- Kaggle
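Since llamafiles are self-contained executables, another common way to use this model is to run the downloaded `.llamafile` locally, which by default serves an OpenAI-compatible API on `localhost:8080`. A minimal client sketch, assuming that default port and endpoint (the request is only sent if a server is actually running):

```python
import json
import urllib.request

# Chat-completion payload for llamafile's OpenAI-compatible endpoint
# (assumed default: http://localhost:8080/v1/chat/completions).
payload = {
    "model": "WizardCoder-Python-34B-V1.0",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
}

def ask(url="http://localhost:8080/v1/chat/completions"):
    """Send the payload to a locally running llamafile server and return the reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    try:
        print(ask())
    except OSError:
        print("llamafile server not reachable on localhost:8080")
```

The URL and model name above are assumptions for illustration; check the llamafile's own `--help` output for the actual serving options.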
- Xet hash: 84c63e64bec8baceb121ecd8dddf8b572ca474a66a6fd3bec49f8974349574ee
- Size of remote file: 20.2 GB
- SHA256: 0669b466440912ecd7c3a9248a908c1b5fa4e97c664b00f172c1eecfe818e4aa
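After downloading, the file can be checked against the published SHA256 digest. A small sketch using only the standard library (the file path is a hypothetical placeholder, not the repo's actual filename):

```python
import hashlib

# SHA256 digest published for the remote file.
EXPECTED_SHA256 = "0669b466440912ecd7c3a9248a908c1b5fa4e97c664b00f172c1eecfe818e4aa"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB chunks so a ~20 GB download never needs to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (path is hypothetical):
# assert sha256_of("wizardcoder.llamafile") == EXPECTED_SHA256
```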
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks to accelerate uploads and downloads.