Instructions for using formermagic/codet5-small with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use formermagic/codet5-small with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("formermagic/codet5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("formermagic/codet5-small")
```
- Notebooks
- Google Colab
- Kaggle
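Once the tokenizer and model are loaded as shown above, inference follows the standard Transformers seq2seq pattern: tokenize an input string, call `model.generate`, and decode the result. The sketch below illustrates this flow; the input prompt and generation settings are illustrative assumptions, not part of the model card, and the quality of the completion depends on the data this checkpoint was trained on.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("formermagic/codet5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("formermagic/codet5-small")

# Hypothetical prompt for illustration; CodeT5-style models are
# typically used for code summarization, completion, or infilling.
prompt = "def add(a, b):"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation (greedy decoding, capped length).
outputs = model.generate(**inputs, max_new_tokens=32)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded)
```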