Instructions for using RaagulQB/OpenELM_1_1_B_code_lora_16_32 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use RaagulQB/OpenELM_1_1_B_code_lora_16_32 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "RaagulQB/OpenELM_1_1_B_code_lora_16_32",
    dtype="auto",
)
```

A hedged generation sketch follows the notebook links below.

- Notebooks
  - Google Colab
  - Kaggle
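
Building on the loading snippet above, here is a minimal generation sketch for a code-completion prompt. It assumes the repository exposes its own tokenizer files and needs `trust_remote_code=True` (as Apple's upstream OpenELM checkpoints do); neither assumption is confirmed by this page, and the prompt is purely illustrative.

```python
# Hedged sketch: short code-completion generation with this checkpoint.
# Assumptions (not confirmed by the snippet above):
#   - the repo ships tokenizer files; Apple's base OpenELM checkpoints instead
#     reuse the Llama 2 tokenizer, so adjust the tokenizer repo if needed
#   - trust_remote_code=True is required, as for the upstream OpenELM models
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RaagulQB/OpenELM_1_1_B_code_lora_16_32"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    dtype="auto",  # on older transformers versions this argument is torch_dtype
    trust_remote_code=True,
)

prompt = "def fibonacci(n):"  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For causal generation, `AutoModelForCausalLM` is used here instead of the plain `AutoModel` shown above, since `AutoModel` returns hidden states rather than a language-modeling head.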