How to use `RaagulQB/OpenELM_1_1_B_code_lora_32_64` with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "RaagulQB/OpenELM_1_1_B_code_lora_32_64",
    dtype="auto",
)
```
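The bare-model snippet above only loads weights. For actually generating code-completion text, a sketch along the following lines is more typical. Note the assumptions, which this card does not confirm: OpenELM checkpoints generally need `trust_remote_code=True`, a causal-LM head (`AutoModelForCausalLM`) is assumed, and the tokenizer repo is assumed to resolve from the same model id. `dtype="auto"` is the newer keyword; older Transformers versions use `torch_dtype="auto"` instead.

```python
# Hedged usage sketch -- assumptions: trust_remote_code is required,
# the checkpoint exposes a causal-LM head, and a tokenizer resolves
# from the same repo (none of this is stated on the model card).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "RaagulQB/OpenELM_1_1_B_code_lora_32_64"


def load(model_id: str = MODEL_ID):
    """Load model and tokenizer; downloads weights on first call."""
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        dtype="auto",  # older transformers: torch_dtype="auto"
        trust_remote_code=True,
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    return model, tokenizer


if __name__ == "__main__":
    model, tokenizer = load()
    inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The download and generation run only under the `__main__` guard, so importing the snippet is cheap; swap in your own prompt string as needed.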