OpenELM
A family of Open-source Efficient Language Models from Apple.
How to use mlx-community/OpenELM-1_1B-Instruct-4bit with MLX:
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir OpenELM-1_1B-Instruct-4bit mlx-community/OpenELM-1_1B-Instruct-4bit
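If you prefer to stay in Python, the same download can be done with the huggingface_hub library. A minimal sketch; the local_dir argument is optional and is shown here only to mirror the CLI call above:

from huggingface_hub import snapshot_download

# Download all files in the model repo to a local directory
snapshot_download(
    repo_id="mlx-community/OpenELM-1_1B-Instruct-4bit",
    local_dir="OpenELM-1_1B-Instruct-4bit",
)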
This model was converted to MLX format from apple/OpenELM-1_1B-Instruct using mlx-lm version 0.10.0.
Refer to the original model card for more details on the model.
pip install mlx-lm
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hub
model, tokenizer = load("mlx-community/OpenELM-1_1B-Instruct-4bit")

# Generate a completion; verbose=True streams the generated text to stdout
response = generate(model, tokenizer, prompt="hello", verbose=True)
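Because this is an instruct-tuned model, prompts generally work better when wrapped in the model's chat template. A minimal sketch, assuming the tokenizer returned by load exposes the standard Hugging Face apply_chat_template method (as mlx-lm's tokenizer wrapper does) and that the model repo defines a chat template:

from mlx_lm import load, generate

model, tokenizer = load("mlx-community/OpenELM-1_1B-Instruct-4bit")

prompt = "hello"
# Wrap the raw prompt in the model's chat template, if one is defined
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)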