How to use mlx-community/answerdotai-ModernBERT-base-bf16 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("fill-mask", model="mlx-community/answerdotai-ModernBERT-base-bf16")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM
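For example, the pipeline can then be called on a sentence containing the [MASK] token. This is an illustrative sketch, not part of the original card, and the example sentence is an assumption:

```python
from transformers import pipeline

# Build the fill-mask pipeline for this model (as in the snippet above)
pipe = pipeline("fill-mask", model="mlx-community/answerdotai-ModernBERT-base-bf16")

# ModernBERT predicts candidates for the token hidden behind [MASK]
results = pipe("The capital of France is [MASK].")
for r in results:
    # each candidate carries the filled "sequence", the "token_str", and a "score"
    print(r["token_str"], round(r["score"], 4))
```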
tokenizer = AutoTokenizer.from_pretrained("mlx-community/answerdotai-ModernBERT-base-bf16")
model = AutoModelForMaskedLM.from_pretrained("mlx-community/answerdotai-ModernBERT-base-bf16")

How to use mlx-community/answerdotai-ModernBERT-base-bf16 with MLX:
# Download the model from the Hub
pip install huggingface_hub[hf_xet]
huggingface-cli download --local-dir answerdotai-ModernBERT-base-bf16 mlx-community/answerdotai-ModernBERT-base-bf16
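The same download can be done from Python with huggingface_hub's snapshot_download. The allow_patterns filter below is an optional assumption to keep the example lightweight; omit it to fetch the full model:

```python
from huggingface_hub import snapshot_download

# Fetch files from the Hub into the same local directory the CLI example uses;
# allow_patterns restricts this run to the config as a quick check
path = snapshot_download(
    "mlx-community/answerdotai-ModernBERT-base-bf16",
    local_dir="answerdotai-ModernBERT-base-bf16",
    allow_patterns=["config.json"],
)
print(path)
```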
The model mlx-community/answerdotai-ModernBERT-base-bf16 was converted to MLX format from answerdotai/ModernBERT-base using mlx-lm version 0.0.3.
pip install mlx-embeddings
from mlx_embeddings import load, generate
import mlx.core as mx
model, tokenizer = load("mlx-community/answerdotai-ModernBERT-base-bf16")
# Generate text embeddings for a pair of sentences
output = generate(model, tokenizer, texts=["I like grapes", "I like fruits"])
embeddings = output.text_embeds # Normalized embeddings
# Compute dot product between normalized embeddings
similarity_matrix = mx.matmul(embeddings, embeddings.T)
print("Similarity matrix between texts:")
print(similarity_matrix)
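Because the embeddings are L2-normalized, their dot product is exactly cosine similarity. A small NumPy sketch of the same computation, using toy vectors rather than model output:

```python
import numpy as np

# Two toy "embedding" vectors, L2-normalized the way mlx-embeddings returns them
a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, 1.0, 2.0])
a /= np.linalg.norm(a)
b /= np.linalg.norm(b)

emb = np.stack([a, b])   # shape (2, 3): one unit vector per row
sim = emb @ emb.T        # (2, 2) similarity matrix, same as mx.matmul above

# The diagonal is 1.0 (self-similarity); the off-diagonal entries are the
# cosine similarity between the two vectors
print(sim)
```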