Use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="hf-internal-testing/remote_code_model_with_dots", trust_remote_code=True)

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("hf-internal-testing/remote_code_model_with_dots", trust_remote_code=True)
model = AutoModel.from_pretrained("hf-internal-testing/remote_code_model_with_dots", trust_remote_code=True)
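
A quick sanity check with the objects loaded above is sketched below. The prompt is arbitrary, and the assumption that the remote-code model accepts the usual input_ids/attention_mask inputs is taken from the Qwen2-style title, not verified against this repository's code.

# Quick sanity check (illustrative; assumes the remote-code model follows the
# standard transformers forward signature)
import torch

inputs = tokenizer("hello world", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(type(outputs))
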
Tiny Qwen2ForCausalLM

This is a minimal model built for unit tests in the TRL library.
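
As a rough illustration of how a tiny checkpoint like this is typically exercised in tests, the sketch below runs a one-step SFTTrainer training loop. The dataset, the training arguments, and the processing_class keyword (recent TRL releases) are assumptions made for the example, not taken from TRL's actual test suite.

# Illustrative one-step SFT smoke test; placeholder data and config, not TRL's real tests
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import SFTConfig, SFTTrainer

model_id = "hf-internal-testing/remote_code_model_with_dots"
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

train_dataset = Dataset.from_dict({"text": ["hello world", "tiny models train fast"]})

trainer = SFTTrainer(
    model=model,
    processing_class=tokenizer,  # called tokenizer= in older TRL releases
    args=SFTConfig(output_dir="tmp_sft_test", max_steps=1, report_to="none"),
    train_dataset=train_dataset,
)
trainer.train()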

Downloads last month: 88
Model size: 2.43M params (Safetensors)
Tensor type: F32