How to use Fan21/Llama-mt-lora with Transformers:
```python
# Use a pipeline as a high-level helper
# (this is a causal language model, so the matching pipeline task is
# "text-generation", not "question-answering")
from transformers import pipeline

pipe = pipeline("text-generation", model="Fan21/Llama-mt-lora")
```
```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Fan21/Llama-mt-lora")
model = AutoModelForCausalLM.from_pretrained("Fan21/Llama-mt-lora")
```
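Once the tokenizer and model are loaded, text can be generated with `model.generate`. The sketch below is a minimal, hedged example: the `generate_text` helper, the prompt, and the `max_new_tokens` value are illustrative choices, not part of the model card.

```python
# Minimal generation sketch (assumes Fan21/Llama-mt-lora downloads
# successfully; the helper name and prompt are illustrative).
from transformers import AutoTokenizer, AutoModelForCausalLM


def generate_text(model, tokenizer, prompt, max_new_tokens=64):
    # Tokenize the prompt, generate a continuation, and decode it back to text.
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("Fan21/Llama-mt-lora")
    model = AutoModelForCausalLM.from_pretrained("Fan21/Llama-mt-lora")
    print(generate_text(model, tokenizer, "Hello, how are you?"))
```

Sampling parameters such as `do_sample`, `temperature`, or `top_p` can be passed through to `model.generate` if deterministic greedy decoding is not desired.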