How to use Fardan/phi2-chat-adapter with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Fardan/phi2-chat-adapter", dtype="auto")
```
How to fix it?
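Repositories whose names end in "-adapter" usually contain PEFT (LoRA) adapter weights rather than a full model, so `AutoModel.from_pretrained` on the adapter repo alone tends to fail. The usual fix is to load the base model first and attach the adapter with `peft`. Below is a minimal sketch, assuming `Fardan/phi2-chat-adapter` is a LoRA adapter on top of `microsoft/phi-2` — the base model name here is an assumption, so verify it against the `base_model_name_or_path` field in the adapter's `adapter_config.json`:

```python
ADAPTER_ID = "Fardan/phi2-chat-adapter"  # repo from the question
BASE_ID = "microsoft/phi-2"              # assumed base model; check adapter_config.json

def load_chat_model():
    # Heavy imports are kept inside the function so this sketch can be
    # read without transformers/peft installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
    # A causal-LM head fits a chat model better than bare AutoModel.
    # On transformers < 4.56 the dtype argument is spelled torch_dtype.
    base = AutoModelForCausalLM.from_pretrained(BASE_ID, dtype="auto")
    # Attach the adapter weights on top of the frozen base model.
    model = PeftModel.from_pretrained(base, ADAPTER_ID)
    return tokenizer, model

# Usage (downloads several GB of weights on first run):
# tokenizer, model = load_chat_model()
```

If the adapter was merged into full weights instead, the original one-liner should work; the error message you get (e.g. missing `model.safetensors` vs. a dtype/argument error) tells you which case you are in.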