How to use fasterinnerlooper/codeBERTa-csharp with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="fasterinnerlooper/codeBERTa-csharp")
```
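Once the pipeline is built, it can be called directly on masked C# source. This is a minimal sketch: the C# snippet is made up for illustration, and it assumes the checkpoint uses `<mask>` as its mask token (the usual convention for RoBERTa-based models such as CodeBERTa).

```python
# Hypothetical example input; the <mask> token placement is an assumption.
predictions = pipe('public void <mask>() { Console.WriteLine("hello"); }')

# Each prediction is a dict with the filled-in token and its score.
for p in predictions:
    print(p["token_str"], p["score"])
```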
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("fasterinnerlooper/codeBERTa-csharp")
model = AutoModelForMaskedLM.from_pretrained("fasterinnerlooper/codeBERTa-csharp")
```
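With the tokenizer and model loaded directly, you can run the masked-language-model head yourself. Below is a minimal sketch, again assuming `<mask>` is the model's mask token and using a made-up C# snippet; it picks the highest-scoring token for the masked position.

```python
import torch

# Hypothetical masked C# input.
code = 'public void <mask>() { Console.WriteLine("hello"); }'
inputs = tokenizer(code, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and decode the top prediction for it.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```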
How do I fix it?