Instructions for using kd13/RoPERT-MLM-mini with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use kd13/RoPERT-MLM-mini with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="kd13/RoPERT-MLM-mini", trust_remote_code=True)

# Load the model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("kd13/RoPERT-MLM-mini", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
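The fill-mask pipeline above returns the most likely tokens for a masked position, ranked by softmax probability over the vocabulary. As a rough sketch of that ranking step, using a toy vocabulary and made-up scores rather than the model's real outputs (a real run would take logits from `model(**inputs)` instead):

```python
import math

# Toy vocabulary and made-up logits for a single [MASK] position;
# these are illustrative stand-ins, not outputs of kd13/RoPERT-MLM-mini.
vocab = ["paris", "london", "cat", "the"]
logits = [4.0, 2.5, 0.1, 1.0]

def softmax(scores):
    # Numerically stable softmax over a list of floats
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)

# Rank tokens by probability, mirroring the pipeline's top-k output
ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
print(ranked[0][0])  # highest-scoring filler token
```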
Update modeling_mybert.py

- modeling_mybert.py (+1 -1)

```diff
@@ -272,4 +272,4 @@ class MyBertForMaskedLM(MyBertPreTrainedModel):
         output = (prediction_scores,)
         return ((loss,) + output) if loss is not None else output

-        return MaskedLMOutput(loss=loss, logits=prediction_scores)
+        return MaskedLMOutput(loss=loss, logits=prediction_scores)
```
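The line the commit touches returns a `MaskedLMOutput` rather than the plain tuple in the branch above it (in Transformers models the tuple branch is conventionally guarded by `if not return_dict:`). A minimal sketch of why the named-field return is preferred, using a hypothetical stand-in dataclass rather than the real `transformers.modeling_outputs.MaskedLMOutput`:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MaskedLMOutputSketch:
    # Hypothetical stand-in for transformers' MaskedLMOutput:
    # named fields instead of positional tuple slots.
    loss: Optional[float] = None
    logits: Optional[list] = None

loss, prediction_scores = 0.42, [[1.0, 2.0, 3.0]]

# Tuple-style return (the first branch in the diff):
# callers must remember that index 0 is loss and index 1 is logits,
# and the tuple shrinks when loss is None.
tuple_out = (loss, prediction_scores)

# Dataclass-style return (the line the commit changes):
# callers access fields by name, and an absent field is simply None.
out = MaskedLMOutputSketch(loss=loss, logits=prediction_scores)
print(out.logits is tuple_out[1])  # same object, clearer access
```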