Instructions for using allenai/multicite-multilabel-scibert with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use allenai/multicite-multilabel-scibert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="allenai/multicite-multilabel-scibert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("allenai/multicite-multilabel-scibert")
model = AutoModelForSequenceClassification.from_pretrained("allenai/multicite-multilabel-scibert")
```

- Notebooks
- Google Colab
- Kaggle
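Because the model is multi-label, a single citation sentence can carry several intents at once. A minimal post-processing sketch, assuming the pipeline is called with `top_k=None` so it returns a score for every label (the label names and scores below are placeholders for illustration, not the model's actual label set or real predictions):

```python
# Keep every label whose score clears a threshold; a multi-label
# pipeline result (with top_k=None) is a list of {"label", "score"} dicts.
def predicted_intents(scores, threshold=0.5):
    return [d["label"] for d in scores if d["score"] >= threshold]

# Placeholder output shape, not real model predictions:
example = [
    {"label": "background", "score": 0.91},
    {"label": "motivation", "score": 0.62},
    {"label": "uses", "score": 0.07},
]
print(predicted_intents(example))  # → ['background', 'motivation']
```

The threshold is a free parameter; 0.5 is only a starting point and should be tuned on held-out data for the intent labels you care about.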
License: MIT

# MultiCite: Multi-label Citation Intent Classification with SciBERT (NAACL 2022)

This model has been trained on the data available here: https://github.com/allenai/multicite