Token Classification · Transformers · PyTorch · TensorFlow · JAX · ONNX · Safetensors · English · bert · Eval Results (legacy)
Instructions for using dslim/bert-large-NER with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers

How to use dslim/bert-large-NER with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="dslim/bert-large-NER")

# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-large-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-large-NER")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
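The token-classification pipeline returns one dict per predicted token, with keys such as `entity` (a B-/I-prefixed tag like `B-PER` or `I-LOC`) and `word`. A minimal sketch of grouping those tags back into whole entities; the sample predictions below are illustrative, not actual model output:

```python
# Group token-classification output (B-/I- tags) into whole entities.
def group_entities(tokens):
    entities = []
    current = None
    for tok in tokens:
        tag = tok["entity"]
        # A B- tag, or a type change, starts a new entity span.
        if tag.startswith("B-") or current is None or tag[2:] != current["type"]:
            if current:
                entities.append(current)
            current = {"type": tag[2:], "words": [tok["word"]]}
        else:
            current["words"].append(tok["word"])
    if current:
        entities.append(current)
    return [{"type": e["type"], "text": " ".join(e["words"])} for e in entities]

# Illustrative predictions in the pipeline's output shape (not real output):
sample = [
    {"entity": "B-PER", "word": "John"},
    {"entity": "I-PER", "word": "Smith"},
    {"entity": "B-LOC", "word": "Berlin"},
]
print(group_entities(sample))
# → [{'type': 'PER', 'text': 'John Smith'}, {'type': 'LOC', 'text': 'Berlin'}]
```

In practice, passing `aggregation_strategy="simple"` to `pipeline(...)` performs this grouping for you; the sketch shows what that post-processing amounts to.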
- Xet hash: 31f32ac5141649857e79120012da8e7a5faa796ac7663002cd6965194d44738b
- Size of remote file: 1.33 GB
- SHA256: d63d39520929bbeab16344157582be46f09a3ab43095b14f1b1586636a50f171
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, accelerating uploads and downloads. More info.