Instructions for using EmotiScan/amazon-comments-bert with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use EmotiScan/amazon-comments-bert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="EmotiScan/amazon-comments-bert")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("EmotiScan/amazon-comments-bert")
model = AutoModelForSequenceClassification.from_pretrained("EmotiScan/amazon-comments-bert")
```
- Notebooks
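When loading the model directly rather than through the pipeline, the sequence-classification head returns raw logits, which need a softmax to become class probabilities. A minimal sketch of that post-processing step, using dummy logits in place of the real `model(**tokenizer(text, return_tensors="pt")).logits` (the values and number of labels below are illustrative assumptions, not taken from this model's config):

```python
import torch
import torch.nn.functional as F

# Dummy logits standing in for the output of the classification head;
# shape is (batch_size, num_labels).
logits = torch.tensor([[2.0, -1.0]])

probs = F.softmax(logits, dim=-1)        # convert logits to probabilities
pred = int(torch.argmax(probs, dim=-1))  # index of the highest-probability label
print(probs, pred)
```

The predicted index can then be mapped to a human-readable label via `model.config.id2label`.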
- Google Colab
- Kaggle
Upload 2 files
This BERT model is based on the google/bert-large-uncased model.
emoticon-model.bin/config.json (ADDED)
```
version https://git-lfs.github.com/spec/v1
oid sha256:6338a2a0ab2325727af23f0afce92f594a30d4425240cbb2bf32a789c1b129bb
size 886
```
emoticon-model.bin/model.safetensors (ADDED)
```
version https://git-lfs.github.com/spec/v1
oid sha256:61e4662dd1c291c7b027f7a0b341c2173cc51619acc5aa44128e7c93a7379ff8
size 735971416
```
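The two committed files above are Git LFS pointer files, not the binary weights themselves: each pointer records the spec version, the SHA-256 object ID, and the size in bytes of the real file. As an illustration of that three-line format (the `parse_lfs_pointer` helper is hypothetical, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each line is "<key> <value>", e.g. "size 886".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:6338a2a0ab2325727af23f0afce92f594a30d4425240cbb2bf32a789c1b129bb
size 886
"""
info = parse_lfs_pointer(pointer)
print(info["oid"], info["size"])
```

The `size` field here (886 bytes for the config, roughly 736 MB for the safetensors weights) is the size of the actual object stored on the LFS server, which is fetched in place of the pointer when the repository is cloned with LFS enabled.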