hash-map/got_tokenizer
Tags: sentence-transformers · English · License: mit
got_tokenizer (1.24 MB, 1 contributor)
History: 3 commits
Latest commit: d6a544f (verified), "Update README.md" by hash-map, 4 months ago
File               Size       Last commit        When
.gitattributes     1.52 kB    initial commit     4 months ago
README.md          71 Bytes   Update README.md   4 months ago
icefire_spm.model  743 kB     Upload 3 files     4 months ago
icefire_spm.vocab  492 kB     Upload 3 files     4 months ago
usage.py           767 Bytes  Upload 3 files     4 months ago