Tags: Question Answering · Transformers · PyTorch · TensorFlow · JAX · Vietnamese · t5 · text2text-generation · summarization · translation · text-generation-inference
Instructions for using VietAI/vit5-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use VietAI/vit5-base with Transformers:

```python
# Use a pipeline as a high-level helper.
# ViT5 is a T5-style seq2seq model, so the matching pipeline task
# is "text2text-generation" (the "question-answering" pipeline expects
# an extractive QA head).
from transformers import pipeline

pipe = pipeline("text2text-generation", model="VietAI/vit5-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("VietAI/vit5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("VietAI/vit5-base")
```

- Inference
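Once the tokenizer and model are loaded, inference follows the standard seq2seq pattern: tokenize, call `generate`, then decode. The snippet below is a minimal sketch; the input sentence and generation settings are illustrative assumptions, not documented usage for this checkpoint.

```python
# Minimal sketch of seq2seq inference with ViT5.
# The input text and generation parameters are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("VietAI/vit5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("VietAI/vit5-base")

text = "Xin chào"  # a Vietnamese input sentence (placeholder)
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```

For task-specific use (summarization, translation), fine-tuned ViT5 checkpoints and their expected input formats are described in the VietAI repositories.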
- Notebooks
- Google Colab
- Kaggle
Commit History
Update config.json ffd2f77
Update README.md 1c12345
Fix empty input a252ac1
root committed
Update README.md 8bba306
Update README.md 25dc19d
Update README.md 1e7647a
Update TF and Flax model 04d9605
root committed
Update model 838f57a
root committed
Update README.md f93ae9e
Upload tokenizer.json f8bef15
Update config.json 3d7934c
Update README.md 4e225eb
Update README.md be7524f
Update README.md 82ff163
Update README.md 723f4f0
Update README.md 56ad7aa
Update README.md fef6c2a
Create README.md e49d9bf
Add flax model 931b33a
root committed
Add vit5 base model 72d0543
root committed