Instructions for using BueormLLC/CleanGPT with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use BueormLLC/CleanGPT with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="BueormLLC/CleanGPT")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("BueormLLC/CleanGPT")
model = AutoModel.from_pretrained("BueormLLC/CleanGPT")
```
- Notebooks
- Google Colab
- Kaggle
CleanGPT
This is a clean model based on the GPT-2 small architecture. It has not been trained on any data; it is an untrained model with freshly initialized weights.
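For scale, the parameter count of the GPT-2 small architecture can be worked out from its hyperparameters. The arithmetic below is a sketch assuming the standard GPT-2 layout (12 layers, hidden size 768, vocabulary 50,257, context 1,024) with tied input/output embeddings:

```python
# Approximate the GPT-2 small parameter count from its hyperparameters.
vocab, ctx, d, layers = 50257, 1024, 768, 12

embeddings = vocab * d + ctx * d  # token + position embeddings
per_layer = (
    3 * d * d + 3 * d             # attention q/k/v projections
    + d * d + d                   # attention output projection
    + 4 * d * d + 4 * d           # MLP up-projection (4x width)
    + 4 * d * d + d               # MLP down-projection
    + 4 * d                       # two layer norms (weight + bias each)
)
final_ln = 2 * d                  # final layer norm

total = embeddings + layers * per_layer + final_ln
print(f"{total:,}")  # 124,439,808 -> the familiar "124M" figure
```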
Why so?
A model in this form is ready to be trained from scratch at any time, so you can build on the architecture directly rather than on GPT-2 itself, whose old training data may limit it and prevent you from extracting its best performance.
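Conceptually, this is the same as initializing the GPT-2 small architecture from its default config instead of loading pretrained weights. A minimal sketch (assuming CleanGPT follows the stock `GPT2Config`; nothing is downloaded here):

```python
from transformers import GPT2Config, GPT2LMHeadModel

# The default GPT2Config corresponds to GPT-2 small:
# 12 layers, 12 attention heads, hidden size 768, vocab 50257.
config = GPT2Config()

# Constructing the model from a config (rather than from_pretrained)
# yields freshly initialized weights -- the same idea as an untrained
# CleanGPT, ready for training from scratch.
model = GPT2LMHeadModel(config)
```

From here the model can go straight into a standard training loop or the `Trainer` API, with no pretrained biases to unlearn.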