# HF-style model folder: hf_model
This folder was produced from a local checkpoint (`gpt_decoder_checkpoint.pt`) and contains a Hugging Face-compatible model and tokenizer, so you can load it locally with `transformers`.
## Included files
- `config.json`: model configuration (GPT-2-compatible fields: `vocab_size`, `n_positions`, `n_ctx`, `n_embd`, `n_layer`, `n_head`)
- `pytorch_model.bin` (or `tf_model.h5` / `model.safetensors`): model weights (PyTorch)
- `tokenizer.json`, `vocab.json`, `merges.txt`, `tokenizer_config.json`, `special_tokens_map.json`: tokenizer assets (the GPT-2 tokenizer was reused and saved here)
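Before loading, it can help to verify that the folder actually contains the files listed above. A minimal stdlib-only sketch (the `check_model_folder` helper is illustrative, not part of this repo):

```python
from pathlib import Path

# File names from the list above.
REQUIRED = ["config.json", "tokenizer_config.json", "special_tokens_map.json"]
WEIGHTS = ["pytorch_model.bin", "model.safetensors", "tf_model.h5"]

def check_model_folder(folder: str) -> list[str]:
    """Return a list of problems found in an HF-style model folder (empty = OK)."""
    root = Path(folder)
    problems = [f"missing {name}" for name in REQUIRED if not (root / name).exists()]
    # At least one weight file must be present.
    if not any((root / w).exists() for w in WEIGHTS):
        problems.append("no weight file (pytorch_model.bin / model.safetensors / tf_model.h5)")
    return problems
```

Running `check_model_folder("hf_model")` before `from_pretrained` gives a clearer error than a failed load.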
Note: If you have a different tokenizer used during training, replace the tokenizer files in this folder with your original tokenizer files for best results.
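Swapping the tokenizer in is just a file copy. A minimal sketch, assuming your original tokenizer assets sit in some other folder (the `copy_tokenizer` helper is hypothetical, not part of this repo):

```python
import shutil
from pathlib import Path

# Tokenizer asset names used by this folder.
TOKENIZER_FILES = ["tokenizer.json", "vocab.json", "merges.txt",
                   "tokenizer_config.json", "special_tokens_map.json"]

def copy_tokenizer(src: str, dst: str) -> list[str]:
    """Copy whichever tokenizer assets exist in src into dst; return the names copied."""
    copied = []
    for name in TOKENIZER_FILES:
        src_path = Path(src) / name
        if src_path.exists():
            shutil.copy2(src_path, Path(dst) / name)
            copied.append(name)
    return copied
```

For example, `copy_tokenizer("my_training_run/tokenizer", "hf_model")` overwrites only the assets your run actually produced.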
## Quick local usage
Load the model and tokenizer from this local folder with `transformers`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("./hf_model")
model = AutoModelForCausalLM.from_pretrained("./hf_model")

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Running the Gradio app in this repo
`app.py` in the project is already configured to use `model_name = "hf_model"` (the local folder). Run the app with the project's venv Python:

```shell
/path/to/your/project/.venv/bin/python app.py
# then open http://127.0.0.1:7860 in your browser
```
To create a public Gradio link, edit `app.py`, change `demo.launch()` to `demo.launch(share=True)`, and restart the process.
## Uploading to the Hugging Face Hub
If you want to publish this folder to the Hub, you can either:
- Use `huggingface_hub` from Python (login required):

  ```python
  from huggingface_hub import create_repo, upload_folder

  create_repo("pragsyy1729/decoder_shakespeare", exist_ok=True)
  upload_folder(folder_path="hf_model", repo_id="pragsyy1729/decoder_shakespeare")
  ```
- Or use `git lfs` and push the folder into a repository created on the Hub.
## Caveats
- This folder was created by mapping and (when necessary) transposing tensors from a custom checkpoint into a `GPT2LMHeadModel`. While the script attempted to match shapes automatically, verify generation quality yourself.
- If you trained with a custom tokenizer or architecture, prefer exporting the model with `model.save_pretrained()` and `tokenizer.save_pretrained()` from the original training environment.
## Contact
If something looks off when loading or generating, open an issue or message me with the exact error and I can help debug further.