How to use SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("summarization", model="SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")
model = AutoModel.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune")
```
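Note that `AutoModel` loads the bare encoder-decoder without a generation head, so it cannot produce text on its own. To actually generate a program you would typically load the checkpoint with `AutoModelForSeq2SeqLM` and call `generate`. A minimal sketch is below; the `generate_program` helper and its parameters are illustrative and not part of the original card:

```python
# Sketch of direct generation with this checkpoint. The helper name and
# max_new_tokens value are illustrative assumptions, not from the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune"

def generate_program(description: str, max_new_tokens: int = 64) -> str:
    """Generate a program from an input description string."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(description, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

`AutoModelForSeq2SeqLM` resolves to the T5 conditional-generation class for this checkpoint, which is what the pipeline above uses under the hood.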
Hi! 👋 This PR adds some additional information to the model card, based on the format we are using as part of our effort to standardise model cards at Hugging Face. Feel free to merge if you are ok with the changes! (cc @Marissa @Meg @Nazneen)