Dataset: FiscalNote/billsum
How to use callaghanmt/billsum_model with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("callaghanmt/billsum_model")
model = AutoModelForSeq2SeqLM.from_pretrained("callaghanmt/billsum_model")
```

This model is a fine-tuned version of t5-small on the billsum dataset. It achieves the following results on the evaluation set (final epoch):

- Loss: 2.5059
- Rouge1: 0.1366
- Rouge2: 0.047
- Rougel: 0.1147
- Rougelsum: 0.1147
- Gen Len: 19.0
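Once loaded, the model can be used for summarization. A minimal sketch, assuming the standard T5 `summarize: ` task prefix (the bill text below is a made-up placeholder):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("callaghanmt/billsum_model")
model = AutoModelForSeq2SeqLM.from_pretrained("callaghanmt/billsum_model")

# Hypothetical bill text; T5 models are conventionally prompted with a task prefix
bill_text = "An Act to amend the Internal Revenue Code to provide a credit for small businesses."
prompt = "summarize: " + bill_text

inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Note that without an explicit `max_new_tokens` (or `max_length`) argument, generation falls back to the model's default length limit, which is why the Gen Len column below sits at 19 tokens.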
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
### Training results

The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 62 | 2.8082 | 0.1268 | 0.0352 | 0.1057 | 0.1057 | 19.0 |
| No log | 2.0 | 124 | 2.5861 | 0.1332 | 0.0427 | 0.1109 | 0.1108 | 19.0 |
| No log | 3.0 | 186 | 2.5232 | 0.1367 | 0.0476 | 0.1151 | 0.1150 | 19.0 |
| No log | 4.0 | 248 | 2.5059 | 0.1366 | 0.0470 | 0.1147 | 0.1147 | 19.0 |
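The Rouge1 column above is unigram-overlap F1 between the generated summary and the reference. A minimal sketch of that computation (the actual `rouge_score` package also applies stemming, so real scores can differ slightly):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Unigram-overlap F1 between a predicted and a reference summary."""
    pred = Counter(prediction.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((pred & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Toy example with made-up summaries
score = rouge1_f1(
    "the bill amends the tax code",
    "this bill amends the internal revenue code",
)
print(round(score, 4))  # → 0.6154
```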
Base model: google-t5/t5-small