calculator_model_test / tokenizer_config.json
Commit 672f28d (verified): "Training in progress, step 240"
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
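As a minimal sketch, the config above can be inspected with plain `json` from the Python standard library. Note that the very large `model_max_length` is exactly `int(1e30)`, the sentinel value the `transformers` library writes when no practical maximum sequence length was recorded for the tokenizer (an assumption based on the library's `VERY_LARGE_INTEGER` convention, not stated in the file itself):

```python
import json

# The tokenizer_config.json contents shown above.
config_text = """
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
"""

config = json.loads(config_text)

# The special tokens this tokenizer declares.
print(config["cls_token"], config["eos_token"], config["pad_token"])

# model_max_length equals int(1e30): effectively "no length limit recorded".
print(config["model_max_length"] == int(1e30))
```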