aws-neuron / optimum-neuron-cache
License: apache-2.0
main · optimum-neuron-cache / inference-cache-config / trn2
Commit History
use longer sequence length for llama3 on trn2
f8538f0 (verified) · dacorvo (HF Staff) · committed on Jan 27
add trn2 cached configs subdirectory
25e9ebd · dacorvo (HF Staff) · committed on Oct 20, 2025