teamcore/DPO_Py70M_U0_beta0.10
  • 1 contributor
History: 1 commit
vermashresth: initial commit · 590ea48 (verified) · 11 months ago
  • .gitattributes · 1.52 kB · initial commit · 11 months ago