shivanandmn's Collections
dpo-dataset
deployable-llm
Concept papers

dpo-dataset

Updated Dec 31, 2024
  • mlabonne/orpo-dpo-mix-40k

    Viewer • Updated Oct 17, 2024 • 44.2k • 1.94k • 281
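
A minimal sketch of loading this collection's dataset with the Hugging Face datasets library; the "train" split name and the prompt/chosen/rejected column layout common to DPO preference mixes are assumptions here, not details taken from this page.

from datasets import load_dataset

# Download (or load from cache) the DPO preference mix listed above.
dataset = load_dataset("mlabonne/orpo-dpo-mix-40k", split="train")

print(dataset)                # number of rows and column names
print(dataset[0]["prompt"])   # assumed column: the original user prompt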