Heiifmx's Collections
Bsbdhdnn

updated May 12, 2024
  • Is DPO Superior to PPO for LLM Alignment? A Comprehensive Study
    Paper • 2404.10719 • Published Apr 16, 2024