- GAIA: a benchmark for General AI Assistants
  Paper • 2311.12983 • Published • 207
- Zephyr: Direct Distillation of LM Alignment
  Paper • 2310.16944 • Published • 122
- SmolLM2: When Smol Goes Big -- Data-Centric Training of a Small Language Model
  Paper • 2502.02737 • Published • 229
- Global MMLU: Understanding and Addressing Cultural and Linguistic Biases in Multilingual Evaluation
  Paper • 2412.03304 • Published • 19
Collections
Collections including paper arxiv:2310.16944
- A Mechanistic Understanding of Alignment Algorithms: A Case Study on DPO and Toxicity
  Paper • 2401.01967 • Published
- Secrets of RLHF in Large Language Models Part I: PPO
  Paper • 2307.04964 • Published • 29
- Zephyr: Direct Distillation of LM Alignment
  Paper • 2310.16944 • Published • 122
- LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders
  Paper • 2404.05961 • Published • 66
- PERL: Parameter Efficient Reinforcement Learning from Human Feedback
  Paper • 2403.10704 • Published • 60
- ReFT: Representation Finetuning for Language Models
  Paper • 2404.03592 • Published • 98
- Ferret-v2: An Improved Baseline for Referring and Grounding with Large Language Models
  Paper • 2404.07973 • Published • 33
- Zephyr: Direct Distillation of LM Alignment
  Paper • 2310.16944 • Published • 122
- FinTral: A Family of GPT-4 Level Multimodal Financial Large Language Models
  Paper • 2402.10986 • Published • 80
- bigcode/starcoder2-15b
  Text Generation • Updated • 8.27k • 604
- Zephyr: Direct Distillation of LM Alignment
  Paper • 2310.16944 • Published • 122
- mixedbread-ai/mxbai-rerank-large-v1
  Text Ranking • Updated • 77.2k • 129
- Metadata Might Make Language Models Better
  Paper • 2211.10086 • Published • 4
- Empirical Analysis of the Strengths and Weaknesses of PEFT Techniques for LLMs
  Paper • 2304.14999 • Published • 2
- PEFT for Speech: Unveiling Optimal Placement, Merging Strategies, and Ensemble Techniques
  Paper • 2401.02122 • Published • 2
- Zephyr: Direct Distillation of LM Alignment
  Paper • 2310.16944 • Published • 122