Papers-MoE

A Hugging Face collection by sugatoray • updated Apr 8, 2024 • 1 upvote

Papers on Mixture of Experts (MoE)

  • Branch-Train-MiX: Mixing Expert LLMs into a Mixture-of-Experts LLM

    Paper • arXiv:2403.07816 • Published Mar 12, 2024 • 42 upvotes

  • OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models

    Paper • arXiv:2402.01739 • Published Jan 29, 2024 • 29 upvotes

  • MoE-LLaVA: Mixture of Experts for Large Vision-Language Models

    Paper • arXiv:2401.15947 • Published Jan 29, 2024 • 53 upvotes

  • Mixture-of-LoRAs: An Efficient Multitask Tuning for Large Language Models

    Paper • arXiv:2403.03432 • Published Mar 6, 2024 • 1 upvote
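As context for the list above, here is a minimal sketch of the core mechanism these papers build on: a top-k gated Mixture-of-Experts layer, where a learned router sends each token to its k highest-scoring expert networks. This is an illustrative PyTorch example under assumed names and hyperparameters (MoELayer, d_model=512, n_experts=8, top_k=2), not the implementation from any paper in this collection.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Top-k gated MoE layer: each token is processed by its k highest-scoring experts."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.gate = nn.Linear(d_model, n_experts)
        # Experts: independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model)
        scores = self.gate(x)                                    # (n_tokens, n_experts)
        weights, expert_idx = scores.topk(self.top_k, dim=-1)    # both (n_tokens, top_k)
        weights = F.softmax(weights, dim=-1)                     # renormalize over chosen experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            for k in range(self.top_k):
                mask = expert_idx[:, k] == e                     # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Hypothetical usage: 16 tokens with an assumed d_model of 512.
tokens = torch.randn(16, 512)
layer = MoELayer(d_model=512, d_ff=2048)
print(layer(tokens).shape)  # torch.Size([16, 512])
```

The per-expert loop keeps the sketch readable; production MoE implementations instead batch tokens per expert and add load-balancing objectives, which are among the topics the papers above address.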