twelveand0 committed (verified)
Commit 7069796 · 1 Parent(s): 9692a99

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -26,7 +26,7 @@ tags:
 
 ## Introduction
 
- Ling-Coder-Lite is a MoE LLM provided and open-sourced by InclusionAI, with 16.8 billion total parameters and 2.75 billion activated parameters. Ling-Coder-Lite performs impressively on coding tasks compared to existing models in the industry. Specifically, Ling-Coder-Lite was further pre-trained from an intermediate checkpoint of Ling-Lite, incorporating an additional 3 trillion tokens. This extended pre-training significantly boosts the coding abilities of Ling-Lite while preserving its strong performance in general language tasks. This model is described in the paper [Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM](https://huggingface.co/papers/2503.17793).
+ Ling-Coder-Lite is a MoE LLM provided and open-sourced by InclusionAI, with 16.8 billion total parameters and 2.75 billion activated parameters. Ling-Coder-Lite performs impressively on coding tasks compared to existing models in the industry. Specifically, Ling-Coder-Lite was further pre-trained from an intermediate checkpoint of Ling-Lite, incorporating an additional 3 trillion tokens. This extended pre-training significantly boosts the coding abilities of Ling-Lite while preserving its strong performance in general language tasks. More details are described in the technical report [Ling-Coder-TR](https://huggingface.co/papers/2503.17793).
 
 ## Model Downloads
 
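Since the changed paragraph is a model-card introduction, a minimal sketch of loading the model with Hugging Face Transformers may help for context. The repo id `inclusionAI/Ling-Coder-lite`, the `trust_remote_code` flag, and the generation settings below are assumptions based on common model-card conventions, not taken from this diff.

```python
# Minimal sketch: loading Ling-Coder-Lite via Transformers.
# Repo id and settings are assumptions, not confirmed by this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "inclusionAI/Ling-Coder-lite"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",      # let Transformers pick the checkpoint dtype
    device_map="auto",       # place the MoE weights across available devices
    trust_remote_code=True,  # assumed: custom MoE architectures often ship their own modeling code
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```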