duplicated_from: bigcode-data/starcoderbase-1b
---

# Model Summary

The Narrow Transformer (NT) model **NT-Java-1.1B** is an open-source specialized code model built by extending the pre-training of StarCoderBase-1B, designed for coding tasks in the Java programming language. The model is a decoder-only transformer with Multi-Query Attention and a context length of 8,192 tokens. It was trained on the Java subset of the StarCoderData dataset, which amounts to ~22B tokens.

- **Repository:** [bigcode/Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- **Paper:**
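As a decoder-only code model, it can be used for completion with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: the checkpoint id `infosys/NT-Java-1.1B` is an assumption and should be replaced with the actual model repository id.

```python
# Minimal Java code-completion sketch with Hugging Face transformers.
# NOTE: the checkpoint id below is an assumption; substitute the real repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "infosys/NT-Java-1.1B"  # hypothetical repository id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# A partial Java snippet; the model continues it.
prompt = "public class HelloWorld {\n    public static void main(String[] args) {"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```

Because the model was trained on Java only, prompts in other languages are unlikely to complete well; keep prompts within its 8,192-token context window.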