Update README.md
README.md CHANGED
@@ -133,11 +133,11 @@ The model, NT-Java-1.1B, has been trained on publicly available datasets and com
 - **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch)
 
 # License
-The model checkpoint and vocabulary file are licensed under the [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).Under the license, you must evaluate if your use case does not violate the use-case restriction under Attachment A of the License. Any modification of the model (finetuning or extended pre training) for further downstream task needs to be released under [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
+The model checkpoint and vocabulary file are licensed under the [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement). Under the license, you must evaluate whether your use case violates the use-case restrictions under Attachment A of the License. Any modification of the model (fine-tuning or extended pre-training) for further downstream tasks must be released under [BigCode OpenRAIL-M v1](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
 # Citation
 ```
 @article{li2023starcoder,
-title={NARROW TRANSFORMER: STARCODER-BASED JAVA-LM FOR DESKTOP},
+title={NARROW TRANSFORMER: STARCODER-BASED JAVA-LM FOR DESKTOP},
 author={Kamalkumar Rathinasamy and Balaji A J and Rajab Ali Mondal and Ankush Kumar and Harshini K and Gagan Gayari and Sreenivasa Raghavan Karumboor Seshadri},
 year={2024},
 eprint={2305.06161},
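For readers landing on this diff, the `- **Neural networks:** PyTorch` context line is the model's inference dependency. Below is a minimal sketch of loading NT-Java-1.1B for Java code completion with PyTorch and Hugging Face `transformers`; the Hub id `infosys/NT-Java-1.1B`, the prompt, and the generation settings are illustrative assumptions, not part of this commit.

```python
# Minimal sketch (not from this commit): loading NT-Java-1.1B for Java
# code completion. The Hub id below is an assumption for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "infosys/NT-Java-1.1B"  # assumed Hugging Face Hub id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Complete the body of a simple Java main method.
prompt = "public class HelloWorld {\n    public static void main(String[] args) {"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```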