rajabmondal committed on
Commit dc4aef9 · verified · 1 Parent(s): 4e71283

update desc

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -47,7 +47,7 @@ Large code models require specialized hardware like GPUs for inference, highligh
 # pip install -q transformers
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-checkpoint = "infosys/javacoder-1b"
+checkpoint = "infosys/NT-Java-1.1B"
 device = "cuda" # for GPU usage or "cpu" for CPU usage
 
 tokenizer = AutoTokenizer.from_pretrained(checkpoint)
@@ -102,7 +102,7 @@ The model is licensed under the Apache license 2.0 license agreement. You can fi
 ```
 @article{li2023starcoder,
 title={JavaCoder: may the source be with you!},
-author={Raymond Li and Loubna Ben Allal and Yangtian Zi and Niklas Muennighoff and Denis Kocetkov and Chenghao Mou and Marc Marone and Christopher Akiki and Jia Li and Jenny Chim and Qian Liu and Evgenii Zheltonozhskii and Terry Yue Zhuo and Thomas Wang and Olivier Dehaene and Mishig Davaadorj and Joel Lamy-Poirier and João Monteiro and Oleh Shliazhko and Nicolas Gontier and Nicholas Meade and Armel Zebaze and Ming-Ho Yee and Logesh Kumar Umapathi and Jian Zhu and Benjamin Lipkin and Muhtasham Oblokulov and Zhiruo Wang and Rudra Murthy and Jason Stillerman and Siva Sankalp Patel and Dmitry Abulkhanov and Marco Zocca and Manan Dey and Zhihan Zhang and Nour Fahmy and Urvashi Bhattacharyya and Wenhao Yu and Swayam Singh and Sasha Luccioni and Paulo Villegas and Maxim Kunakov and Fedor Zhdanov and Manuel Romero and Tony Lee and Nadav Timor and Jennifer Ding and Claire Schlesinger and Hailey Schoelkopf and Jan Ebert and Tri Dao and Mayank Mishra and Alex Gu and Jennifer Robinson and Carolyn Jane Anderson and Brendan Dolan-Gavitt and Danish Contractor and Siva Reddy and Daniel Fried and Dzmitry Bahdanau and Yacine Jernite and Carlos Muñoz Ferrandis and Sean Hughes and Thomas Wolf and Arjun Guha and Leandro von Werra and Harm de Vries},
+author={},
 year={2023},
 eprint={2305.06161},
 archivePrefix={arXiv},
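For context, the README snippet touched by the first hunk continues beyond the lines shown in the diff. A minimal runnable version using the new checkpoint might look like the sketch below; the model-loading line, prompt, and `generate` call are illustrative assumptions, not part of the committed README.

```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "infosys/NT-Java-1.1B"  # checkpoint name introduced by this commit
device = "cuda"  # for GPU usage or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Illustrative prompt and generation settings (assumed, not from the diff)
inputs = tokenizer("public class HelloWorld {", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0]))
```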