## Model
This model is a fine-tuned version of [BigCode/SantaCoder](https://huggingface.co/bigcode/santacoder) on the Ruby portion of [The Stack](https://huggingface.co/datasets/bigcode/the-stack-dedup).
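A minimal usage sketch with 🤗 Transformers (the repository id below is a placeholder for this model; `trust_remote_code=True` is required because SantaCoder ships custom modeling code):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "<this-model-id>"  # placeholder: substitute this repository's id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

# Left-to-right completion of a Ruby snippet.
inputs = tokenizer("def fibonacci(n)", return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))
```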
## Training
This model was trained using character-level fill-in-the-middle (FIM) with [this script](https://github.com/Stillerman/santacoder-finetuning), invoked as follows:
```bash
train.py --model_path=bigcode/santacoder --dataset_name=bigcode/the-stack-dedup \
--subset=data/ruby --data_column content --split=train \
--seq_length 2048 --max_steps 4000 --batch_size 3 \
--gradient_accumulation_steps 8 --learning_rate 5e-5 \
--num_warmup_steps 500 --eval_freq 1000 --save_freq 1000 \
--log_freq 1 --num_workers=12 --no_fp16 --streaming \
--fim_rate=0.5 --fim_spm_rate=0.5
```
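Here `--fim_rate=0.5` applies the fill-in-the-middle transform to roughly half of the training samples, and `--fim_spm_rate=0.5` renders half of the transformed samples in suffix-prefix-middle (SPM) order rather than prefix-suffix-middle (PSM). Below is a minimal sketch of a character-level FIM transform under those assumptions, using SantaCoder's `<fim-prefix>`/`<fim-suffix>`/`<fim-middle>` sentinel tokens; the exact splitting logic in the training script may differ:
```python
import random

FIM_PREFIX, FIM_SUFFIX, FIM_MIDDLE = "<fim-prefix>", "<fim-suffix>", "<fim-middle>"

def fim_transform(text: str, fim_rate: float = 0.5, fim_spm_rate: float = 0.5) -> str:
    # With probability 1 - fim_rate, keep the sample as plain left-to-right text.
    if random.random() >= fim_rate:
        return text
    # Split at two random character positions: prefix | middle | suffix.
    lo, hi = sorted(random.randint(0, len(text)) for _ in range(2))
    prefix, middle, suffix = text[:lo], text[lo:hi], text[hi:]
    if random.random() < fim_spm_rate:
        # SPM order: both sentinels up front, then suffix, then prefix + middle.
        return FIM_PREFIX + FIM_SUFFIX + suffix + FIM_MIDDLE + prefix + middle
    # PSM order: the model learns to emit the middle after seeing prefix and suffix.
    return FIM_PREFIX + prefix + FIM_SUFFIX + suffix + FIM_MIDDLE + middle
```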
Training ran for 48 hours on a single 40 GB A100.
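At inference time, the same sentinel tokens let the model infill code rather than only extend it. A hedged sketch of a PSM-style infilling prompt (placeholder repository id again):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "<this-model-id>"  # placeholder: substitute this repository's id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

# Ask the model to fill in the body of a Ruby method; the missing
# middle is generated after the <fim-middle> sentinel.
prompt = "<fim-prefix>def greet(name)\n  <fim-suffix>\nend<fim-middle>"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=16)
print(tokenizer.decode(outputs[0]))
```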
## Performance
Scores on the Ruby HumanEval benchmark from [MultiPL-E](https://nuprl.github.io/MultiPL-E/):
- pass@1 = 0.10
- pass@10 = 0.14
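For reference, pass@k is the standard unbiased estimator from the Codex paper: generate $n$ samples per problem, count the $c$ that pass the unit tests, and compute

$$
\text{pass@}k = \mathbb{E}_{\text{problems}}\left[1 - \frac{\binom{n-c}{k}}{\binom{n}{k}}\right]
$$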