---
license: apache-2.0
base_model: Salesforce/codet5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: codet5-small-generate-docstrings-codexglue-python-bs-32
  results: []
---
# codet5-small-generate-docstrings-codexglue-python-bs-32

This model is a fine-tuned version of [Salesforce/codet5-small](https://huggingface.co/Salesforce/codet5-small) for generating Python docstrings. The dataset name was not recorded automatically by the Trainer; the model name points to the CodeXGLUE code-to-text (Python) data.
It achieves the following results on the evaluation set (a sketch for computing these metrics follows the list):
- Loss: 2.0595
- Rouge-1: 0.3535
- Rouge-2: 0.1748
- Rouge-L: 0.3195
- Rouge-Lsum: 0.336
- Gen Len (average length of generated output): 16.1317
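The ROUGE values above come from the Trainer's evaluation loop. As a rough illustration only, not the exact evaluation script, comparable scores can be computed with the `evaluate` library given lists of generated and reference docstrings:

```python
import evaluate

# Hypothetical toy data; the actual evaluation set is not reproduced here.
predictions = ["Return the sum of two numbers."]
references = ["Returns the sum of a and b."]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```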
## Model description

[CodeT5](https://huggingface.co/Salesforce/codet5-small) is an encoder-decoder Transformer pre-trained on source code. This checkpoint fine-tunes the small variant as a sequence-to-sequence model that takes a Python function as input and generates a short natural-language docstring summarizing it.
## Intended uses & limitations

The model is intended for drafting docstring suggestions for Python functions, for example in documentation tooling or an editor assistant. Generated summaries are short (about 16 tokens on average during evaluation) and overlap only moderately with reference docstrings (Rouge-L about 0.32), so suggestions should be reviewed and edited rather than accepted verbatim. A usage sketch is shown below.
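The following is a minimal inference sketch rather than an official snippet from the training run: it assumes the checkpoint is loaded by its Hub repo id (abbreviated here) and uses the standard `transformers` seq2seq API; the generation parameters are illustrative.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: replace with the full Hub repo id that hosts this checkpoint.
model_id = "codet5-small-generate-docstrings-codexglue-python-bs-32"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True)

# Evaluation averaged roughly 16 generated tokens, so outputs are short summaries.
outputs = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```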
## Training and evaluation data

The exact dataset was not recorded by the Trainer. The model name points to the CodeXGLUE code-to-text benchmark for Python, which pairs function bodies with their docstrings; at a train batch size of 32, the 5,952 optimization steps per epoch shown in the results table correspond to roughly 190,000 training examples.
## Training procedure
### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
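As a hedged reconstruction rather than the original training script, the settings above roughly map onto the following `Seq2SeqTrainingArguments`; the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="codet5-small-generate-docstrings-codexglue-python-bs-32",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumed: the results table reports metrics once per epoch
    predict_with_generate=True,   # assumed: required so ROUGE is computed on generated text
)
```

The listed Adam betas and epsilon match the Trainer defaults, so the optimizer does not need to be configured explicitly.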
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge-1 | Rouge-2 | Rouge-L | Rouge-Lsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------:|
| 2.6758        | 1.0   | 5952  | 2.2761          | 0.3458  | 0.1674  | 0.312   | 0.329      | 16.375  |
| 2.4212        | 2.0   | 11904 | 2.1955          | 0.3471  | 0.1698  | 0.3143  | 0.33       | 15.3815 |
| 2.3395        | 3.0   | 17856 | 2.1480          | 0.3501  | 0.1715  | 0.3159  | 0.3329     | 16.4806 |
| 2.2889        | 4.0   | 23808 | 2.1198          | 0.3506  | 0.1722  | 0.3167  | 0.3333     | 15.974  |
| 2.252         | 5.0   | 29760 | 2.0984          | 0.3524  | 0.1738  | 0.3184  | 0.3351     | 16.2197 |
| 2.226         | 6.0   | 35712 | 2.0851          | 0.3518  | 0.1736  | 0.3182  | 0.3345     | 15.9882 |
| 2.2047        | 7.0   | 41664 | 2.0746          | 0.3521  | 0.1738  | 0.3185  | 0.3348     | 16.2078 |
| 2.1894        | 8.0   | 47616 | 2.0654          | 0.3532  | 0.1748  | 0.3194  | 0.3357     | 16.1077 |
| 2.1798        | 9.0   | 53568 | 2.0617          | 0.3528  | 0.1749  | 0.3192  | 0.3355     | 16.0418 |
| 2.1732        | 10.0  | 59520 | 2.0595          | 0.3535  | 0.1748  | 0.3195  | 0.336      | 16.1317 |
### Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4.dev0
- Tokenizers 0.13.3
|