End of training

README.md CHANGED

@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss:
-- Meteor Score: {'meteor': 0.
+- Loss: 2.3179
+- Meteor Score: {'meteor': 0.5631808848167792}
 
 ## Model description
 
@@ -42,8 +42,8 @@ The following hyperparameters were used during training:
 - train_batch_size: 128
 - eval_batch_size: 128
 - seed: 42
-- gradient_accumulation_steps:
-- total_train_batch_size:
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 256
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_steps: 500
@@ -52,48 +52,48 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Meteor Score
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+| Training Loss | Epoch | Step | Validation Loss | Meteor Score                   |
+|:-------------:|:-----:|:----:|:---------------:|:------------------------------:|
+| 4.9891        | 5.0   | 5    | 4.8898          | {'meteor': 0.4977448904730416} |
+| 4.9871        | 10.0  | 10   | 4.8871          | {'meteor': 0.4986783896823098} |
+| 4.9807        | 15.0  | 15   | 4.8787          | {'meteor': 0.5039923089582024} |
+| 4.9703        | 20.0  | 20   | 4.8695          | {'meteor': 0.5007229587447326} |
+| 4.9552        | 25.0  | 25   | 4.8554          | {'meteor': 0.5040915299837979} |
+| 4.937         | 30.0  | 30   | 4.8380          | {'meteor': 0.5048006043123706} |
+| 4.9144        | 35.0  | 35   | 4.8172          | {'meteor': 0.5053354754203138} |
+| 4.8884        | 40.0  | 40   | 4.7941          | {'meteor': 0.5052519582427984} |
+| 4.8595        | 45.0  | 45   | 4.7648          | {'meteor': 0.5155678605148059} |
+| 4.8273        | 50.0  | 50   | 4.7351          | {'meteor': 0.5141758460997891} |
+| 4.7906        | 55.0  | 55   | 4.7006          | {'meteor': 0.5161210615484346} |
+| 4.7514        | 60.0  | 60   | 4.6661          | {'meteor': 0.5227861070655556} |
+| 4.7102        | 65.0  | 65   | 4.6277          | {'meteor': 0.5254243476755014} |
+| 4.6657        | 70.0  | 70   | 4.5848          | {'meteor': 0.5268576504702123} |
+| 4.6174        | 75.0  | 75   | 4.5400          | {'meteor': 0.5327203741360704} |
+| 4.5668        | 80.0  | 80   | 4.4927          | {'meteor': 0.5328385720589747} |
+| 4.5133        | 85.0  | 85   | 4.4344          | {'meteor': 0.5379080722700745} |
+| 4.4563        | 90.0  | 90   | 4.3847          | {'meteor': 0.5351712288811252} |
+| 4.3961        | 95.0  | 95   | 4.3223          | {'meteor': 0.5459124584203718} |
+| 4.3328        | 100.0 | 100  | 4.2619          | {'meteor': 0.5450854025913955} |
+| 4.2664        | 105.0 | 105  | 4.1984          | {'meteor': 0.5396980345283648} |
+| 4.1973        | 110.0 | 110  | 4.1254          | {'meteor': 0.5436701507478281} |
+| 4.1255        | 115.0 | 115  | 4.0597          | {'meteor': 0.5443380961541314} |
+| 4.0497        | 120.0 | 120  | 3.9792          | {'meteor': 0.5501741486578466} |
+| 3.9694        | 125.0 | 125  | 3.9008          | {'meteor': 0.549708872514102}  |
+| 3.8871        | 130.0 | 130  | 3.8238          | {'meteor': 0.548104982356801}  |
+| 3.8014        | 135.0 | 135  | 3.7364          | {'meteor': 0.5554349932565801} |
+| 3.7128        | 140.0 | 140  | 3.6483          | {'meteor': 0.55742212703008}   |
+| 3.6206        | 145.0 | 145  | 3.5535          | {'meteor': 0.554548892112528}  |
+| 3.5247        | 150.0 | 150  | 3.4614          | {'meteor': 0.5577247301342534} |
+| 3.4264        | 155.0 | 155  | 3.3604          | {'meteor': 0.5567178684938024} |
+| 3.3247        | 160.0 | 160  | 3.2577          | {'meteor': 0.5542593803491448} |
+| 3.2182        | 165.0 | 165  | 3.1494          | {'meteor': 0.5517431992593748} |
+| 3.1093        | 170.0 | 170  | 3.0413          | {'meteor': 0.5609162269594461} |
+| 2.997         | 175.0 | 175  | 2.9274          | {'meteor': 0.5595566101373656} |
+| 2.8804        | 180.0 | 180  | 2.8130          | {'meteor': 0.5594850153568144} |
+| 2.7608        | 185.0 | 185  | 2.6924          | {'meteor': 0.5605989110400809} |
+| 2.6389        | 190.0 | 190  | 2.5684          | {'meteor': 0.5608506949746257} |
+| 2.5144        | 195.0 | 195  | 2.4447          | {'meteor': 0.5655230759592001} |
+| 2.3862        | 200.0 | 200  | 2.3179          | {'meteor': 0.5631808848167792} |
 
 
 ### Framework versions
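The learning-rate schedule named in the hyperparameters (`lr_scheduler_type: cosine` with `lr_scheduler_warmup_steps: 500`) can be sketched in plain Python. Only the warmup length and the cosine shape come from the card; `total_steps` and `base_lr` below are illustrative placeholders, not values from this training run:

```python
import math

def cosine_with_warmup(step, warmup_steps=500, total_steps=10_000, base_lr=1.0):
    """Linear warmup for `warmup_steps`, then cosine decay toward zero.

    `total_steps` and `base_lr` are illustrative placeholders; only
    warmup_steps=500 and the cosine shape come from the model card.
    """
    if step < warmup_steps:
        # Warmup: scale linearly from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay: progress goes 0 -> 1 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

In practice `transformers` builds this schedule internally from the `lr_scheduler_type` and warmup arguments; the sketch is just to show the shape the card's settings imply.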
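The relationship between the listed batch settings (`train_batch_size: 128`, `gradient_accumulation_steps: 2`, `total_train_batch_size: 256`) can be illustrated with a small stdlib-only sketch; the `accumulated_mean` helper is hypothetical, used only to show why accumulating two micro-batches is equivalent to one larger batch:

```python
# Effective batch size implied by the hyperparameters above:
# 128 samples per forward pass, accumulated over 2 passes -> 256.
per_device_batch = 128
accum_steps = 2
total_batch = per_device_batch * accum_steps  # total_train_batch_size: 256

def accumulated_mean(micro_batches):
    """Average of per-micro-batch means, each weighted by 1/num_batches,
    mimicking how gradient accumulation averages loss over the full batch."""
    return sum(sum(mb) / len(mb) for mb in micro_batches) / len(micro_batches)

# With equal-sized micro-batches, accumulating two means over 128 samples
# reproduces the single mean over all 256 samples.
samples = [float(i) for i in range(total_batch)]
micro = [samples[:per_device_batch], samples[per_device_batch:]]
```

This is why the card reports a total train batch size of 256 even though each step only sees 128 samples at a time.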