# segformer-b0-harimehta-foc-feb21
This model is a fine-tuned version of nvidia/mit-b0 on the hari000/foc-segmentation dataset. It achieves the following results on the evaluation set:
- Loss: 0.0202
- Mean Iou: 0.2651
- Mean Accuracy: 0.5301
- Overall Accuracy: 0.5301
- Accuracy Cask: nan
- Accuracy Foc: 0.5301
- Iou Cask: 0.0
- Iou Foc: 0.5301
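For segmentation, the model follows the standard `SegformerForSemanticSegmentation` interface. The sketch below is illustrative, not taken from this repository: the two-class `id2label` mapping is inferred from the metric names above ("cask", "foc"), the 512×512 input size is an assumption, and a randomly initialised SegFormer-B0 stands in for the fine-tuned checkpoint so the shape flow can be checked offline. In practice you would load the published weights with `from_pretrained("hari000/segformer-b0-harimehta-foc-feb21")`.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Hypothetical label mapping inferred from the metric names on this card.
id2label = {0: "cask", 1: "foc"}

# Default SegformerConfig matches the B0 variant (hidden sizes 32/64/160/256).
# Random weights here; swap in from_pretrained(...) for real inference.
config = SegformerConfig(
    num_labels=len(id2label),
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},
)
model = SegformerForSemanticSegmentation(config)
model.eval()

pixel_values = torch.randn(1, 3, 512, 512)  # one RGB image (assumed size)
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # 1/4 input resolution

# SegFormer predicts at H/4 x W/4; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)  # per-pixel class ids
```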
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 45
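With a linear scheduler and the 1440 total optimization steps shown in the results table below, the learning rate decays from 6e-05 to zero over training. The sketch below assumes no warmup steps, which this card does not state:

```python
BASE_LR = 6e-05      # learning_rate from the hyperparameters above
TOTAL_STEPS = 1440   # final step in the results table (45 epochs)

def linear_lr(step: int) -> float:
    """Learning rate at a given step under a linear decay schedule.

    Assumes zero warmup steps (not stated on this card).
    """
    remaining = max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)
    return BASE_LR * remaining

print(linear_lr(0))     # 6e-05 at the start
print(linear_lr(720))   # 3e-05 halfway through
print(linear_lr(1440))  # 0.0 at the end
```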
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Cask | Accuracy Foc | Iou Cask | Iou Foc |
|---|---|---|---|---|---|---|---|---|---|---|
0.5516 | 0.625 | 20 | 0.5585 | 0.0794 | 0.1588 | 0.1588 | nan | 0.1588 | 0.0 | 0.1588 |
0.4061 | 1.25 | 40 | 0.3763 | 0.0441 | 0.0882 | 0.0882 | nan | 0.0882 | 0.0 | 0.0882 |
0.3363 | 1.875 | 60 | 0.2669 | 0.0023 | 0.0046 | 0.0046 | nan | 0.0046 | 0.0 | 0.0046 |
0.2396 | 2.5 | 80 | 0.2026 | 0.0010 | 0.0021 | 0.0021 | nan | 0.0021 | 0.0 | 0.0021 |
0.1909 | 3.125 | 100 | 0.1579 | 0.0002 | 0.0004 | 0.0004 | nan | 0.0004 | 0.0 | 0.0004 |
0.1587 | 3.75 | 120 | 0.1315 | 0.0002 | 0.0003 | 0.0003 | nan | 0.0003 | 0.0 | 0.0003 |
0.136 | 4.375 | 140 | 0.1181 | 0.0001 | 0.0001 | 0.0001 | nan | 0.0001 | 0.0 | 0.0001 |
0.1239 | 5.0 | 160 | 0.0908 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
0.0973 | 5.625 | 180 | 0.0839 | 0.0352 | 0.0704 | 0.0704 | nan | 0.0704 | 0.0 | 0.0704 |
0.093 | 6.25 | 200 | 0.0745 | 0.0366 | 0.0732 | 0.0732 | nan | 0.0732 | 0.0 | 0.0732 |
0.0757 | 6.875 | 220 | 0.0672 | 0.0013 | 0.0025 | 0.0025 | nan | 0.0025 | 0.0 | 0.0025 |
0.0634 | 7.5 | 240 | 0.0628 | 0.0281 | 0.0562 | 0.0562 | nan | 0.0562 | 0.0 | 0.0562 |
0.0762 | 8.125 | 260 | 0.0573 | 0.0337 | 0.0673 | 0.0673 | nan | 0.0673 | 0.0 | 0.0673 |
0.0512 | 8.75 | 280 | 0.0545 | 0.0317 | 0.0634 | 0.0634 | nan | 0.0634 | 0.0 | 0.0634 |
0.0542 | 9.375 | 300 | 0.0535 | 0.0727 | 0.1454 | 0.1454 | nan | 0.1454 | 0.0 | 0.1454 |
0.0489 | 10.0 | 320 | 0.0496 | 0.0583 | 0.1166 | 0.1166 | nan | 0.1166 | 0.0 | 0.1166 |
0.0361 | 10.625 | 340 | 0.0441 | 0.0586 | 0.1173 | 0.1173 | nan | 0.1173 | 0.0 | 0.1173 |
0.0366 | 11.25 | 360 | 0.0417 | 0.0551 | 0.1102 | 0.1102 | nan | 0.1102 | 0.0 | 0.1102 |
0.05 | 11.875 | 380 | 0.0416 | 0.0692 | 0.1383 | 0.1383 | nan | 0.1383 | 0.0 | 0.1383 |
0.04 | 12.5 | 400 | 0.0393 | 0.0748 | 0.1495 | 0.1495 | nan | 0.1495 | 0.0 | 0.1495 |
0.0321 | 13.125 | 420 | 0.0369 | 0.1059 | 0.2118 | 0.2118 | nan | 0.2118 | 0.0 | 0.2118 |
0.0355 | 13.75 | 440 | 0.0365 | 0.0923 | 0.1846 | 0.1846 | nan | 0.1846 | 0.0 | 0.1846 |
0.0296 | 14.375 | 460 | 0.0374 | 0.1721 | 0.3443 | 0.3443 | nan | 0.3443 | 0.0 | 0.3443 |
0.0395 | 15.0 | 480 | 0.0338 | 0.0990 | 0.1980 | 0.1980 | nan | 0.1980 | 0.0 | 0.1980 |
0.0294 | 15.625 | 500 | 0.0322 | 0.1325 | 0.2650 | 0.2650 | nan | 0.2650 | 0.0 | 0.2650 |
0.034 | 16.25 | 520 | 0.0327 | 0.1611 | 0.3223 | 0.3223 | nan | 0.3223 | 0.0 | 0.3223 |
0.0339 | 16.875 | 540 | 0.0311 | 0.1840 | 0.3680 | 0.3680 | nan | 0.3680 | 0.0 | 0.3680 |
0.0365 | 17.5 | 560 | 0.0310 | 0.2204 | 0.4408 | 0.4408 | nan | 0.4408 | 0.0 | 0.4408 |
0.0185 | 18.125 | 580 | 0.0289 | 0.1810 | 0.3620 | 0.3620 | nan | 0.3620 | 0.0 | 0.3620 |
0.0382 | 18.75 | 600 | 0.0296 | 0.1902 | 0.3804 | 0.3804 | nan | 0.3804 | 0.0 | 0.3804 |
0.0213 | 19.375 | 620 | 0.0285 | 0.1498 | 0.2997 | 0.2997 | nan | 0.2997 | 0.0 | 0.2997 |
0.0225 | 20.0 | 640 | 0.0280 | 0.1862 | 0.3724 | 0.3724 | nan | 0.3724 | 0.0 | 0.3724 |
0.0212 | 20.625 | 660 | 0.0271 | 0.1821 | 0.3642 | 0.3642 | nan | 0.3642 | 0.0 | 0.3642 |
0.0292 | 21.25 | 680 | 0.0270 | 0.1965 | 0.3929 | 0.3929 | nan | 0.3929 | 0.0 | 0.3929 |
0.0253 | 21.875 | 700 | 0.0262 | 0.2048 | 0.4095 | 0.4095 | nan | 0.4095 | 0.0 | 0.4095 |
0.0274 | 22.5 | 720 | 0.0251 | 0.2201 | 0.4401 | 0.4401 | nan | 0.4401 | 0.0 | 0.4401 |
0.0259 | 23.125 | 740 | 0.0254 | 0.2410 | 0.4820 | 0.4820 | nan | 0.4820 | 0.0 | 0.4820 |
0.0211 | 23.75 | 760 | 0.0253 | 0.2554 | 0.5109 | 0.5109 | nan | 0.5109 | 0.0 | 0.5109 |
0.0172 | 24.375 | 780 | 0.0244 | 0.2268 | 0.4535 | 0.4535 | nan | 0.4535 | 0.0 | 0.4535 |
0.0229 | 25.0 | 800 | 0.0250 | 0.2352 | 0.4705 | 0.4705 | nan | 0.4705 | 0.0 | 0.4705 |
0.0189 | 25.625 | 820 | 0.0238 | 0.1980 | 0.3959 | 0.3959 | nan | 0.3959 | 0.0 | 0.3959 |
0.0263 | 26.25 | 840 | 0.0240 | 0.1846 | 0.3693 | 0.3693 | nan | 0.3693 | 0.0 | 0.3693 |
0.0155 | 26.875 | 860 | 0.0237 | 0.2297 | 0.4594 | 0.4594 | nan | 0.4594 | 0.0 | 0.4594 |
0.0229 | 27.5 | 880 | 0.0228 | 0.2472 | 0.4945 | 0.4945 | nan | 0.4945 | 0.0 | 0.4945 |
0.0205 | 28.125 | 900 | 0.0232 | 0.2453 | 0.4906 | 0.4906 | nan | 0.4906 | 0.0 | 0.4906 |
0.0193 | 28.75 | 920 | 0.0229 | 0.2524 | 0.5048 | 0.5048 | nan | 0.5048 | 0.0 | 0.5048 |
0.0222 | 29.375 | 940 | 0.0224 | 0.2806 | 0.5613 | 0.5613 | nan | 0.5613 | 0.0 | 0.5613 |
0.0231 | 30.0 | 960 | 0.0222 | 0.2717 | 0.5434 | 0.5434 | nan | 0.5434 | 0.0 | 0.5434 |
0.0224 | 30.625 | 980 | 0.0220 | 0.2446 | 0.4892 | 0.4892 | nan | 0.4892 | 0.0 | 0.4892 |
0.0269 | 31.25 | 1000 | 0.0220 | 0.2588 | 0.5176 | 0.5176 | nan | 0.5176 | 0.0 | 0.5176 |
0.0229 | 31.875 | 1020 | 0.0227 | 0.2735 | 0.5469 | 0.5469 | nan | 0.5469 | 0.0 | 0.5469 |
0.0211 | 32.5 | 1040 | 0.0218 | 0.2326 | 0.4652 | 0.4652 | nan | 0.4652 | 0.0 | 0.4652 |
0.0212 | 33.125 | 1060 | 0.0215 | 0.2690 | 0.5381 | 0.5381 | nan | 0.5381 | 0.0 | 0.5381 |
0.0197 | 33.75 | 1080 | 0.0213 | 0.2471 | 0.4943 | 0.4943 | nan | 0.4943 | 0.0 | 0.4943 |
0.0206 | 34.375 | 1100 | 0.0212 | 0.2534 | 0.5068 | 0.5068 | nan | 0.5068 | 0.0 | 0.5068 |
0.0216 | 35.0 | 1120 | 0.0214 | 0.2622 | 0.5245 | 0.5245 | nan | 0.5245 | 0.0 | 0.5245 |
0.0176 | 35.625 | 1140 | 0.0209 | 0.2574 | 0.5148 | 0.5148 | nan | 0.5148 | 0.0 | 0.5148 |
0.0158 | 36.25 | 1160 | 0.0209 | 0.2531 | 0.5062 | 0.5062 | nan | 0.5062 | 0.0 | 0.5062 |
0.0154 | 36.875 | 1180 | 0.0209 | 0.2457 | 0.4914 | 0.4914 | nan | 0.4914 | 0.0 | 0.4914 |
0.0117 | 37.5 | 1200 | 0.0207 | 0.2501 | 0.5003 | 0.5003 | nan | 0.5003 | 0.0 | 0.5003 |
0.0131 | 38.125 | 1220 | 0.0206 | 0.2701 | 0.5401 | 0.5401 | nan | 0.5401 | 0.0 | 0.5401 |
0.0216 | 38.75 | 1240 | 0.0207 | 0.2723 | 0.5446 | 0.5446 | nan | 0.5446 | 0.0 | 0.5446 |
0.0162 | 39.375 | 1260 | 0.0206 | 0.2534 | 0.5067 | 0.5067 | nan | 0.5067 | 0.0 | 0.5067 |
0.0237 | 40.0 | 1280 | 0.0206 | 0.2787 | 0.5573 | 0.5573 | nan | 0.5573 | 0.0 | 0.5573 |
0.0187 | 40.625 | 1300 | 0.0203 | 0.2649 | 0.5298 | 0.5298 | nan | 0.5298 | 0.0 | 0.5298 |
0.0136 | 41.25 | 1320 | 0.0203 | 0.2633 | 0.5266 | 0.5266 | nan | 0.5266 | 0.0 | 0.5266 |
0.015 | 41.875 | 1340 | 0.0204 | 0.2603 | 0.5207 | 0.5207 | nan | 0.5207 | 0.0 | 0.5207 |
0.0218 | 42.5 | 1360 | 0.0204 | 0.2685 | 0.5370 | 0.5370 | nan | 0.5370 | 0.0 | 0.5370 |
0.0236 | 43.125 | 1380 | 0.0201 | 0.2735 | 0.5471 | 0.5471 | nan | 0.5471 | 0.0 | 0.5471 |
0.0181 | 43.75 | 1400 | 0.0203 | 0.2609 | 0.5217 | 0.5217 | nan | 0.5217 | 0.0 | 0.5217 |
0.0103 | 44.375 | 1420 | 0.0201 | 0.2730 | 0.5460 | 0.5460 | nan | 0.5460 | 0.0 | 0.5460 |
0.0125 | 45.0 | 1440 | 0.0202 | 0.2651 | 0.5301 | 0.5301 | nan | 0.5301 | 0.0 | 0.5301 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0