# segformer-b2-finetuned-segments-chargers-full-v2.1
This model is a fine-tuned version of [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2) on the `dskong07/chargers-full-v0.1` dataset. It achieves the following results on the evaluation set:
- Loss: 0.3562
- Mean Iou: 0.7932
- Mean Accuracy: 0.8741
- Overall Accuracy: 0.9202
- Accuracy Unlabeled: nan
- Accuracy Screen: 0.8965
- Accuracy Body: 0.9162
- Accuracy Cable: 0.7137
- Accuracy Plug: 0.9024
- Accuracy Void-background: 0.9416
- Iou Unlabeled: nan
- Iou Screen: 0.7980
- Iou Body: 0.7952
- Iou Cable: 0.6210
- Iou Plug: 0.8498
- Iou Void-background: 0.9017
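The mean metrics above are unweighted averages over the five labeled classes (the unlabeled class, reported as `nan`, is excluded). A quick sanity check in plain Python; note the per-class values are rounded to four decimals, so the recomputed means agree with the reported ones only to about three decimals:

```python
# Per-class validation metrics from the final evaluation (unlabeled excluded).
accuracy = {"screen": 0.8965, "body": 0.9162, "cable": 0.7137,
            "plug": 0.9024, "void-background": 0.9416}
iou = {"screen": 0.7980, "body": 0.7952, "cable": 0.6210,
       "plug": 0.8498, "void-background": 0.9017}

mean_accuracy = sum(accuracy.values()) / len(accuracy)
mean_iou = sum(iou.values()) / len(iou)

print(f"mean accuracy: {mean_accuracy:.4f}")  # 0.8741, matching the reported value
print(f"mean IoU: {mean_iou:.4f}")            # ~0.7931 vs. the reported 0.7932 (rounding)
```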
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
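With a linear scheduler, the learning rate decays from 6e-05 toward zero over training (assuming no warmup; the card does not list warmup steps). The step counts in the results table imply 9 optimizer steps per epoch (20 steps ≈ 2.22 epochs), i.e. about 450 steps over 50 epochs. A minimal sketch of that schedule:

```python
LEARNING_RATE = 6e-05
TOTAL_STEPS = 450  # inferred from the log: 20 steps ~ 2.22 epochs => 9 steps/epoch * 50 epochs

def linear_lr(step: int) -> float:
    """Linearly decayed learning rate with no warmup (an assumption)."""
    return LEARNING_RATE * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))    # 6e-05 at the start of training
print(linear_lr(225))  # 3e-05 halfway through
print(linear_lr(450))  # 0.0 at the end
```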
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Screen | Accuracy Body | Accuracy Cable | Accuracy Plug | Accuracy Void-background | Iou Unlabeled | Iou Screen | Iou Body | Iou Cable | Iou Plug | Iou Void-background |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.791 | 2.2222 | 20 | 0.9137 | 0.6056 | 0.7582 | 0.8259 | nan | 0.7897 | 0.9011 | 0.3986 | 0.8781 | 0.8237 | nan | 0.5894 | 0.6325 | 0.3127 | 0.7018 | 0.7917 |
0.5745 | 4.4444 | 40 | 0.5314 | 0.6800 | 0.7953 | 0.8728 | nan | 0.8035 | 0.8500 | 0.5621 | 0.8461 | 0.9150 | nan | 0.6561 | 0.7012 | 0.4305 | 0.7589 | 0.8532 |
0.299 | 6.6667 | 60 | 0.5239 | 0.7206 | 0.8270 | 0.8901 | nan | 0.8754 | 0.8941 | 0.5805 | 0.8698 | 0.9153 | nan | 0.7133 | 0.7331 | 0.5038 | 0.7821 | 0.8707 |
0.2347 | 8.8889 | 80 | 0.4256 | 0.7400 | 0.8361 | 0.8968 | nan | 0.8764 | 0.8594 | 0.6402 | 0.8687 | 0.9361 | nan | 0.7360 | 0.7410 | 0.5458 | 0.8023 | 0.8750 |
0.2687 | 11.1111 | 100 | 0.4235 | 0.7527 | 0.8554 | 0.9026 | nan | 0.9298 | 0.8578 | 0.6598 | 0.8930 | 0.9367 | nan | 0.7357 | 0.7511 | 0.5652 | 0.8284 | 0.8833 |
0.1953 | 13.3333 | 120 | 0.4096 | 0.7602 | 0.8623 | 0.9033 | nan | 0.8765 | 0.8807 | 0.7199 | 0.9064 | 0.9280 | nan | 0.7666 | 0.7590 | 0.5933 | 0.8012 | 0.8808 |
0.1998 | 15.5556 | 140 | 0.3897 | 0.7644 | 0.8591 | 0.9076 | nan | 0.8287 | 0.9400 | 0.6962 | 0.9132 | 0.9173 | nan | 0.7548 | 0.7751 | 0.5922 | 0.8125 | 0.8874 |
0.1636 | 17.7778 | 160 | 0.3818 | 0.7810 | 0.8675 | 0.9140 | nan | 0.8802 | 0.9041 | 0.7048 | 0.9117 | 0.9368 | nan | 0.7889 | 0.7802 | 0.6022 | 0.8398 | 0.8939 |
0.1554 | 20.0 | 180 | 0.3784 | 0.7718 | 0.8826 | 0.9119 | nan | 0.9614 | 0.8719 | 0.7156 | 0.9292 | 0.9348 | nan | 0.7299 | 0.7721 | 0.6117 | 0.8482 | 0.8972 |
0.1187 | 22.2222 | 200 | 0.3712 | 0.7851 | 0.8735 | 0.9159 | nan | 0.8778 | 0.9223 | 0.7164 | 0.9201 | 0.9310 | nan | 0.7982 | 0.7885 | 0.6086 | 0.8347 | 0.8957 |
0.1217 | 24.4444 | 220 | 0.3630 | 0.7898 | 0.8841 | 0.9174 | nan | 0.9005 | 0.9160 | 0.7352 | 0.9387 | 0.9301 | nan | 0.8053 | 0.7934 | 0.6209 | 0.8326 | 0.8969 |
0.092 | 26.6667 | 240 | 0.3711 | 0.7882 | 0.8801 | 0.9184 | nan | 0.9210 | 0.9023 | 0.7309 | 0.9067 | 0.9398 | nan | 0.7766 | 0.7899 | 0.6158 | 0.8574 | 0.9013 |
0.0906 | 28.8889 | 260 | 0.3732 | 0.7907 | 0.8798 | 0.9189 | nan | 0.8996 | 0.9158 | 0.7235 | 0.9247 | 0.9354 | nan | 0.7915 | 0.7934 | 0.6219 | 0.8466 | 0.9002 |
0.1034 | 31.1111 | 280 | 0.3899 | 0.7924 | 0.8801 | 0.9198 | nan | 0.9053 | 0.9124 | 0.7297 | 0.9141 | 0.9389 | nan | 0.7878 | 0.7927 | 0.6203 | 0.8586 | 0.9025 |
0.084 | 33.3333 | 300 | 0.3613 | 0.7925 | 0.8792 | 0.9200 | nan | 0.9141 | 0.9073 | 0.7117 | 0.9221 | 0.9406 | nan | 0.7925 | 0.7934 | 0.6163 | 0.8584 | 0.9020 |
0.1075 | 35.5556 | 320 | 0.3712 | 0.7901 | 0.8808 | 0.9193 | nan | 0.9249 | 0.9068 | 0.7126 | 0.9216 | 0.9384 | nan | 0.7866 | 0.7938 | 0.6175 | 0.8513 | 0.9013 |
0.1099 | 37.7778 | 340 | 0.4036 | 0.7934 | 0.8823 | 0.9203 | nan | 0.9055 | 0.9211 | 0.7249 | 0.9251 | 0.9350 | nan | 0.7972 | 0.7982 | 0.6235 | 0.8465 | 0.9018 |
0.2037 | 40.0 | 360 | 0.3782 | 0.7894 | 0.8719 | 0.9188 | nan | 0.8953 | 0.9155 | 0.6951 | 0.9145 | 0.9393 | nan | 0.7988 | 0.7936 | 0.6117 | 0.8433 | 0.8997 |
0.103 | 42.2222 | 380 | 0.3521 | 0.7921 | 0.8788 | 0.9199 | nan | 0.8977 | 0.9211 | 0.7155 | 0.9234 | 0.9361 | nan | 0.7992 | 0.7983 | 0.6195 | 0.8428 | 0.9010 |
0.0741 | 44.4444 | 400 | 0.3763 | 0.7922 | 0.8818 | 0.9199 | nan | 0.9081 | 0.9241 | 0.7135 | 0.9304 | 0.9330 | nan | 0.7982 | 0.7992 | 0.6216 | 0.8414 | 0.9008 |
0.0831 | 46.6667 | 420 | 0.3501 | 0.7921 | 0.8748 | 0.9199 | nan | 0.8859 | 0.9229 | 0.7177 | 0.9094 | 0.9382 | nan | 0.7934 | 0.7957 | 0.6227 | 0.8467 | 0.9018 |
0.079 | 48.8889 | 440 | 0.3562 | 0.7932 | 0.8741 | 0.9202 | nan | 0.8965 | 0.9162 | 0.7137 | 0.9024 | 0.9416 | nan | 0.7980 | 0.7952 | 0.6210 | 0.8498 | 0.9017 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
## Model tree for irvingz/segformer-b2-finetuned-segments-chargers-full-v2.1

Base model: [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2)
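To turn the model's output into a segmentation map, note that SegFormer predicts class logits at 1/4 of the input resolution; they are upsampled to the image size and argmaxed per pixel. A dependency-free sketch of that post-processing using nearest-neighbor upsampling for simplicity (the Transformers image processor uses bilinear interpolation, and the helper below is illustrative, not the library API):

```python
def postprocess(logits, out_h, out_w):
    """Upsample per-class logit grids to (out_h, out_w) and take the
    per-pixel argmax. logits: list of num_classes h-by-w nested lists."""
    num_classes = len(logits)
    h, w = len(logits[0]), len(logits[0][0])
    mask = []
    for i in range(out_h):
        si = i * h // out_h  # nearest source row
        row = []
        for j in range(out_w):
            sj = j * w // out_w  # nearest source column
            scores = [logits[c][si][sj] for c in range(num_classes)]
            row.append(max(range(num_classes), key=lambda c: scores[c]))
        mask.append(row)
    return mask

# Toy example: 2 classes on a 2x2 logit grid, upsampled to 4x4.
logits = [[[1.0, 0.0], [0.0, 1.0]],   # class 0 strong top-left / bottom-right
          [[0.0, 1.0], [1.0, 0.0]]]   # class 1 strong elsewhere
mask = postprocess(logits, 4, 4)
print(len(mask), len(mask[0]))  # 4 4
```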