# mask2former-finetuned-ER-Mito-LD
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the [Dnq2025/Mask2former_Pretrain](https://huggingface.co/datasets/Dnq2025/Mask2former_Pretrain) dataset. It achieves the following results on the evaluation set:
- Loss: 34.0481
- Dummy: 1.0
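As a minimal usage sketch: the snippet below loads the checkpoint for semantic segmentation with the standard Mask2Former classes from transformers. The repo id is an assumption inferred from the card title, and the input image path is a placeholder; swap in the actual values.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

# Hypothetical repo id inferred from the card title; replace with the real one.
checkpoint = "Dnq2025/mask2former-finetuned-ER-Mito-LD"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example_em_slice.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Post-process into a (height, width) tensor of per-pixel class ids.
# image.size is (width, height), so reverse it for target_sizes.
segmentation = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
print(segmentation.shape, segmentation.unique())
```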
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
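For reference, here is a minimal sketch of how these values map onto transformers' `TrainingArguments`. The `output_dir` is a placeholder and the per-epoch evaluation cadence is an assumption inferred from the results table (86 steps per epoch); the beta/epsilon values are the `adamw_torch` defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mask2former-finetuned-ER-Mito-LD",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=1337,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="polynomial",
    max_steps=12900,
    eval_strategy="epoch",           # assumed: the card reports eval loss once per epoch
)
```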
### Training results
| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:-----:|:----:|:---------------:|:-----:|
No log | 1.0 | 86 | 34.1100 | 1.0 |
43.7572 | 2.0 | 172 | 30.1643 | 1.0 |
32.0025 | 3.0 | 258 | 28.1818 | 1.0 |
26.8817 | 4.0 | 344 | 27.1780 | 1.0 |
24.1857 | 5.0 | 430 | 26.5827 | 1.0 |
22.9335 | 6.0 | 516 | 25.7310 | 1.0 |
21.3521 | 7.0 | 602 | 25.2442 | 1.0 |
21.3521 | 8.0 | 688 | 25.0511 | 1.0 |
20.4144 | 9.0 | 774 | 25.3838 | 1.0 |
18.8722 | 10.0 | 860 | 25.7261 | 1.0 |
18.576 | 11.0 | 946 | 25.0048 | 1.0 |
18.1119 | 12.0 | 1032 | 25.3630 | 1.0 |
17.8769 | 13.0 | 1118 | 25.2566 | 1.0 |
17.0204 | 14.0 | 1204 | 25.6023 | 1.0 |
17.0204 | 15.0 | 1290 | 26.3285 | 1.0 |
16.3528 | 16.0 | 1376 | 26.3254 | 1.0 |
16.5548 | 17.0 | 1462 | 26.9244 | 1.0 |
16.6848 | 18.0 | 1548 | 27.6294 | 1.0 |
15.4544 | 19.0 | 1634 | 25.7570 | 1.0 |
15.7209 | 20.0 | 1720 | 25.7097 | 1.0 |
15.3127 | 21.0 | 1806 | 27.2604 | 1.0 |
15.3127 | 22.0 | 1892 | 26.4286 | 1.0 |
14.9528 | 23.0 | 1978 | 27.5768 | 1.0 |
15.1795 | 24.0 | 2064 | 26.4714 | 1.0 |
14.707 | 25.0 | 2150 | 28.0977 | 1.0 |
14.3456 | 26.0 | 2236 | 26.7914 | 1.0 |
14.4534 | 27.0 | 2322 | 27.4079 | 1.0 |
14.4448 | 28.0 | 2408 | 26.8291 | 1.0 |
14.4448 | 29.0 | 2494 | 27.1506 | 1.0 |
14.0327 | 30.0 | 2580 | 27.1973 | 1.0 |
13.8785 | 31.0 | 2666 | 27.5062 | 1.0 |
14.3373 | 32.0 | 2752 | 27.9510 | 1.0 |
13.3176 | 33.0 | 2838 | 27.1878 | 1.0 |
13.8154 | 34.0 | 2924 | 25.5759 | 1.0 |
13.8962 | 35.0 | 3010 | 27.7627 | 1.0 |
13.8962 | 36.0 | 3096 | 28.8061 | 1.0 |
13.3858 | 37.0 | 3182 | 28.4328 | 1.0 |
12.9659 | 38.0 | 3268 | 27.5515 | 1.0 |
13.6813 | 39.0 | 3354 | 27.8206 | 1.0 |
13.3049 | 40.0 | 3440 | 28.6062 | 1.0 |
13.1584 | 41.0 | 3526 | 28.5364 | 1.0 |
12.9234 | 42.0 | 3612 | 29.3165 | 1.0 |
12.9234 | 43.0 | 3698 | 28.5156 | 1.0 |
13.1375 | 44.0 | 3784 | 28.2476 | 1.0 |
12.7875 | 45.0 | 3870 | 29.9959 | 1.0 |
12.6507 | 46.0 | 3956 | 28.5480 | 1.0 |
13.0131 | 47.0 | 4042 | 29.1117 | 1.0 |
12.3806 | 48.0 | 4128 | 31.2153 | 1.0 |
12.9016 | 49.0 | 4214 | 28.9405 | 1.0 |
12.274 | 50.0 | 4300 | 28.7396 | 1.0 |
12.274 | 51.0 | 4386 | 30.3948 | 1.0 |
12.5767 | 52.0 | 4472 | 29.3863 | 1.0 |
12.5965 | 53.0 | 4558 | 29.4516 | 1.0 |
11.9685 | 54.0 | 4644 | 27.2974 | 1.0 |
12.3025 | 55.0 | 4730 | 27.0013 | 1.0 |
12.4256 | 56.0 | 4816 | 27.2713 | 1.0 |
12.2008 | 57.0 | 4902 | 27.4054 | 1.0 |
12.2008 | 58.0 | 4988 | 27.9546 | 1.0 |
12.1018 | 59.0 | 5074 | 28.9453 | 1.0 |
12.2156 | 60.0 | 5160 | 29.3121 | 1.0 |
11.9526 | 61.0 | 5246 | 30.1903 | 1.0 |
12.1103 | 62.0 | 5332 | 28.8276 | 1.0 |
11.8017 | 63.0 | 5418 | 28.7898 | 1.0 |
11.9907 | 64.0 | 5504 | 28.6167 | 1.0 |
11.9907 | 65.0 | 5590 | 29.2822 | 1.0 |
11.6683 | 66.0 | 5676 | 31.4695 | 1.0 |
12.1544 | 67.0 | 5762 | 27.7773 | 1.0 |
11.7442 | 68.0 | 5848 | 29.5376 | 1.0 |
11.1493 | 69.0 | 5934 | 27.8916 | 1.0 |
12.0781 | 70.0 | 6020 | 28.4096 | 1.0 |
11.8055 | 71.0 | 6106 | 29.2272 | 1.0 |
11.8055 | 72.0 | 6192 | 29.2769 | 1.0 |
11.4811 | 73.0 | 6278 | 29.2552 | 1.0 |
11.5947 | 74.0 | 6364 | 29.2611 | 1.0 |
11.7263 | 75.0 | 6450 | 30.7953 | 1.0 |
11.7399 | 76.0 | 6536 | 30.0692 | 1.0 |
11.0851 | 77.0 | 6622 | 29.6803 | 1.0 |
11.5118 | 78.0 | 6708 | 30.7345 | 1.0 |
11.5118 | 79.0 | 6794 | 31.5980 | 1.0 |
11.516 | 80.0 | 6880 | 30.5279 | 1.0 |
11.3797 | 81.0 | 6966 | 30.2265 | 1.0 |
11.3335 | 82.0 | 7052 | 30.3816 | 1.0 |
11.2303 | 83.0 | 7138 | 29.3238 | 1.0 |
11.1964 | 84.0 | 7224 | 30.3987 | 1.0 |
11.321 | 85.0 | 7310 | 30.1935 | 1.0 |
11.321 | 86.0 | 7396 | 29.1421 | 1.0 |
11.3891 | 87.0 | 7482 | 31.2074 | 1.0 |
11.1347 | 88.0 | 7568 | 30.6735 | 1.0 |
11.1945 | 89.0 | 7654 | 31.2053 | 1.0 |
10.9891 | 90.0 | 7740 | 31.4373 | 1.0 |
11.104 | 91.0 | 7826 | 31.3946 | 1.0 |
11.1408 | 92.0 | 7912 | 31.2186 | 1.0 |
11.1408 | 93.0 | 7998 | 29.5871 | 1.0 |
11.0779 | 94.0 | 8084 | 30.4671 | 1.0 |
11.0551 | 95.0 | 8170 | 32.0130 | 1.0 |
10.8809 | 96.0 | 8256 | 30.4459 | 1.0 |
11.1123 | 97.0 | 8342 | 30.8415 | 1.0 |
10.7116 | 98.0 | 8428 | 31.0445 | 1.0 |
11.0086 | 99.0 | 8514 | 31.0471 | 1.0 |
11.0542 | 100.0 | 8600 | 31.0217 | 1.0 |
11.0542 | 101.0 | 8686 | 31.7885 | 1.0 |
10.8332 | 102.0 | 8772 | 30.6191 | 1.0 |
10.8696 | 103.0 | 8858 | 31.2075 | 1.0 |
10.6959 | 104.0 | 8944 | 32.0795 | 1.0 |
11.0688 | 105.0 | 9030 | 33.7820 | 1.0 |
10.6762 | 106.0 | 9116 | 31.9403 | 1.0 |
10.8607 | 107.0 | 9202 | 33.1345 | 1.0 |
10.8607 | 108.0 | 9288 | 31.0811 | 1.0 |
10.7504 | 109.0 | 9374 | 31.0663 | 1.0 |
10.7841 | 110.0 | 9460 | 30.0841 | 1.0 |
10.5677 | 111.0 | 9546 | 30.8185 | 1.0 |
11.0266 | 112.0 | 9632 | 32.1549 | 1.0 |
10.5912 | 113.0 | 9718 | 32.2208 | 1.0 |
10.6698 | 114.0 | 9804 | 31.5337 | 1.0 |
10.6698 | 115.0 | 9890 | 32.2273 | 1.0 |
10.6857 | 116.0 | 9976 | 31.8648 | 1.0 |
10.5977 | 117.0 | 10062 | 31.8058 | 1.0 |
10.6883 | 118.0 | 10148 | 31.7254 | 1.0 |
10.3506 | 119.0 | 10234 | 33.0298 | 1.0 |
10.9217 | 120.0 | 10320 | 33.3403 | 1.0 |
10.5332 | 121.0 | 10406 | 32.5384 | 1.0 |
10.5332 | 122.0 | 10492 | 32.2192 | 1.0 |
10.4658 | 123.0 | 10578 | 32.8913 | 1.0 |
10.4877 | 124.0 | 10664 | 33.1068 | 1.0 |
10.7404 | 125.0 | 10750 | 34.1187 | 1.0 |
10.2195 | 126.0 | 10836 | 32.4418 | 1.0 |
10.7622 | 127.0 | 10922 | 32.2935 | 1.0 |
10.4301 | 128.0 | 11008 | 33.2411 | 1.0 |
10.4301 | 129.0 | 11094 | 32.3692 | 1.0 |
10.6464 | 130.0 | 11180 | 32.6297 | 1.0 |
10.4213 | 131.0 | 11266 | 33.7513 | 1.0 |
10.382 | 132.0 | 11352 | 32.6382 | 1.0 |
10.6049 | 133.0 | 11438 | 33.2621 | 1.0 |
10.3039 | 134.0 | 11524 | 32.9468 | 1.0 |
10.3088 | 135.0 | 11610 | 33.4821 | 1.0 |
10.3088 | 136.0 | 11696 | 33.4824 | 1.0 |
10.4832 | 137.0 | 11782 | 32.9320 | 1.0 |
10.4149 | 138.0 | 11868 | 33.8853 | 1.0 |
10.2473 | 139.0 | 11954 | 33.5977 | 1.0 |
10.7137 | 140.0 | 12040 | 34.1817 | 1.0 |
10.2686 | 141.0 | 12126 | 34.0892 | 1.0 |
10.2581 | 142.0 | 12212 | 34.1113 | 1.0 |
10.2581 | 143.0 | 12298 | 33.9106 | 1.0 |
10.447 | 144.0 | 12384 | 33.3470 | 1.0 |
10.3823 | 145.0 | 12470 | 33.3055 | 1.0 |
10.1283 | 146.0 | 12556 | 33.6762 | 1.0 |
10.5364 | 147.0 | 12642 | 33.9977 | 1.0 |
10.1257 | 148.0 | 12728 | 34.0327 | 1.0 |
10.3092 | 149.0 | 12814 | 34.1170 | 1.0 |
10.4947 | 150.0 | 12900 | 33.8776 | 1.0 |
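Note that the validation loss reaches its minimum around epoch 11 (25.0048) and drifts upward over the remaining epochs, so the final checkpoint is not the best one by `eval_loss`. If retraining, one option is transformers' `EarlyStoppingCallback`; the sketch below assumes the `training_args` object from the hyperparameters section and is illustrative only.

```python
from transformers import EarlyStoppingCallback

# Keep the best checkpoint and stop once eval_loss hasn't improved for 10 evals.
# (These could equally be passed to the TrainingArguments constructor.)
training_args.load_best_model_at_end = True
training_args.metric_for_best_model = "eval_loss"
training_args.greater_is_better = False
training_args.save_strategy = "epoch"  # checkpointing must align with eval_strategy

early_stopping = EarlyStoppingCallback(early_stopping_patience=10)
# Pass `callbacks=[early_stopping]` when constructing the Trainer.
```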
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0