# detr_finetuned_kitti_mots-noaug-good-1
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the KITTI MOTS dataset. It achieves the following results on the evaluation set:
- Loss: 1.3675
- Map: 0.2538
- Map 50: 0.5322
- Map 75: 0.2143
- Map Small: 0.0306
- Map Medium: 0.2748
- Map Large: 0.5189
- Mar 1: 0.1217
- Mar 10: 0.3156
- Mar 100: 0.392
- Mar Small: 0.1285
- Mar Medium: 0.4301
- Mar Large: 0.6443
- Map Pedestrian: 0.1681
- Mar 100 Pedestrian: 0.3351
- Map Ignore: -1.0
- Mar 100 Ignore: -1.0
- Map Car: 0.3395
- Mar 100 Car: 0.4489
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
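With `lr_scheduler_type: cosine`, the learning rate decays from 1e-4 toward zero along half a cosine over the 18,750 optimizer steps (625 steps/epoch for 30 epochs, matching the table below). A minimal sketch of that decay curve, assuming zero warmup steps (the log does not state a warmup setting):

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-4):
    """Half-cosine decay: base_lr at step 0, ~0 at total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 18750  # 625 steps/epoch * 30 epochs
print(cosine_lr(0, total))           # base rate, 1e-04
print(cosine_lr(total // 2, total))  # half the base rate at the midpoint
print(cosine_lr(total, total))       # ~0 at the end
```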
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Pedestrian | Mar 100 Pedestrian | Map Ignore | Mar 100 Ignore | Map Car | Mar 100 Car |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.177 | 1.0 | 625 | 1.4322 | 0.1364 | 0.3647 | 0.0647 | 0.0114 | 0.1339 | 0.3615 | 0.0821 | 0.2247 | 0.3215 | 0.0989 | 0.3222 | 0.607 | 0.0607 | 0.2792 | -1.0 | -1.0 | 0.212 | 0.3638 |
1.0939 | 2.0 | 1250 | 1.4303 | 0.1575 | 0.3775 | 0.1093 | 0.0207 | 0.1468 | 0.3678 | 0.0871 | 0.2277 | 0.3194 | 0.1119 | 0.34 | 0.5426 | 0.0689 | 0.2749 | -1.0 | -1.0 | 0.2462 | 0.3639 |
1.0674 | 3.0 | 1875 | 1.3514 | 0.1811 | 0.4065 | 0.1345 | 0.017 | 0.1875 | 0.4097 | 0.0991 | 0.2603 | 0.3511 | 0.1063 | 0.3717 | 0.62 | 0.0758 | 0.2779 | -1.0 | -1.0 | 0.2863 | 0.4244 |
0.9822 | 4.0 | 2500 | 1.3675 | 0.1708 | 0.4027 | 0.1232 | 0.0166 | 0.1821 | 0.3745 | 0.0962 | 0.2523 | 0.346 | 0.1167 | 0.3623 | 0.6005 | 0.0775 | 0.2844 | -1.0 | -1.0 | 0.2642 | 0.4077 |
0.9495 | 5.0 | 3125 | 1.4739 | 0.1647 | 0.397 | 0.0956 | 0.0217 | 0.1599 | 0.4014 | 0.0921 | 0.2457 | 0.3271 | 0.0961 | 0.3422 | 0.5966 | 0.0769 | 0.2502 | -1.0 | -1.0 | 0.2525 | 0.4041 |
0.9387 | 6.0 | 3750 | 1.3182 | 0.194 | 0.4233 | 0.1596 | 0.0171 | 0.213 | 0.3984 | 0.1086 | 0.2704 | 0.3659 | 0.1105 | 0.4008 | 0.6124 | 0.0902 | 0.3083 | -1.0 | -1.0 | 0.2979 | 0.4235 |
0.9332 | 7.0 | 4375 | 1.3399 | 0.2021 | 0.4429 | 0.1508 | 0.0154 | 0.2055 | 0.4387 | 0.1071 | 0.2667 | 0.3501 | 0.111 | 0.3784 | 0.5978 | 0.1076 | 0.2919 | -1.0 | -1.0 | 0.2966 | 0.4083 |
0.8834 | 8.0 | 5000 | 1.3403 | 0.21 | 0.4456 | 0.1741 | 0.0216 | 0.2185 | 0.4446 | 0.1094 | 0.2836 | 0.3747 | 0.117 | 0.4075 | 0.6311 | 0.1102 | 0.3106 | -1.0 | -1.0 | 0.3099 | 0.4388 |
0.8489 | 9.0 | 5625 | 1.3444 | 0.2083 | 0.4493 | 0.1755 | 0.0208 | 0.2189 | 0.4433 | 0.1068 | 0.2774 | 0.3699 | 0.1262 | 0.3958 | 0.6256 | 0.1028 | 0.2967 | -1.0 | -1.0 | 0.3139 | 0.4432 |
0.8368 | 10.0 | 6250 | 1.3675 | 0.2077 | 0.4556 | 0.165 | 0.0195 | 0.2149 | 0.4571 | 0.1063 | 0.2752 | 0.365 | 0.1145 | 0.3974 | 0.6167 | 0.1121 | 0.3125 | -1.0 | -1.0 | 0.3032 | 0.4175 |
0.8254 | 11.0 | 6875 | 1.3734 | 0.2043 | 0.4469 | 0.166 | 0.0163 | 0.2085 | 0.4471 | 0.1063 | 0.2736 | 0.3587 | 0.1119 | 0.3831 | 0.6211 | 0.12 | 0.3111 | -1.0 | -1.0 | 0.2887 | 0.4063 |
0.7777 | 12.0 | 7500 | 1.3421 | 0.2221 | 0.4765 | 0.1863 | 0.0251 | 0.2372 | 0.4422 | 0.112 | 0.2868 | 0.3661 | 0.114 | 0.4123 | 0.5855 | 0.1198 | 0.3029 | -1.0 | -1.0 | 0.3243 | 0.4294 |
0.7543 | 13.0 | 8125 | 1.3643 | 0.2068 | 0.4643 | 0.1552 | 0.0208 | 0.2189 | 0.4503 | 0.1048 | 0.277 | 0.364 | 0.1266 | 0.3929 | 0.6074 | 0.1117 | 0.2962 | -1.0 | -1.0 | 0.3019 | 0.4319 |
0.7411 | 14.0 | 8750 | 1.3261 | 0.2298 | 0.4905 | 0.1943 | 0.0264 | 0.2453 | 0.4729 | 0.1139 | 0.296 | 0.3803 | 0.1241 | 0.4168 | 0.6243 | 0.1263 | 0.3121 | -1.0 | -1.0 | 0.3332 | 0.4484 |
0.7089 | 15.0 | 9375 | 1.3016 | 0.2356 | 0.5073 | 0.1925 | 0.0311 | 0.2456 | 0.493 | 0.1143 | 0.3011 | 0.3897 | 0.134 | 0.4233 | 0.6404 | 0.1475 | 0.3352 | -1.0 | -1.0 | 0.3238 | 0.4441 |
0.6699 | 16.0 | 10000 | 1.3116 | 0.2401 | 0.4956 | 0.209 | 0.0259 | 0.2596 | 0.4876 | 0.1183 | 0.2998 | 0.3889 | 0.1338 | 0.4264 | 0.6333 | 0.1394 | 0.3276 | -1.0 | -1.0 | 0.3408 | 0.4502 |
0.6488 | 17.0 | 10625 | 1.3128 | 0.2485 | 0.5142 | 0.2173 | 0.0327 | 0.2681 | 0.5097 | 0.1204 | 0.3139 | 0.4007 | 0.1249 | 0.4425 | 0.6549 | 0.1575 | 0.3454 | -1.0 | -1.0 | 0.3395 | 0.4559 |
0.6409 | 18.0 | 11250 | 1.3760 | 0.2218 | 0.4971 | 0.1709 | 0.0263 | 0.2363 | 0.4789 | 0.1099 | 0.2864 | 0.3592 | 0.1296 | 0.389 | 0.5936 | 0.1266 | 0.288 | -1.0 | -1.0 | 0.317 | 0.4305 |
0.6173 | 19.0 | 11875 | 1.3561 | 0.2387 | 0.518 | 0.1908 | 0.0276 | 0.2562 | 0.5002 | 0.1158 | 0.3049 | 0.383 | 0.1177 | 0.4209 | 0.6384 | 0.1521 | 0.3233 | -1.0 | -1.0 | 0.3253 | 0.4427 |
0.5787 | 20.0 | 12500 | 1.3105 | 0.2557 | 0.5258 | 0.2245 | 0.0292 | 0.2747 | 0.5163 | 0.1207 | 0.3192 | 0.4005 | 0.1284 | 0.44 | 0.6581 | 0.1664 | 0.3476 | -1.0 | -1.0 | 0.345 | 0.4534 |
0.5574 | 21.0 | 13125 | 1.3450 | 0.2512 | 0.5275 | 0.2095 | 0.0285 | 0.2725 | 0.5134 | 0.121 | 0.3122 | 0.3899 | 0.1277 | 0.4264 | 0.6434 | 0.1636 | 0.328 | -1.0 | -1.0 | 0.3389 | 0.4518 |
0.5452 | 22.0 | 13750 | 1.3460 | 0.2546 | 0.527 | 0.2155 | 0.0313 | 0.273 | 0.5167 | 0.1231 | 0.3163 | 0.3955 | 0.1306 | 0.4367 | 0.6411 | 0.1621 | 0.335 | -1.0 | -1.0 | 0.3471 | 0.456 |
0.5285 | 23.0 | 14375 | 1.3530 | 0.2474 | 0.5259 | 0.1986 | 0.0303 | 0.2657 | 0.5171 | 0.119 | 0.3111 | 0.39 | 0.1297 | 0.4273 | 0.6407 | 0.166 | 0.3381 | -1.0 | -1.0 | 0.3289 | 0.442 |
0.5034 | 24.0 | 15000 | 1.3436 | 0.2531 | 0.5296 | 0.2141 | 0.0334 | 0.2758 | 0.5099 | 0.1214 | 0.3166 | 0.395 | 0.1308 | 0.4372 | 0.6379 | 0.1635 | 0.3396 | -1.0 | -1.0 | 0.3428 | 0.4504 |
0.4929 | 25.0 | 15625 | 1.3706 | 0.251 | 0.5315 | 0.2048 | 0.029 | 0.2698 | 0.5149 | 0.1206 | 0.3141 | 0.3897 | 0.1255 | 0.4274 | 0.6432 | 0.1679 | 0.3355 | -1.0 | -1.0 | 0.3341 | 0.4439 |
0.479 | 26.0 | 16250 | 1.3653 | 0.2509 | 0.5301 | 0.2096 | 0.0316 | 0.272 | 0.5133 | 0.1205 | 0.3142 | 0.3897 | 0.1268 | 0.4294 | 0.6379 | 0.1653 | 0.3321 | -1.0 | -1.0 | 0.3365 | 0.4473 |
0.4751 | 27.0 | 16875 | 1.3693 | 0.2527 | 0.5319 | 0.2119 | 0.0323 | 0.274 | 0.5166 | 0.1216 | 0.315 | 0.3908 | 0.1289 | 0.4289 | 0.6412 | 0.1657 | 0.3324 | -1.0 | -1.0 | 0.3396 | 0.4491 |
0.4595 | 28.0 | 17500 | 1.3686 | 0.2547 | 0.5322 | 0.2184 | 0.0315 | 0.2762 | 0.5183 | 0.1221 | 0.316 | 0.3923 | 0.1285 | 0.4312 | 0.6429 | 0.168 | 0.3338 | -1.0 | -1.0 | 0.3413 | 0.4507 |
0.4589 | 29.0 | 18125 | 1.3683 | 0.2541 | 0.5323 | 0.2133 | 0.0305 | 0.2745 | 0.519 | 0.1218 | 0.3155 | 0.3918 | 0.1283 | 0.4297 | 0.6447 | 0.1683 | 0.3353 | -1.0 | -1.0 | 0.3399 | 0.4483 |
0.4555 | 30.0 | 18750 | 1.3675 | 0.2538 | 0.5322 | 0.2143 | 0.0306 | 0.2748 | 0.5189 | 0.1217 | 0.3156 | 0.392 | 0.1285 | 0.4301 | 0.6443 | 0.1681 | 0.3351 | -1.0 | -1.0 | 0.3395 | 0.4489 |
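Validation mAP peaks at epoch 20 (0.2557) and plateaus afterwards while the training loss keeps falling, so the final checkpoint is essentially tied with the best one. A minimal sketch of picking the best epoch from the table above (abbreviated to a few rows for illustration):

```python
# (epoch, validation mAP) pairs taken from the training-results table above
history = [
    (1, 0.1364), (10, 0.2077), (15, 0.2356),
    (20, 0.2557), (25, 0.2510), (30, 0.2538),
]

best_epoch, best_map = max(history, key=lambda row: row[1])
print(best_epoch, best_map)  # 20 0.2557
```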
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.2
- Tokenizers 0.21.0
## Model tree for toukapy/detr_finetuned_kitti_mots-noaug-good-1
Base model: microsoft/conditional-detr-resnet-50