HDCSHseg_models

This model is a fine-tuned version of nvidia/mit-b0 on the TommyClas/HDCSH_seg dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5558
  • Mean Iou: 0.3834
  • Mean Accuracy: 0.7668
  • Overall Accuracy: 0.7668
  • Accuracy 背景 (background): nan
  • Accuracy 未水化水泥颗粒与高密度c-s-h混合 (unhydrated cement particles mixed with high-density C-S-H): 0.7668
  • Iou 背景 (background): 0.0
  • Iou 未水化水泥颗粒与高密度c-s-h混合 (unhydrated cement particles mixed with high-density C-S-H): 0.7668
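
The nan accuracy and 0.0 IoU for the 背景 (background) class most likely mean the evaluation masks contain no background pixels: per-class accuracy is then 0/0 (nan), while any stray background predictions drive that class's IoU to 0.0, so the mean IoU comes out at roughly half the foreground IoU. A minimal sketch with the `evaluate` library's `mean_iou` metric reproduces this pattern (the 2×2 masks below are hypothetical, not taken from the dataset):

```python
import numpy as np
import evaluate  # pip install evaluate

metric = evaluate.load("mean_iou")

pred = np.array([[1, 1], [0, 1]])  # predicted labels: one stray background (0) pixel
ref = np.array([[1, 1], [1, 1]])   # reference labels: foreground (1) only, no background

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,
)
# Background never appears in the reference: accuracy 0/0 -> nan, IoU 0/1 -> 0.0.
print(results["per_category_accuracy"])  # [nan, 0.75]
print(results["per_category_iou"])       # [0.0, 0.75]
print(results["mean_iou"])               # 0.375 (~ mean accuracy / 2)
```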

Model description

More information needed

Intended uses & limitations

More information needed
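
Pending fuller documentation, here is a minimal inference sketch. It assumes the checkpoint loads with the standard SegFormer classes and ships the usual preprocessor config; the input file name is hypothetical:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo = "TommyClas/HDCSHseg_models"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.open("micrograph.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (height, width) label map over the two classes
```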

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: polynomial
  • training_steps: 10000
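
For reproducibility, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a sketch assuming the standard `Trainer` workflow, not the author's exact script; the output directory is hypothetical:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="HDCSHseg_models",   # hypothetical; not stated in the card
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",            # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=10000,                # training_steps above
)
```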

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 背景 (background) | Accuracy 未水化水泥颗粒与高密度c-s-h混合 (unhydrated cement + high-density C-S-H) | Iou 背景 (background) | Iou 未水化水泥颗粒与高密度c-s-h混合 (unhydrated cement + high-density C-S-H) |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.4717 | 1.0 | 100 | 0.5748 | 0.3844 | 0.7688 | 0.7688 | nan | 0.7688 | 0.0 | 0.7688 |
| 0.4353 | 2.0 | 200 | 0.5744 | 0.3665 | 0.7330 | 0.7330 | nan | 0.7330 | 0.0 | 0.7330 |
| 0.4298 | 3.0 | 300 | 0.5576 | 0.4389 | 0.8778 | 0.8778 | nan | 0.8778 | 0.0 | 0.8778 |
| 0.4229 | 4.0 | 400 | 0.5609 | 0.3817 | 0.7635 | 0.7635 | nan | 0.7635 | 0.0 | 0.7635 |
| 0.4175 | 5.0 | 500 | 0.5680 | 0.4079 | 0.8158 | 0.8158 | nan | 0.8158 | 0.0 | 0.8158 |
| 0.4081 | 6.0 | 600 | 0.5563 | 0.3917 | 0.7835 | 0.7835 | nan | 0.7835 | 0.0 | 0.7835 |
| 0.4002 | 7.0 | 700 | 0.6005 | 0.3144 | 0.6287 | 0.6287 | nan | 0.6287 | 0.0 | 0.6287 |
| 0.3904 | 8.0 | 800 | 0.5915 | 0.3378 | 0.6757 | 0.6757 | nan | 0.6757 | 0.0 | 0.6757 |
| 0.3944 | 9.0 | 900 | 0.5991 | 0.3534 | 0.7069 | 0.7069 | nan | 0.7069 | 0.0 | 0.7069 |
| 0.3805 | 10.0 | 1000 | 0.6093 | 0.3250 | 0.6500 | 0.6500 | nan | 0.6500 | 0.0 | 0.6500 |
| 0.3813 | 11.0 | 1100 | 0.5965 | 0.3201 | 0.6401 | 0.6401 | nan | 0.6401 | 0.0 | 0.6401 |
| 0.3685 | 12.0 | 1200 | 0.6125 | 0.3076 | 0.6151 | 0.6151 | nan | 0.6151 | 0.0 | 0.6151 |
| 0.37 | 13.0 | 1300 | 0.5745 | 0.4050 | 0.8099 | 0.8099 | nan | 0.8099 | 0.0 | 0.8099 |
| 0.3645 | 14.0 | 1400 | 0.5719 | 0.3685 | 0.7369 | 0.7369 | nan | 0.7369 | 0.0 | 0.7369 |
| 0.3643 | 15.0 | 1500 | 0.6316 | 0.2919 | 0.5839 | 0.5839 | nan | 0.5839 | 0.0 | 0.5839 |
| 0.3591 | 16.0 | 1600 | 0.6657 | 0.2730 | 0.5460 | 0.5460 | nan | 0.5460 | 0.0 | 0.5460 |
| 0.3572 | 17.0 | 1700 | 0.5844 | 0.3909 | 0.7817 | 0.7817 | nan | 0.7817 | 0.0 | 0.7817 |
| 0.361 | 18.0 | 1800 | 0.6129 | 0.3164 | 0.6328 | 0.6328 | nan | 0.6328 | 0.0 | 0.6328 |
| 0.3571 | 19.0 | 1900 | 0.5721 | 0.3744 | 0.7488 | 0.7488 | nan | 0.7488 | 0.0 | 0.7488 |
| 0.3571 | 20.0 | 2000 | 0.5961 | 0.3331 | 0.6662 | 0.6662 | nan | 0.6662 | 0.0 | 0.6662 |
| 0.356 | 21.0 | 2100 | 0.6015 | 0.3264 | 0.6529 | 0.6529 | nan | 0.6529 | 0.0 | 0.6529 |
| 0.3535 | 22.0 | 2200 | 0.5709 | 0.3636 | 0.7272 | 0.7272 | nan | 0.7272 | 0.0 | 0.7272 |
| 0.3511 | 23.0 | 2300 | 0.5912 | 0.3393 | 0.6786 | 0.6786 | nan | 0.6786 | 0.0 | 0.6786 |
| 0.3512 | 24.0 | 2400 | 0.5624 | 0.3725 | 0.7451 | 0.7451 | nan | 0.7451 | 0.0 | 0.7451 |
| 0.352 | 25.0 | 2500 | 0.5981 | 0.3490 | 0.6981 | 0.6981 | nan | 0.6981 | 0.0 | 0.6981 |
| 0.3523 | 26.0 | 2600 | 0.6001 | 0.3504 | 0.7008 | 0.7008 | nan | 0.7008 | 0.0 | 0.7008 |
| 0.3485 | 27.0 | 2700 | 0.5707 | 0.3596 | 0.7192 | 0.7192 | nan | 0.7192 | 0.0 | 0.7192 |
| 0.3499 | 28.0 | 2800 | 0.5805 | 0.3538 | 0.7076 | 0.7076 | nan | 0.7076 | 0.0 | 0.7076 |
| 0.3486 | 29.0 | 2900 | 0.5713 | 0.3630 | 0.7261 | 0.7261 | nan | 0.7261 | 0.0 | 0.7261 |
| 0.3494 | 30.0 | 3000 | 0.5824 | 0.3648 | 0.7295 | 0.7295 | nan | 0.7295 | 0.0 | 0.7295 |
| 0.348 | 31.0 | 3100 | 0.5707 | 0.3538 | 0.7076 | 0.7076 | nan | 0.7076 | 0.0 | 0.7076 |
| 0.3465 | 32.0 | 3200 | 0.5624 | 0.3765 | 0.7530 | 0.7530 | nan | 0.7530 | 0.0 | 0.7530 |
| 0.3473 | 33.0 | 3300 | 0.5723 | 0.3702 | 0.7405 | 0.7405 | nan | 0.7405 | 0.0 | 0.7405 |
| 0.3454 | 34.0 | 3400 | 0.5645 | 0.3953 | 0.7907 | 0.7907 | nan | 0.7907 | 0.0 | 0.7907 |
| 0.3466 | 35.0 | 3500 | 0.5618 | 0.3832 | 0.7663 | 0.7663 | nan | 0.7663 | 0.0 | 0.7663 |
| 0.3451 | 36.0 | 3600 | 0.5704 | 0.3535 | 0.7070 | 0.7070 | nan | 0.7070 | 0.0 | 0.7070 |
| 0.3459 | 37.0 | 3700 | 0.5625 | 0.3714 | 0.7427 | 0.7427 | nan | 0.7427 | 0.0 | 0.7427 |
| 0.345 | 38.0 | 3800 | 0.5720 | 0.3567 | 0.7135 | 0.7135 | nan | 0.7135 | 0.0 | 0.7135 |
| 0.3448 | 39.0 | 3900 | 0.5719 | 0.3688 | 0.7376 | 0.7376 | nan | 0.7376 | 0.0 | 0.7376 |
| 0.3444 | 40.0 | 4000 | 0.5646 | 0.3809 | 0.7618 | 0.7618 | nan | 0.7618 | 0.0 | 0.7618 |
| 0.343 | 41.0 | 4100 | 0.5525 | 0.3833 | 0.7665 | 0.7665 | nan | 0.7665 | 0.0 | 0.7665 |
| 0.3438 | 42.0 | 4200 | 0.5547 | 0.3888 | 0.7777 | 0.7777 | nan | 0.7777 | 0.0 | 0.7777 |
| 0.3425 | 43.0 | 4300 | 0.5579 | 0.3811 | 0.7622 | 0.7622 | nan | 0.7622 | 0.0 | 0.7622 |
| 0.3439 | 44.0 | 4400 | 0.5756 | 0.3577 | 0.7153 | 0.7153 | nan | 0.7153 | 0.0 | 0.7153 |
| 0.3422 | 45.0 | 4500 | 0.5612 | 0.3775 | 0.7550 | 0.7550 | nan | 0.7550 | 0.0 | 0.7550 |
| 0.3418 | 46.0 | 4600 | 0.5654 | 0.3783 | 0.7567 | 0.7567 | nan | 0.7567 | 0.0 | 0.7567 |
| 0.3408 | 47.0 | 4700 | 0.5787 | 0.3764 | 0.7529 | 0.7529 | nan | 0.7529 | 0.0 | 0.7529 |
| 0.3408 | 48.0 | 4800 | 0.5709 | 0.3717 | 0.7435 | 0.7435 | nan | 0.7435 | 0.0 | 0.7435 |
| 0.343 | 49.0 | 4900 | 0.5771 | 0.3472 | 0.6944 | 0.6944 | nan | 0.6944 | 0.0 | 0.6944 |
| 0.3395 | 50.0 | 5000 | 0.5552 | 0.3786 | 0.7572 | 0.7572 | nan | 0.7572 | 0.0 | 0.7572 |
| 0.3394 | 51.0 | 5100 | 0.5626 | 0.3632 | 0.7264 | 0.7264 | nan | 0.7264 | 0.0 | 0.7264 |
| 0.3396 | 52.0 | 5200 | 0.5580 | 0.3849 | 0.7697 | 0.7697 | nan | 0.7697 | 0.0 | 0.7697 |
| 0.3396 | 53.0 | 5300 | 0.5599 | 0.3669 | 0.7338 | 0.7338 | nan | 0.7338 | 0.0 | 0.7338 |
| 0.3397 | 54.0 | 5400 | 0.5610 | 0.3740 | 0.7480 | 0.7480 | nan | 0.7480 | 0.0 | 0.7480 |
| 0.3385 | 55.0 | 5500 | 0.5594 | 0.3836 | 0.7671 | 0.7671 | nan | 0.7671 | 0.0 | 0.7671 |
| 0.338 | 56.0 | 5600 | 0.5567 | 0.3940 | 0.7881 | 0.7881 | nan | 0.7881 | 0.0 | 0.7881 |
| 0.3378 | 57.0 | 5700 | 0.5648 | 0.3753 | 0.7506 | 0.7506 | nan | 0.7506 | 0.0 | 0.7506 |
| 0.3375 | 58.0 | 5800 | 0.5605 | 0.3795 | 0.7589 | 0.7589 | nan | 0.7589 | 0.0 | 0.7589 |
| 0.3364 | 59.0 | 5900 | 0.5653 | 0.3839 | 0.7678 | 0.7678 | nan | 0.7678 | 0.0 | 0.7678 |
| 0.3373 | 60.0 | 6000 | 0.5649 | 0.3989 | 0.7978 | 0.7978 | nan | 0.7978 | 0.0 | 0.7978 |
| 0.3367 | 61.0 | 6100 | 0.5661 | 0.3809 | 0.7617 | 0.7617 | nan | 0.7617 | 0.0 | 0.7617 |
| 0.3368 | 62.0 | 6200 | 0.5739 | 0.3818 | 0.7637 | 0.7637 | nan | 0.7637 | 0.0 | 0.7637 |
| 0.3352 | 63.0 | 6300 | 0.5631 | 0.3965 | 0.7930 | 0.7930 | nan | 0.7930 | 0.0 | 0.7930 |
| 0.336 | 64.0 | 6400 | 0.5722 | 0.3745 | 0.7490 | 0.7490 | nan | 0.7490 | 0.0 | 0.7490 |
| 0.3352 | 65.0 | 6500 | 0.5622 | 0.3864 | 0.7728 | 0.7728 | nan | 0.7728 | 0.0 | 0.7728 |
| 0.3356 | 66.0 | 6600 | 0.5627 | 0.3816 | 0.7631 | 0.7631 | nan | 0.7631 | 0.0 | 0.7631 |
| 0.3338 | 67.0 | 6700 | 0.5616 | 0.3741 | 0.7483 | 0.7483 | nan | 0.7483 | 0.0 | 0.7483 |
| 0.3343 | 68.0 | 6800 | 0.5657 | 0.3706 | 0.7412 | 0.7412 | nan | 0.7412 | 0.0 | 0.7412 |
| 0.3343 | 69.0 | 6900 | 0.5603 | 0.3805 | 0.7610 | 0.7610 | nan | 0.7610 | 0.0 | 0.7610 |
| 0.3345 | 70.0 | 7000 | 0.5608 | 0.3872 | 0.7744 | 0.7744 | nan | 0.7744 | 0.0 | 0.7744 |
| 0.3339 | 71.0 | 7100 | 0.5668 | 0.3855 | 0.7710 | 0.7710 | nan | 0.7710 | 0.0 | 0.7710 |
| 0.3342 | 72.0 | 7200 | 0.5625 | 0.3954 | 0.7909 | 0.7909 | nan | 0.7909 | 0.0 | 0.7909 |
| 0.3334 | 73.0 | 7300 | 0.5556 | 0.3790 | 0.7579 | 0.7579 | nan | 0.7579 | 0.0 | 0.7579 |
| 0.3336 | 74.0 | 7400 | 0.5555 | 0.3819 | 0.7639 | 0.7639 | nan | 0.7639 | 0.0 | 0.7639 |
| 0.3341 | 75.0 | 7500 | 0.5574 | 0.3782 | 0.7563 | 0.7563 | nan | 0.7563 | 0.0 | 0.7563 |
| 0.3326 | 76.0 | 7600 | 0.5628 | 0.3701 | 0.7401 | 0.7401 | nan | 0.7401 | 0.0 | 0.7401 |
| 0.3325 | 77.0 | 7700 | 0.5575 | 0.3818 | 0.7635 | 0.7635 | nan | 0.7635 | 0.0 | 0.7635 |
| 0.3333 | 78.0 | 7800 | 0.5515 | 0.3835 | 0.7670 | 0.7670 | nan | 0.7670 | 0.0 | 0.7670 |
| 0.3327 | 79.0 | 7900 | 0.5540 | 0.3776 | 0.7552 | 0.7552 | nan | 0.7552 | 0.0 | 0.7552 |
| 0.332 | 80.0 | 8000 | 0.5584 | 0.3852 | 0.7705 | 0.7705 | nan | 0.7705 | 0.0 | 0.7705 |
| 0.3322 | 81.0 | 8100 | 0.5616 | 0.3770 | 0.7540 | 0.7540 | nan | 0.7540 | 0.0 | 0.7540 |
| 0.3316 | 82.0 | 8200 | 0.5555 | 0.3793 | 0.7585 | 0.7585 | nan | 0.7585 | 0.0 | 0.7585 |
| 0.3325 | 83.0 | 8300 | 0.5567 | 0.3809 | 0.7618 | 0.7618 | nan | 0.7618 | 0.0 | 0.7618 |
| 0.3321 | 84.0 | 8400 | 0.5557 | 0.3803 | 0.7606 | 0.7606 | nan | 0.7606 | 0.0 | 0.7606 |
| 0.3314 | 85.0 | 8500 | 0.5545 | 0.3776 | 0.7553 | 0.7553 | nan | 0.7553 | 0.0 | 0.7553 |
| 0.3315 | 86.0 | 8600 | 0.5552 | 0.3805 | 0.7610 | 0.7610 | nan | 0.7610 | 0.0 | 0.7610 |
| 0.3313 | 87.0 | 8700 | 0.5550 | 0.3751 | 0.7501 | 0.7501 | nan | 0.7501 | 0.0 | 0.7501 |
| 0.3308 | 88.0 | 8800 | 0.5552 | 0.3839 | 0.7678 | 0.7678 | nan | 0.7678 | 0.0 | 0.7678 |
| 0.3315 | 89.0 | 8900 | 0.5547 | 0.3799 | 0.7598 | 0.7598 | nan | 0.7598 | 0.0 | 0.7598 |
| 0.3312 | 90.0 | 9000 | 0.5567 | 0.3771 | 0.7543 | 0.7543 | nan | 0.7543 | 0.0 | 0.7543 |
| 0.3314 | 91.0 | 9100 | 0.5536 | 0.3798 | 0.7597 | 0.7597 | nan | 0.7597 | 0.0 | 0.7597 |
| 0.3312 | 92.0 | 9200 | 0.5550 | 0.3789 | 0.7578 | 0.7578 | nan | 0.7578 | 0.0 | 0.7578 |
| 0.3308 | 93.0 | 9300 | 0.5555 | 0.3798 | 0.7596 | 0.7596 | nan | 0.7596 | 0.0 | 0.7596 |
| 0.3301 | 94.0 | 9400 | 0.5590 | 0.3784 | 0.7568 | 0.7568 | nan | 0.7568 | 0.0 | 0.7568 |
| 0.3306 | 95.0 | 9500 | 0.5563 | 0.3831 | 0.7662 | 0.7662 | nan | 0.7662 | 0.0 | 0.7662 |
| 0.3312 | 96.0 | 9600 | 0.5599 | 0.3758 | 0.7517 | 0.7517 | nan | 0.7517 | 0.0 | 0.7517 |
| 0.3309 | 97.0 | 9700 | 0.5552 | 0.3832 | 0.7663 | 0.7663 | nan | 0.7663 | 0.0 | 0.7663 |
| 0.331 | 98.0 | 9800 | 0.5558 | 0.3867 | 0.7734 | 0.7734 | nan | 0.7734 | 0.0 | 0.7734 |
| 0.3306 | 99.0 | 9900 | 0.5550 | 0.3849 | 0.7698 | 0.7698 | nan | 0.7698 | 0.0 | 0.7698 |
| 0.3307 | 100.0 | 10000 | 0.5558 | 0.3834 | 0.7668 | 0.7668 | nan | 0.7668 | 0.0 | 0.7668 |

Framework versions

  • Transformers 4.52.0.dev0
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1