# TR-Fin-Table-Structure-HoixiFinetuned-Overdose
This model is a fine-tuned version of microsoft/table-transformer-structure-recognition-v1.1-all on the tr-fin_table dataset. It achieves the following results on the evaluation set:
- Loss: 1.3453
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
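The optimizer and schedule listed above can be reproduced with plain PyTorch. This is a minimal sketch, assuming no warmup and no gradient accumulation (neither is mentioned in the card), with a toy module standing in for the table transformer:

```python
import torch

# Stand-in for the fine-tuned table transformer (illustration only).
model = torch.nn.Linear(4, 2)

# Hyperparameters as listed in the card.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-05, betas=(0.9, 0.999), eps=1e-08
)

# Linear decay from the initial LR to zero over the 10,000 training steps.
scheduler = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=1.0, end_factor=0.0, total_iters=10_000
)

# Advance 100 steps to show the schedule decaying the learning rate.
for _ in range(100):
    optimizer.step()
    scheduler.step()

print(optimizer.param_groups[0]["lr"])  # ≈ 1e-05 * (1 - 100/10000) = 9.9e-06
```

Mixed precision (Native AMP) would additionally wrap the forward pass in `torch.autocast` and scale gradients with `torch.cuda.amp.GradScaler`, which is omitted here for brevity.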
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:---|:---|:---|:---|
2.5699 | 12.5 | 50 | 1.8167 |
2.4601 | 25.0 | 100 | 1.6425 |
2.2612 | 37.5 | 150 | 1.6247 |
2.2122 | 50.0 | 200 | 1.5704 |
1.8015 | 62.5 | 250 | 1.5473 |
2.0555 | 75.0 | 300 | 1.5278 |
2.3249 | 87.5 | 350 | 1.5308 |
1.4308 | 100.0 | 400 | 1.5316 |
1.9009 | 112.5 | 450 | 1.5131 |
2.2087 | 125.0 | 500 | 1.5006 |
2.0473 | 137.5 | 550 | 1.5212 |
2.5294 | 150.0 | 600 | 1.5379 |
1.9833 | 162.5 | 650 | 1.5652 |
1.8314 | 175.0 | 700 | 1.4976 |
2.4367 | 187.5 | 750 | 1.5416 |
1.9989 | 200.0 | 800 | 1.5043 |
1.8754 | 212.5 | 850 | 1.5266 |
2.0064 | 225.0 | 900 | 1.5051 |
2.0185 | 237.5 | 950 | 1.5313 |
1.6792 | 250.0 | 1000 | 1.4967 |
1.7105 | 262.5 | 1050 | 1.4732 |
1.9245 | 275.0 | 1100 | 1.5032 |
1.4862 | 287.5 | 1150 | 1.5018 |
2.1234 | 300.0 | 1200 | 1.4667 |
1.7904 | 312.5 | 1250 | 1.4761 |
2.4147 | 325.0 | 1300 | 1.4684 |
1.8115 | 337.5 | 1350 | 1.4711 |
1.9528 | 350.0 | 1400 | 1.4457 |
1.9951 | 362.5 | 1450 | 1.4618 |
1.5924 | 375.0 | 1500 | 1.4597 |
1.9166 | 387.5 | 1550 | 1.4513 |
1.8729 | 400.0 | 1600 | 1.4332 |
2.0891 | 412.5 | 1650 | 1.4299 |
2.1724 | 425.0 | 1700 | 1.4578 |
2.0618 | 437.5 | 1750 | 1.4791 |
1.7166 | 450.0 | 1800 | 1.4734 |
2.027 | 462.5 | 1850 | 1.4736 |
1.7804 | 475.0 | 1900 | 1.4461 |
2.3921 | 487.5 | 1950 | 1.4398 |
1.8792 | 500.0 | 2000 | 1.4345 |
2.1287 | 512.5 | 2050 | 1.4477 |
1.5598 | 525.0 | 2100 | 1.4608 |
1.7381 | 537.5 | 2150 | 1.4428 |
1.8059 | 550.0 | 2200 | 1.4423 |
1.7971 | 562.5 | 2250 | 1.4109 |
1.6301 | 575.0 | 2300 | 1.4341 |
1.9655 | 587.5 | 2350 | 1.4502 |
1.4477 | 600.0 | 2400 | 1.4378 |
1.7368 | 612.5 | 2450 | 1.4445 |
1.9277 | 625.0 | 2500 | 1.4349 |
1.8093 | 637.5 | 2550 | 1.4542 |
1.8594 | 650.0 | 2600 | 1.4462 |
1.7637 | 662.5 | 2650 | 1.4250 |
1.9192 | 675.0 | 2700 | 1.4681 |
1.86 | 687.5 | 2750 | 1.4827 |
1.8954 | 700.0 | 2800 | 1.4187 |
1.4728 | 712.5 | 2850 | 1.4129 |
1.6828 | 725.0 | 2900 | 1.4113 |
1.7694 | 737.5 | 2950 | 1.4035 |
1.805 | 750.0 | 3000 | 1.4134 |
1.8506 | 762.5 | 3050 | 1.4135 |
1.8127 | 775.0 | 3100 | 1.4057 |
1.9829 | 787.5 | 3150 | 1.4102 |
2.0491 | 800.0 | 3200 | 1.4216 |
1.549 | 812.5 | 3250 | 1.3648 |
1.8095 | 825.0 | 3300 | 1.4064 |
1.6556 | 837.5 | 3350 | 1.3776 |
1.5418 | 850.0 | 3400 | 1.3942 |
1.5569 | 862.5 | 3450 | 1.4009 |
2.3074 | 875.0 | 3500 | 1.4011 |
1.6733 | 887.5 | 3550 | 1.4032 |
1.9263 | 900.0 | 3600 | 1.3799 |
1.9946 | 912.5 | 3650 | 1.3972 |
1.4845 | 925.0 | 3700 | 1.3853 |
1.9587 | 937.5 | 3750 | 1.4146 |
1.7828 | 950.0 | 3800 | 1.4037 |
2.012 | 962.5 | 3850 | 1.4235 |
1.817 | 975.0 | 3900 | 1.4007 |
2.0893 | 987.5 | 3950 | 1.4095 |
2.1338 | 1000.0 | 4000 | 1.3824 |
1.8228 | 1012.5 | 4050 | 1.3843 |
1.6272 | 1025.0 | 4100 | 1.4122 |
1.6202 | 1037.5 | 4150 | 1.3909 |
1.4482 | 1050.0 | 4200 | 1.3589 |
1.949 | 1062.5 | 4250 | 1.3605 |
2.0954 | 1075.0 | 4300 | 1.3869 |
1.4728 | 1087.5 | 4350 | 1.3944 |
1.5916 | 1100.0 | 4400 | 1.3825 |
1.7988 | 1112.5 | 4450 | 1.3682 |
1.5051 | 1125.0 | 4500 | 1.3719 |
1.8492 | 1137.5 | 4550 | 1.4000 |
1.6146 | 1150.0 | 4600 | 1.3886 |
1.9732 | 1162.5 | 4650 | 1.3769 |
1.7256 | 1175.0 | 4700 | 1.3717 |
1.9683 | 1187.5 | 4750 | 1.3849 |
1.6818 | 1200.0 | 4800 | 1.3951 |
1.5879 | 1212.5 | 4850 | 1.3903 |
1.8743 | 1225.0 | 4900 | 1.3988 |
1.7887 | 1237.5 | 4950 | 1.3970 |
1.7302 | 1250.0 | 5000 | 1.3774 |
1.6503 | 1262.5 | 5050 | 1.4183 |
1.6207 | 1275.0 | 5100 | 1.3892 |
1.9589 | 1287.5 | 5150 | 1.4226 |
1.9163 | 1300.0 | 5200 | 1.4142 |
1.869 | 1312.5 | 5250 | 1.3777 |
1.601 | 1325.0 | 5300 | 1.3743 |
1.5548 | 1337.5 | 5350 | 1.3871 |
1.6482 | 1350.0 | 5400 | 1.4068 |
1.545 | 1362.5 | 5450 | 1.4012 |
1.292 | 1375.0 | 5500 | 1.4138 |
1.5313 | 1387.5 | 5550 | 1.4066 |
1.5981 | 1400.0 | 5600 | 1.4022 |
1.6622 | 1412.5 | 5650 | 1.4069 |
1.7446 | 1425.0 | 5700 | 1.3957 |
1.9459 | 1437.5 | 5750 | 1.4085 |
1.6468 | 1450.0 | 5800 | 1.4191 |
1.6107 | 1462.5 | 5850 | 1.3973 |
1.5986 | 1475.0 | 5900 | 1.3834 |
1.6157 | 1487.5 | 5950 | 1.3983 |
1.7203 | 1500.0 | 6000 | 1.3696 |
1.7985 | 1512.5 | 6050 | 1.3884 |
1.9865 | 1525.0 | 6100 | 1.3951 |
1.5754 | 1537.5 | 6150 | 1.3935 |
1.7058 | 1550.0 | 6200 | 1.3856 |
1.7909 | 1562.5 | 6250 | 1.3916 |
2.0516 | 1575.0 | 6300 | 1.3532 |
1.787 | 1587.5 | 6350 | 1.4099 |
1.6804 | 1600.0 | 6400 | 1.4122 |
1.8824 | 1612.5 | 6450 | 1.3876 |
1.4672 | 1625.0 | 6500 | 1.3845 |
1.5871 | 1637.5 | 6550 | 1.3900 |
1.899 | 1650.0 | 6600 | 1.3777 |
1.3322 | 1662.5 | 6650 | 1.3765 |
1.6055 | 1675.0 | 6700 | 1.3556 |
2.226 | 1687.5 | 6750 | 1.3798 |
1.3981 | 1700.0 | 6800 | 1.3695 |
1.6295 | 1712.5 | 6850 | 1.3579 |
1.5333 | 1725.0 | 6900 | 1.3714 |
1.5442 | 1737.5 | 6950 | 1.3709 |
1.2871 | 1750.0 | 7000 | 1.3615 |
1.6814 | 1762.5 | 7050 | 1.3742 |
1.4199 | 1775.0 | 7100 | 1.3683 |
1.6349 | 1787.5 | 7150 | 1.3593 |
1.4781 | 1800.0 | 7200 | 1.3633 |
1.9904 | 1812.5 | 7250 | 1.3705 |
1.6171 | 1825.0 | 7300 | 1.3768 |
1.7736 | 1837.5 | 7350 | 1.3753 |
1.7629 | 1850.0 | 7400 | 1.3719 |
1.6829 | 1862.5 | 7450 | 1.3687 |
1.4467 | 1875.0 | 7500 | 1.3606 |
1.8322 | 1887.5 | 7550 | 1.3759 |
1.9977 | 1900.0 | 7600 | 1.3839 |
1.6281 | 1912.5 | 7650 | 1.3877 |
1.4727 | 1925.0 | 7700 | 1.3922 |
1.739 | 1937.5 | 7750 | 1.3922 |
2.0781 | 1950.0 | 7800 | 1.4001 |
1.8195 | 1962.5 | 7850 | 1.3875 |
1.7775 | 1975.0 | 7900 | 1.3743 |
1.5131 | 1987.5 | 7950 | 1.3774 |
1.5687 | 2000.0 | 8000 | 1.3767 |
1.6019 | 2012.5 | 8050 | 1.3773 |
1.2421 | 2025.0 | 8100 | 1.3663 |
1.5391 | 2037.5 | 8150 | 1.3599 |
1.8665 | 2050.0 | 8200 | 1.3744 |
1.7484 | 2062.5 | 8250 | 1.3667 |
1.5384 | 2075.0 | 8300 | 1.3483 |
1.4885 | 2087.5 | 8350 | 1.3664 |
1.8017 | 2100.0 | 8400 | 1.3662 |
1.4904 | 2112.5 | 8450 | 1.3577 |
1.6576 | 2125.0 | 8500 | 1.3727 |
1.5057 | 2137.5 | 8550 | 1.3647 |
1.8728 | 2150.0 | 8600 | 1.3558 |
1.8287 | 2162.5 | 8650 | 1.3604 |
1.4705 | 2175.0 | 8700 | 1.3586 |
1.6126 | 2187.5 | 8750 | 1.3818 |
1.6838 | 2200.0 | 8800 | 1.3756 |
1.5985 | 2212.5 | 8850 | 1.3683 |
1.9316 | 2225.0 | 8900 | 1.3554 |
1.7605 | 2237.5 | 8950 | 1.3485 |
1.8473 | 2250.0 | 9000 | 1.3679 |
1.5161 | 2262.5 | 9050 | 1.3440 |
1.38 | 2275.0 | 9100 | 1.3578 |
1.2987 | 2287.5 | 9150 | 1.3477 |
1.6364 | 2300.0 | 9200 | 1.3497 |
1.3951 | 2312.5 | 9250 | 1.3630 |
1.3344 | 2325.0 | 9300 | 1.3498 |
1.3916 | 2337.5 | 9350 | 1.3503 |
1.7832 | 2350.0 | 9400 | 1.3502 |
1.377 | 2362.5 | 9450 | 1.3512 |
1.3797 | 2375.0 | 9500 | 1.3507 |
1.4729 | 2387.5 | 9550 | 1.3533 |
1.5299 | 2400.0 | 9600 | 1.3544 |
1.6858 | 2412.5 | 9650 | 1.3447 |
1.3794 | 2425.0 | 9700 | 1.3432 |
1.8406 | 2437.5 | 9750 | 1.3449 |
1.8643 | 2450.0 | 9800 | 1.3394 |
1.5886 | 2462.5 | 9850 | 1.3452 |
2.065 | 2475.0 | 9900 | 1.3461 |
1.7918 | 2487.5 | 9950 | 1.3447 |
1.3398 | 2500.0 | 10000 | 1.3453 |
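As a sanity check on the logged columns, the Epoch/Step ratio in the table implies a very small training set. A quick back-of-the-envelope, assuming no gradient accumulation (not stated in the card):

```python
# First logged row above: 50 optimizer steps correspond to 12.5 epochs.
train_batch_size = 4          # from the hyperparameters section
steps, epochs = 50, 12.5

steps_per_epoch = steps / epochs                            # 4 steps per epoch
approx_train_samples = steps_per_epoch * train_batch_size   # ~16 samples

print(steps_per_epoch, approx_train_samples)  # → 4.0 16.0
```

Sixteen-odd training examples over 2,500 epochs explains the noisy training-loss column and the slow, modest improvement in validation loss (1.82 → 1.35).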
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.0