bomu-asr

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the SUDOPING01/DATASET-BO - NA dataset. It achieves the following results on the evaluation set (a short inference example follows the list):

  • Loss: 0.1682
  • WER (word error rate): 0.1458
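
For reference, a minimal transcription sketch is shown below. It is not an official example from this card: the repository id Panga-Azazia/bomu-asr is taken from the model tree, the file name example.wav is a placeholder, and it assumes the standard transformers wav2vec2 CTC setup with 16 kHz mono audio.

```python
# Minimal inference sketch (assumptions: repo id, file name, 16 kHz mono audio).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "Panga-Azazia/bomu-asr"  # assumed repository id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate expected by wav2vec2 models.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```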

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0005
  • train_batch_size: 12
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 2
  • total_train_batch_size: 24
  • total_eval_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 15.0
  • mixed_precision_training: Native AMP
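
For readers who want to set up a comparable run, the sketch below shows one way these hyperparameters could map onto transformers TrainingArguments. It is a reconstruction under assumptions, not the author's training script; the output_dir is a placeholder, and the fp16 flag stands in for the "Native AMP" mixed-precision setting.

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bomu-asr",           # placeholder output path
    learning_rate=5e-4,
    per_device_train_batch_size=12,  # 2 GPUs -> total train batch size 24
    per_device_eval_batch_size=8,    # 2 GPUs -> total eval batch size 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15.0,
    fp16=True,                       # "Native AMP" mixed precision
)
```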

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER    |
|---------------|---------|------|-----------------|--------|
| 3.4285        | 0.3356  | 100  | 3.4152          | 1.0    |
| 2.6743        | 0.6711  | 200  | 2.5978          | 0.9967 |
| 0.4489        | 1.0067  | 300  | 0.3998          | 0.4768 |
| 0.3195        | 1.3423  | 400  | 0.2667          | 0.3600 |
| 0.2897        | 1.6779  | 500  | 0.2343          | 0.3141 |
| 0.2234        | 2.0134  | 600  | 0.1843          | 0.2586 |
| 0.2174        | 2.3490  | 700  | 0.1698          | 0.2443 |
| 0.2425        | 2.6846  | 800  | 0.1626          | 0.2267 |
| 0.1674        | 3.0201  | 900  | 0.1450          | 0.2111 |
| 0.1351        | 3.3557  | 1000 | 0.1464          | 0.2070 |
| 0.1522        | 3.6913  | 1100 | 0.1405          | 0.2016 |
| 0.1385        | 4.0268  | 1200 | 0.1359          | 0.1903 |
| 0.1309        | 4.3624  | 1300 | 0.1349          | 0.1941 |
| 0.1358        | 4.6980  | 1400 | 0.1290          | 0.1837 |
| 0.0982        | 5.0336  | 1500 | 0.1240          | 0.1745 |
| 0.0923        | 5.3691  | 1600 | 0.1256          | 0.1786 |
| 0.1429        | 5.7047  | 1700 | 0.1235          | 0.1747 |
| 0.0903        | 6.0403  | 1800 | 0.1289          | 0.1718 |
| 0.0859        | 6.3758  | 1900 | 0.1205          | 0.1757 |
| 0.065         | 6.7114  | 2000 | 0.1188          | 0.1710 |
| 0.0834        | 7.0470  | 2100 | 0.1300          | 0.1708 |
| 0.0701        | 7.3826  | 2200 | 0.1258          | 0.1723 |
| 0.0682        | 7.7181  | 2300 | 0.1281          | 0.1648 |
| 0.0555        | 8.0537  | 2400 | 0.1237          | 0.1604 |
| 0.0662        | 8.3893  | 2500 | 0.1262          | 0.1588 |
| 0.0632        | 8.7248  | 2600 | 0.1291          | 0.1626 |
| 0.0448        | 9.0604  | 2700 | 0.1347          | 0.1620 |
| 0.0522        | 9.3960  | 2800 | 0.1351          | 0.1610 |
| 0.0552        | 9.7315  | 2900 | 0.1353          | 0.1604 |
| 0.0325        | 10.0671 | 3000 | 0.1470          | 0.1573 |
| 0.0378        | 10.4027 | 3100 | 0.1413          | 0.1581 |
| 0.0271        | 10.7383 | 3200 | 0.1469          | 0.1556 |
| 0.0434        | 11.0738 | 3300 | 0.1530          | 0.1559 |
| 0.0352        | 11.4094 | 3400 | 0.1546          | 0.1537 |
| 0.052         | 11.7450 | 3500 | 0.1512          | 0.1560 |
| 0.0803        | 12.0805 | 3600 | 0.1584          | 0.1523 |
| 0.0425        | 12.4161 | 3700 | 0.1588          | 0.1505 |
| 0.0327        | 12.7517 | 3800 | 0.1633          | 0.1507 |
| 0.0444        | 13.0872 | 3900 | 0.1660          | 0.1509 |
| 0.0275        | 13.4228 | 4000 | 0.1650          | 0.1481 |
| 0.0358        | 13.7584 | 4100 | 0.1616          | 0.1471 |
| 0.0338        | 14.0940 | 4200 | 0.1656          | 0.1466 |
| 0.0304        | 14.4295 | 4300 | 0.1653          | 0.1455 |
| 0.0268        | 14.7651 | 4400 | 0.1684          | 0.1456 |
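
The WER column above is the word error rate on the evaluation set. As a minimal sketch of how such a figure can be computed, the snippet below uses the jiwer package (an assumption; the card does not state which implementation was used) with made-up transcripts:

```python
# Word error rate sketch using jiwer; the transcripts are made-up examples.
import jiwer

references = ["a made-up reference transcript"]
hypotheses = ["a made-up predicted transcript"]
print(f"WER: {jiwer.wer(references, hypotheses):.4f}")
```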

Framework versions

  • Transformers 4.47.0
  • PyTorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0