# Whisper - 22 hours of training data
This model is one of three in a collection comparing different data selection methods, where the amount of training data (22 hours) is held constant across the models.
This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on the JASMIN-CGN dataset. It achieves the following results on the evaluation set (last logged evaluation, step 525):
- Loss: 0.4035
- WER: 19.7571
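Because this is a standard transformers checkpoint, it can be loaded with the usual speech-recognition pipeline. A minimal sketch, assuming the checkpoint is published on the Hub; the repository id below is a placeholder, not the actual id of this model:

```python
from transformers import pipeline

# Placeholder repo id: substitute the actual Hub id of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-jasmin-cgn",
    chunk_length_s=30,  # Whisper works on 30-second audio windows
)

# JASMIN-CGN is Dutch speech, so pin the language during decoding.
result = asr("sample.wav", generate_kwargs={"language": "dutch", "task": "transcribe"})
print(result["text"])
```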
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
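The concrete hyperparameter values did not survive in this copy of the card. For orientation only, a run like this is commonly configured through transformers' `Seq2SeqTrainingArguments`; everything below marked "placeholder" is illustrative, while the evaluation and logging intervals and the epoch count can be read off the results table that follows:

```python
from transformers import Seq2SeqTrainingArguments

# Orientation sketch: the actual hyperparameters of this run are not recorded here.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-jasmin-cgn",  # placeholder
    per_device_train_batch_size=16,              # placeholder
    learning_rate=1e-5,                          # placeholder
    warmup_steps=50,                             # placeholder
    num_train_epochs=3,          # the table ends near epoch 3 (step 525)
    fp16=True,                                   # placeholder
    eval_strategy="steps",       # `evaluation_strategy` on older transformers
    eval_steps=25,               # matches the 25-step eval interval in the table
    logging_steps=25,            # matches the 25-step logging interval in the table
    predict_with_generate=True,  # WER must be computed on generated transcripts
)

# These arguments are then passed to a Seq2SeqTrainer together with the model,
# processor, train/eval datasets, a data collator, and a compute_metrics function.
```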
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) |
|---|---|---|---|---|
| 1.0357 | 0.1374 | 25 | 1.2179 | 38.0448 |
| 1.0051 | 0.2747 | 50 | 1.1769 | 37.1523 |
| 0.9812 | 0.4121 | 75 | 1.0983 | 36.4377 |
| 0.8721 | 0.5495 | 100 | 1.0066 | 35.1360 |
| 0.8113 | 0.6868 | 125 | 0.9084 | 34.5087 |
| 0.7209 | 0.8242 | 150 | 0.8029 | 32.2676 |
| 0.6349 | 0.9615 | 175 | 0.6938 | 31.7241 |
| 0.5759 | 1.0989 | 200 | 0.6019 | 29.6105 |
| 0.5171 | 1.2363 | 225 | 0.5404 | 26.2220 |
| 0.4895 | 1.3736 | 250 | 0.4981 | 25.0679 |
| 0.458 | 1.5110 | 275 | 0.4668 | 23.3066 |
| 0.4482 | 1.6484 | 300 | 0.4468 | 22.2230 |
| 0.4274 | 1.7857 | 325 | 0.4336 | 21.5620 |
| 0.4331 | 1.9231 | 350 | 0.4246 | 20.5455 |
| 0.418 | 2.0604 | 375 | 0.4185 | 20.5120 |
| 0.4347 | 2.1978 | 400 | 0.4140 | 20.0456 |
| 0.4128 | 2.3352 | 425 | 0.4101 | 19.9014 |
| 0.403 | 2.4725 | 450 | 0.4074 | 19.1834 |
| 0.4331 | 2.6099 | 475 | 0.4055 | 19.1230 |
| 0.3932 | 2.7473 | 500 | 0.4043 | 19.7370 |
| 0.4123 | 2.8846 | 525 | 0.4035 | 19.7571 |
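The WER column reports word error rate in percent. For reference, a minimal sketch of how such a score is computed with the `evaluate` library (assumes `pip install evaluate jiwer`):

```python
import evaluate

# Load the word-error-rate metric (backed by jiwer).
wer_metric = evaluate.load("wer")

predictions = ["de kat zat op de mat"]
references = ["de kat zit op de mat"]

# evaluate returns a fraction; multiply by 100 to match the table's percent scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}%")  # one substitution over six words -> 16.6667%
```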