
whisper-20hrs-meta

This model is a fine-tuned version of openai/whisper-large-v2 on the JASMIN-CGN dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4035
  • WER: 19.7571
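Because the framework list below includes PEFT, this checkpoint is an adapter on top of openai/whisper-large-v2 rather than a full set of weights. A minimal loading sketch, assuming the hub id greenw0lf/whisper-20hrs-meta; the `audio` variable is a placeholder for a 16 kHz mono waveform:

```python
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

# Load the frozen base model, then attach the fine-tuned adapter weights.
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
model = PeftModel.from_pretrained(base, "greenw0lf/whisper-20hrs-meta")
processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")

# `audio` is a hypothetical 1-D float array sampled at 16 kHz.
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```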

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 48
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 55
  • num_epochs: 3.0
  • mixed_precision_training: Native AMP
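A sketch of Seq2SeqTrainingArguments reproducing the list above; the output_dir path is hypothetical, and fp16=True stands in for the "Native AMP" entry:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-20hrs-meta",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=55,
    num_train_epochs=3.0,
    fp16=True,  # mixed precision via native AMP
)
```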

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 1.0357        | 0.1374 | 25   | 1.2179          | 38.0448 |
| 1.0051        | 0.2747 | 50   | 1.1769          | 37.1523 |
| 0.9812        | 0.4121 | 75   | 1.0983          | 36.4377 |
| 0.8721        | 0.5495 | 100  | 1.0066          | 35.1360 |
| 0.8113        | 0.6868 | 125  | 0.9084          | 34.5087 |
| 0.7209        | 0.8242 | 150  | 0.8029          | 32.2676 |
| 0.6349        | 0.9615 | 175  | 0.6938          | 31.7241 |
| 0.5759        | 1.0989 | 200  | 0.6019          | 29.6105 |
| 0.5171        | 1.2363 | 225  | 0.5404          | 26.2220 |
| 0.4895        | 1.3736 | 250  | 0.4981          | 25.0679 |
| 0.458         | 1.5110 | 275  | 0.4668          | 23.3066 |
| 0.4482        | 1.6484 | 300  | 0.4468          | 22.2230 |
| 0.4274        | 1.7857 | 325  | 0.4336          | 21.5620 |
| 0.4331        | 1.9231 | 350  | 0.4246          | 20.5455 |
| 0.418         | 2.0604 | 375  | 0.4185          | 20.5120 |
| 0.4347        | 2.1978 | 400  | 0.4140          | 20.0456 |
| 0.4128        | 2.3352 | 425  | 0.4101          | 19.9014 |
| 0.403         | 2.4725 | 450  | 0.4074          | 19.1834 |
| 0.4331        | 2.6099 | 475  | 0.4055          | 19.1230 |
| 0.3932        | 2.7473 | 500  | 0.4043          | 19.7370 |
| 0.4123        | 2.8846 | 525  | 0.4035          | 19.7571 |
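The WER column is a percentage word error rate. The card does not state how it was computed, but a common recipe uses the evaluate library; the transcripts below are hypothetical:

```python
import evaluate

# Hypothetical decoded predictions and reference transcripts (Dutch, as in JASMIN-CGN).
predictions = ["de kat zat op de mat"]
references = ["de kat zit op de mat"]

wer = evaluate.load("wer")
print(100 * wer.compute(predictions=predictions, references=references))
# One substitution over six reference words -> about 16.67
```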

Framework versions

  • PEFT 0.16.0
  • Transformers 4.52.0
  • PyTorch 2.7.1+cu126
  • Datasets 3.6.0
  • Tokenizers 0.21.2