
SIB200 CDA Model with Gemma

This model was trained on the SIB200 dataset using Counterfactual Data Augmentation (CDA) with counterfactuals generated by Gemma.

Training Parameters

  • Dataset: SIB200
  • Mode: CDA
  • Selection Model: Gemma
  • Selection Method: Random
  • Train Size: 700 examples
  • Epochs: 20
  • Batch Size: 8
  • Effective Batch Size: 32 (batch size 8 × 4 gradient accumulation steps)
  • Learning Rate: 8e-06
  • Patience: 8
  • Max Length: 192
  • Gradient Accumulation Steps: 4
  • Warmup Ratio: 0.1
  • Weight Decay: 0.01
  • Optimizer: AdamW
  • Scheduler: cosine_with_warmup
  • Random Seed: 42
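
For reference, the configuration above maps onto a standard Hugging Face fine-tuning setup roughly as follows. This is a minimal sketch, not the exact training script: the dataset ID (Davlan/sib200), its text/category column layout, and the use of the public splits are assumptions, and the CDA-augmented 700-example train set is not reproduced here.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    EarlyStoppingCallback,
    Trainer,
    TrainingArguments,
)

BASE = "bert-base-multilingual-cased"
dataset = load_dataset("Davlan/sib200", "eng_Latn")  # assumed dataset ID and config

# Map SIB-200's string category names to integer ids for the classification head.
labels = sorted(set(dataset["train"]["category"]))
label2id = {name: i for i, name in enumerate(labels)}

tokenizer = AutoTokenizer.from_pretrained(BASE)

def preprocess(batch):
    enc = tokenizer(batch["text"], truncation=True, max_length=192)
    enc["label"] = [label2id[c] for c in batch["category"]]
    return enc

dataset = dataset.map(preprocess, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    BASE,
    num_labels=len(labels),
    id2label={i: name for name, i in label2id.items()},
    label2id=label2id,
)

args = TrainingArguments(
    output_dir="sib200_mbert_cda_gemma",
    num_train_epochs=20,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,  # effective batch size: 8 * 4 = 32
    learning_rate=8e-6,
    warmup_ratio=0.1,
    weight_decay=0.01,
    lr_scheduler_type="cosine",     # cosine schedule with linear warmup
    seed=42,
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,    # required for early stopping
)

# Trainer's default optimizer is AdamW, matching the setting above.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    data_collator=DataCollatorWithPadding(tokenizer),
    callbacks=[EarlyStoppingCallback(early_stopping_patience=8)],
)
trainer.train()
```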

Performance

  • Overall Accuracy: 77.95% (unweighted mean of the six per-language accuracies below)
  • Overall Loss: 0.0237

Language-Specific Performance

  • English (EN): 86.87%
  • German (DE): 86.87%
  • Arabic (AR): 49.49%
  • Spanish (ES): 89.90%
  • Hindi (HI): 83.84%
  • Swahili (SW): 70.71%
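
The per-language scores can be checked with a loop like the one below. This is a hedged sketch: it assumes the checkpoint saved the SIB-200 category names in its id2label mapping and that the test sets load from Davlan/sib200 under FLORES-style config names; scores will match the table above only if the same test subset is used.

```python
from datasets import load_dataset
from transformers import pipeline

MODEL_ID = "fledor/sib200_mbert_cda_gemma_multilingual"
CONFIGS = {
    "EN": "eng_Latn", "DE": "deu_Latn", "AR": "arb_Arab",
    "ES": "spa_Latn", "HI": "hin_Deva", "SW": "swh_Latn",
}

clf = pipeline("text-classification", model=MODEL_ID)
for lang, cfg in CONFIGS.items():
    test = load_dataset("Davlan/sib200", cfg, split="test")
    preds = clf(test["text"], truncation=True, max_length=192)
    # Compare predicted label strings against the gold category column.
    acc = sum(p["label"] == gold for p, gold in zip(preds, test["category"])) / len(test)
    print(f"{lang}: {acc:.2%}")
```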

Model Information

  • Base Model: bert-base-multilingual-cased
  • Task: Topic Classification
  • Languages: 6 (EN, DE, AR, ES, HI, SW)
  • Parameters: ~0.2B (F32, safetensors)
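
A minimal usage example (the exact label strings returned depend on the id2label mapping saved with the checkpoint):

```python
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="fledor/sib200_mbert_cda_gemma_multilingual",
)
# Spanish input; expect a sports-like topic label for this sentence.
print(clf("El equipo ganó la final por 3 a 1."))
```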