# Keisyahsq/BC2GM_BERT
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset (presumably the BC2GM gene-mention corpus, given the model name). It achieves the following results on the evaluation set:
- Train Loss: 0.0031
- Validation Loss: 0.1601
- Train Precision: 0.8543
- Train Recall: 0.8713
- Train F1: 0.8627
- Train Accuracy: 0.9712
- Epoch: 99
## Model description
More information needed
## Intended uses & limitations
More information needed
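No usage notes were provided. Assuming the checkpoint is a TensorFlow token-classification model for gene-mention NER (consistent with the BC2GM name and the TensorFlow version listed under Framework versions), inference would look roughly like the sketch below; the entity label names come from the checkpoint's config and are not documented here.

```python
from transformers import AutoTokenizer, TFAutoModelForTokenClassification, pipeline

# Assumption: Keisyahsq/BC2GM_BERT is a TF token-classification checkpoint.
model_id = "Keisyahsq/BC2GM_BERT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges WordPiece sub-tokens back into
# whole entity spans before reporting them.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("Mutations in the BRCA1 gene increase breast cancer risk."))
```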
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (beta_1=0.9, beta_2=0.999, epsilon=1e-08, amsgrad=False, weight_decay_rate=0.01) with a PolynomialDecay learning-rate schedule from 2e-05 down to 0.0 over 15,620 steps (power=1.0, no cycling)
- training_precision: float32
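The serialized optimizer above matches what `transformers.create_optimizer` produces for TensorFlow. A minimal sketch of how it could be recreated follows; it is an assumption that training used this helper, and the values are taken from the configuration above.

```python
from transformers import create_optimizer

# AdamWeightDecay with a linear (power=1.0) PolynomialDecay from 2e-05
# down to 0.0; 15,620 is the decay_steps value recorded above.
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-05,
    num_train_steps=15_620,
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```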
### Training results

Validation metrics plateau from epoch 9 onward. This is consistent with the learning-rate schedule above: 15,620 decay steps at roughly 1,562 steps per epoch means the PolynomialDecay reaches its end learning rate of 0.0 after about ten epochs, so later epochs barely update the weights.
| Train Loss | Validation Loss | Train Precision | Train Recall | Train F1 | Train Accuracy | Epoch |
|---|---|---|---|---|---|---|
| 0.1462 | 0.0941 | 0.7837 | 0.8515 | 0.8162 | 0.9639 | 0 |
| 0.0726 | 0.0885 | 0.8363 | 0.8419 | 0.8391 | 0.9680 | 1 |
| 0.0452 | 0.1000 | 0.8402 | 0.8563 | 0.8482 | 0.9689 | 2 |
| 0.0275 | 0.1060 | 0.8357 | 0.8658 | 0.8505 | 0.9689 | 3 |
| 0.0187 | 0.1246 | 0.8471 | 0.8662 | 0.8566 | 0.9691 | 4 |
| 0.0127 | 0.1268 | 0.8460 | 0.8725 | 0.8590 | 0.9703 | 5 |
| 0.0089 | 0.1451 | 0.8400 | 0.8780 | 0.8586 | 0.9702 | 6 |
| 0.0060 | 0.1555 | 0.8611 | 0.8558 | 0.8584 | 0.9713 | 7 |
| 0.0046 | 0.1541 | 0.8560 | 0.8732 | 0.8645 | 0.9713 | 8 |
| 0.0034 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 9 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 10 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 11 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 12 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 13 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 14 |
| 0.0027 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 15 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 16 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 17 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 18 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 19 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 20 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 21 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 22 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 23 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 24 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 25 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 26 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 27 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 28 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 29 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 30 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 31 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 32 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 33 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 34 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 35 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 36 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 37 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 38 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 39 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 40 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 41 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 42 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 43 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 44 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 45 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 46 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 47 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 48 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 49 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 50 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 51 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 52 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 53 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 54 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 55 |
| 0.0027 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 56 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 57 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 58 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 59 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 60 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 61 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 62 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 63 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 64 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 65 |
| 0.0027 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 66 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 67 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 68 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 69 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 70 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 71 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 72 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 73 |
| 0.0032 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 74 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 75 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 76 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 77 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 78 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 79 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 80 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 81 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 82 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 83 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 84 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 85 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 86 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 87 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 88 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 89 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 90 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 91 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 92 |
| 0.0028 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 93 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 94 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 95 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 96 |
| 0.0029 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 97 |
| 0.0030 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 98 |
| 0.0031 | 0.1601 | 0.8543 | 0.8713 | 0.8627 | 0.9712 | 99 |
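Entity-level precision, recall, and F1 for NER checkpoints like this one are conventionally computed with `seqeval`; a minimal sketch, assuming IOB2-style gene labels (the checkpoint's actual label set is not documented here):

```python
from seqeval.metrics import precision_score, recall_score, f1_score

# Hypothetical gold and predicted tag sequences in IOB2 format; the
# real label names depend on the checkpoint's config.
references = [["O", "B-GENE", "I-GENE", "O", "O"]]
predictions = [["O", "B-GENE", "I-GENE", "O", "B-GENE"]]

print(precision_score(references, predictions))  # entity-level precision
print(recall_score(references, predictions))     # entity-level recall
print(f1_score(references, predictions))         # entity-level F1
```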
### Framework versions
- Transformers 4.31.0
- TensorFlow 2.10.1
- Datasets 3.0.0
- Tokenizers 0.13.3