# train_rte_1744902659
This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) on the RTE (Recognizing Textual Entailment) dataset. It achieves the following results on the evaluation set:
- Loss: 0.1560
- Num input tokens seen: 98761256
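The checkpoint was trained with PEFT (see the framework versions below), so the repository most likely contains a lightweight adapter rather than full model weights; the specific PEFT method is not documented in this card. A minimal loading sketch, assuming the adapter lives at `rbelanec/train_rte_1744902659`, that you have access to the gated base model, and an illustrative RTE-style prompt (the prompt format used during fine-tuning is not documented here):

```python
# Minimal inference sketch. The prompt template below is an assumption;
# match whatever format was used during fine-tuning for best results.
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

model = AutoPeftModelForCausalLM.from_pretrained(
    "rbelanec/train_rte_1744902659",   # adapter repo; base model is resolved automatically
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

prompt = (
    "Premise: No weapons of mass destruction have been found in Iraq yet.\n"
    "Hypothesis: Weapons of mass destruction were found in Iraq.\n"
    "Does the premise entail the hypothesis? Answer yes or no."
)
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

with torch.no_grad():
    out = model.generate(inputs, max_new_tokens=8)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```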
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (reconstructed as a `TrainingArguments` sketch after this list):
- learning_rate: 0.3
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- training_steps: 40000
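A hedged reconstruction of these settings as `transformers` `TrainingArguments` (a sketch only: the actual training script and PEFT configuration are not included in this card, and `output_dir` is assumed):

```python
# Sketch of the listed hyperparameters; unlisted fields are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_rte_1744902659",  # assumed
    learning_rate=0.3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=123,
    gradient_accumulation_steps=4,      # 4 per device x 4 steps = 16 total
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    max_steps=40000,
)
```

The learning rate of 0.3 is far higher than typical for full fine-tuning, which is consistent with a PEFT method that updates only a small set of parameters.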
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.1763 | 1.4207 | 200 | 0.1636 | 496688 |
| 0.1578 | 2.8414 | 400 | 0.1577 | 991488 |
| 0.1612 | 4.2567 | 600 | 0.1604 | 1481464 |
| 0.155 | 5.6774 | 800 | 0.1737 | 1979088 |
| 0.1511 | 7.0927 | 1000 | 0.1661 | 2468504 |
| 0.1637 | 8.5134 | 1200 | 0.1808 | 2963120 |
| 0.1604 | 9.9340 | 1400 | 0.1603 | 3459048 |
| 0.1702 | 11.3494 | 1600 | 0.1652 | 3951104 |
| 0.149 | 12.7701 | 1800 | 0.1587 | 4445432 |
| 0.1563 | 14.1854 | 2000 | 0.1727 | 4938824 |
| 0.152 | 15.6061 | 2200 | 0.1624 | 5433720 |
| 0.1747 | 17.0214 | 2400 | 0.2194 | 5925896 |
| 0.1508 | 18.4421 | 2600 | 0.1591 | 6422360 |
| 0.1544 | 19.8627 | 2800 | 0.1613 | 6914152 |
| 0.1563 | 21.2781 | 3000 | 0.1628 | 7403976 |
| 0.1528 | 22.6988 | 3200 | 0.1650 | 7902520 |
| 0.1546 | 24.1141 | 3400 | 0.1616 | 8394080 |
| 0.1574 | 25.5348 | 3600 | 0.1560 | 8884224 |
| 0.156 | 26.9554 | 3800 | 0.1587 | 9382368 |
| 0.1586 | 28.3708 | 4000 | 0.1588 | 9872768 |
| 0.1482 | 29.7914 | 4200 | 0.1590 | 10366000 |
| 0.1513 | 31.2068 | 4400 | 0.1579 | 10867488 |
| 0.1604 | 32.6275 | 4600 | 0.1601 | 11358568 |
| 0.1562 | 34.0428 | 4800 | 0.1665 | 11852320 |
| 0.1522 | 35.4635 | 5000 | 0.1580 | 12343880 |
| 0.1504 | 36.8841 | 5200 | 0.1618 | 12837040 |
| 0.175 | 38.2995 | 5400 | 0.1635 | 13329368 |
| 0.148 | 39.7201 | 5600 | 0.1629 | 13828784 |
| 0.1459 | 41.1355 | 5800 | 0.1620 | 14315304 |
| 0.1527 | 42.5561 | 6000 | 0.1631 | 14806592 |
| 0.1525 | 43.9768 | 6200 | 0.1638 | 15305208 |
| 0.2929 | 45.3922 | 6400 | 0.2869 | 15791608 |
| 0.1587 | 46.8128 | 6600 | 0.1586 | 16292464 |
| 0.1579 | 48.2282 | 6800 | 0.1613 | 16781768 |
| 0.1482 | 49.6488 | 7000 | 0.1601 | 17278560 |
| 0.1495 | 51.0642 | 7200 | 0.1656 | 17769384 |
| 0.1584 | 52.4848 | 7400 | 0.1605 | 18262680 |
| 0.1537 | 53.9055 | 7600 | 0.1625 | 18763936 |
| 0.1521 | 55.3209 | 7800 | 0.1682 | 19258096 |
| 0.1542 | 56.7415 | 8000 | 0.1639 | 19753648 |
| 0.1431 | 58.1569 | 8200 | 0.1703 | 20244128 |
| 0.1517 | 59.5775 | 8400 | 0.1667 | 20739208 |
| 0.149 | 60.9982 | 8600 | 0.1700 | 21236872 |
| 0.1474 | 62.4135 | 8800 | 0.1610 | 21726944 |
| 0.1569 | 63.8342 | 9000 | 0.1627 | 22223288 |
| 0.1426 | 65.2496 | 9200 | 0.1676 | 22716672 |
| 0.1567 | 66.6702 | 9400 | 0.1668 | 23209088 |
| 0.1492 | 68.0856 | 9600 | 0.1652 | 23701520 |
| 0.1508 | 69.5062 | 9800 | 0.1692 | 24197944 |
| 0.1517 | 70.9269 | 10000 | 0.1674 | 24694272 |
| 0.1419 | 72.3422 | 10200 | 0.1700 | 25191256 |
| 0.1348 | 73.7629 | 10400 | 0.1719 | 25688288 |
| 0.143 | 75.1783 | 10600 | 0.1670 | 26177720 |
| 0.1419 | 76.5989 | 10800 | 0.1713 | 26675248 |
| 0.1403 | 78.0143 | 11000 | 0.1696 | 27168496 |
| 0.134 | 79.4349 | 11200 | 0.1882 | 27664360 |
| 0.1344 | 80.8556 | 11400 | 0.1876 | 28161984 |
| 0.1194 | 82.2709 | 11600 | 0.1797 | 28655448 |
| 0.1134 | 83.6916 | 11800 | 0.2085 | 29151808 |
| 0.1113 | 85.1070 | 12000 | 0.2133 | 29642952 |
| 0.0911 | 86.5276 | 12200 | 0.2172 | 30140536 |
| 0.1133 | 87.9483 | 12400 | 0.2262 | 30639808 |
| 0.0792 | 89.3636 | 12600 | 0.2621 | 31135048 |
| 0.0868 | 90.7843 | 12800 | 0.2296 | 31630256 |
| 0.0485 | 92.1996 | 13000 | 0.2493 | 32121256 |
| 0.0232 | 93.6203 | 13200 | 0.3262 | 32618184 |
| 0.027 | 95.0357 | 13400 | 0.3350 | 33115432 |
| 0.0219 | 96.4563 | 13600 | 0.3802 | 33609472 |
| 0.0144 | 97.8770 | 13800 | 0.3693 | 34098712 |
| 0.004 | 99.2923 | 14000 | 0.4255 | 34590368 |
| 0.0176 | 100.7130 | 14200 | 0.3845 | 35081248 |
| 0.0032 | 102.1283 | 14400 | 0.4501 | 35571464 |
| 0.0049 | 103.5490 | 14600 | 0.4454 | 36063824 |
| 0.0046 | 104.9697 | 14800 | 0.4598 | 36557944 |
| 0.0024 | 106.3850 | 15000 | 0.4496 | 37048560 |
| 0.0202 | 107.8057 | 15200 | 0.4188 | 37543928 |
| 0.0053 | 109.2210 | 15400 | 0.3996 | 38035968 |
| 0.0035 | 110.6417 | 15600 | 0.4336 | 38526000 |
| 0.0005 | 112.0570 | 15800 | 0.5136 | 39021440 |
| 0.0004 | 113.4777 | 16000 | 0.4994 | 39519712 |
| 0.0002 | 114.8984 | 16200 | 0.5459 | 40014440 |
| 0.0001 | 116.3137 | 16400 | 0.5487 | 40509368 |
| 0.0001 | 117.7344 | 16600 | 0.5664 | 41001000 |
| 0.0001 | 119.1497 | 16800 | 0.5936 | 41492672 |
| 0.0001 | 120.5704 | 17000 | 0.5985 | 41991984 |
| 0.0001 | 121.9911 | 17200 | 0.6173 | 42486736 |
| 0.0 | 123.4064 | 17400 | 0.6299 | 42979888 |
| 0.0001 | 124.8271 | 17600 | 0.6455 | 43473920 |
| 0.0 | 126.2424 | 17800 | 0.6472 | 43963728 |
| 0.0 | 127.6631 | 18000 | 0.6632 | 44457208 |
| 0.0 | 129.0784 | 18200 | 0.6749 | 44952664 |
| 0.0 | 130.4991 | 18400 | 0.6809 | 45446704 |
| 0.0 | 131.9198 | 18600 | 0.6892 | 45936552 |
| 0.0 | 133.3351 | 18800 | 0.6927 | 46426240 |
| 0.0 | 134.7558 | 19000 | 0.6884 | 46921256 |
| 0.0 | 136.1711 | 19200 | 0.7196 | 47412080 |
| 0.0 | 137.5918 | 19400 | 0.7175 | 47911024 |
| 0.0 | 139.0071 | 19600 | 0.7461 | 48404752 |
| 0.0 | 140.4278 | 19800 | 0.7344 | 48901416 |
| 0.0 | 141.8485 | 20000 | 0.7354 | 49400736 |
| 0.0 | 143.2638 | 20200 | 0.7601 | 49895752 |
| 0.0 | 144.6845 | 20400 | 0.7383 | 50380736 |
| 0.0045 | 146.0998 | 20600 | 0.7421 | 50871288 |
| 0.0 | 147.5205 | 20800 | 0.7749 | 51360328 |
| 0.0819 | 148.9412 | 21000 | 0.2368 | 51853696 |
| 0.0102 | 150.3565 | 21200 | 0.4455 | 52348712 |
| 0.0017 | 151.7772 | 21400 | 0.4607 | 52842992 |
| 0.0001 | 153.1925 | 21600 | 0.5198 | 53335368 |
| 0.0002 | 154.6132 | 21800 | 0.4956 | 53831240 |
| 0.0002 | 156.0285 | 22000 | 0.5451 | 54320840 |
| 0.0001 | 157.4492 | 22200 | 0.5620 | 54818304 |
| 0.0 | 158.8699 | 22400 | 0.5781 | 55310560 |
| 0.0001 | 160.2852 | 22600 | 0.5959 | 55805192 |
| 0.0001 | 161.7059 | 22800 | 0.6064 | 56294240 |
| 0.0 | 163.1212 | 23000 | 0.6171 | 56785216 |
| 0.0001 | 164.5419 | 23200 | 0.6276 | 57277112 |
| 0.0 | 165.9626 | 23400 | 0.6349 | 57768960 |
| 0.0 | 167.3779 | 23600 | 0.6519 | 58259216 |
| 0.0001 | 168.7986 | 23800 | 0.6514 | 58754552 |
| 0.0 | 170.2139 | 24000 | 0.6648 | 59250304 |
| 0.0 | 171.6346 | 24200 | 0.6740 | 59743752 |
| 0.0 | 173.0499 | 24400 | 0.6879 | 60240920 |
| 0.0 | 174.4706 | 24600 | 0.6971 | 60738488 |
| 0.0 | 175.8913 | 24800 | 0.7140 | 61232632 |
| 0.0 | 177.3066 | 25000 | 0.7076 | 61726896 |
| 0.0 | 178.7273 | 25200 | 0.7095 | 62220440 |
| 0.0 | 180.1426 | 25400 | 0.7190 | 62713544 |
| 0.0 | 181.5633 | 25600 | 0.7294 | 63208560 |
| 0.0 | 182.9840 | 25800 | 0.7262 | 63703320 |
| 0.0 | 184.3993 | 26000 | 0.7486 | 64195280 |
| 0.0041 | 185.8200 | 26200 | 0.7565 | 64693448 |
| 0.0 | 187.2353 | 26400 | 0.7590 | 65180864 |
| 0.0 | 188.6560 | 26600 | 0.7419 | 65680024 |
| 0.0 | 190.0713 | 26800 | 0.7771 | 66173368 |
| 0.0015 | 191.4920 | 27000 | 0.7760 | 66664968 |
| 0.0 | 192.9127 | 27200 | 0.7663 | 67157528 |
| 0.0 | 194.3280 | 27400 | 0.7157 | 67657848 |
| 0.0 | 195.7487 | 27600 | 0.7376 | 68154280 |
| 0.0 | 197.1640 | 27800 | 0.7593 | 68648760 |
| 0.0 | 198.5847 | 28000 | 0.8076 | 69145424 |
| 0.0 | 200.0 | 28200 | 0.7934 | 69634592 |
| 0.0 | 201.4207 | 28400 | 0.7994 | 70126824 |
| 0.0 | 202.8414 | 28600 | 0.8015 | 70621048 |
| 0.0025 | 204.2567 | 28800 | 0.8037 | 71112744 |
| 0.0 | 205.6774 | 29000 | 0.8194 | 71609328 |
| 0.0 | 207.0927 | 29200 | 0.8214 | 72096488 |
| 0.0 | 208.5134 | 29400 | 0.8394 | 72590600 |
| 0.0 | 209.9340 | 29600 | 0.8556 | 73085400 |
| 0.0 | 211.3494 | 29800 | 0.8367 | 73578704 |
| 0.0018 | 212.7701 | 30000 | 0.8551 | 74071832 |
| 0.0 | 214.1854 | 30200 | 0.8317 | 74558088 |
| 0.0 | 215.6061 | 30400 | 0.8396 | 75054720 |
| 0.0043 | 217.0214 | 30600 | 0.5733 | 75550968 |
| 0.0001 | 218.4421 | 30800 | 0.6028 | 76052048 |
| 0.0001 | 219.8627 | 31000 | 0.6069 | 76544760 |
| 0.0017 | 221.2781 | 31200 | 0.6315 | 77039312 |
| 0.0 | 222.6988 | 31400 | 0.6448 | 77536608 |
| 0.0 | 224.1141 | 31600 | 0.6545 | 78029096 |
| 0.0 | 225.5348 | 31800 | 0.6657 | 78521640 |
| 0.0 | 226.9554 | 32000 | 0.6726 | 79014704 |
| 0.0 | 228.3708 | 32200 | 0.6799 | 79509056 |
| 0.0018 | 229.7914 | 32400 | 0.6913 | 80004760 |
| 0.0 | 231.2068 | 32600 | 0.6914 | 80498576 |
| 0.0 | 232.6275 | 32800 | 0.6986 | 80992160 |
| 0.0014 | 234.0428 | 33000 | 0.7049 | 81484216 |
| 0.0 | 235.4635 | 33200 | 0.7092 | 81981536 |
| 0.0015 | 236.8841 | 33400 | 0.7172 | 82469112 |
| 0.0 | 238.2995 | 33600 | 0.7233 | 82967264 |
| 0.0 | 239.7201 | 33800 | 0.7240 | 83460632 |
| 0.0 | 241.1355 | 34000 | 0.7369 | 83946936 |
| 0.0 | 242.5561 | 34200 | 0.7378 | 84438976 |
| 0.0 | 243.9768 | 34400 | 0.7432 | 84936992 |
| 0.0 | 245.3922 | 34600 | 0.7521 | 85424648 |
| 0.0008 | 246.8128 | 34800 | 0.7604 | 85921552 |
| 0.0 | 248.2282 | 35000 | 0.7649 | 86414392 |
| 0.0 | 249.6488 | 35200 | 0.7753 | 86904424 |
| 0.0 | 251.0642 | 35400 | 0.7900 | 87399560 |
| 0.0 | 252.4848 | 35600 | 0.7888 | 87900568 |
| 0.0001 | 253.9055 | 35800 | 0.7993 | 88391952 |
| 0.0 | 255.3209 | 36000 | 0.7991 | 88887288 |
| 0.0 | 256.7415 | 36200 | 0.8028 | 89375944 |
| 0.0 | 258.1569 | 36400 | 0.8089 | 89868176 |
| 0.0 | 259.5775 | 36600 | 0.8126 | 90365056 |
| 0.0 | 260.9982 | 36800 | 0.8162 | 90855096 |
| 0.0 | 262.4135 | 37000 | 0.8197 | 91348504 |
| 0.0 | 263.8342 | 37200 | 0.8228 | 91843280 |
| 0.0 | 265.2496 | 37400 | 0.8229 | 92339160 |
| 0.0 | 266.6702 | 37600 | 0.8240 | 92834936 |
| 0.0 | 268.0856 | 37800 | 0.8272 | 93329096 |
| 0.0 | 269.5062 | 38000 | 0.8250 | 93825960 |
| 0.0 | 270.9269 | 38200 | 0.8273 | 94316976 |
| 0.0 | 272.3422 | 38400 | 0.8304 | 94808456 |
| 0.0 | 273.7629 | 38600 | 0.8315 | 95304384 |
| 0.0 | 275.1783 | 38800 | 0.8326 | 95796256 |
| 0.0 | 276.5989 | 39000 | 0.8341 | 96293992 |
| 0.0 | 278.0143 | 39200 | 0.8318 | 96783960 |
| 0.0 | 279.4349 | 39400 | 0.8350 | 97275176 |
| 0.0 | 280.8556 | 39600 | 0.8345 | 97769584 |
| 0.0 | 282.2709 | 39800 | 0.8277 | 98266712 |
| 0.0 | 283.6916 | 40000 | 0.8306 | 98761256 |

Validation loss bottoms out at 0.1560 around step 3600 and climbs steadily afterwards while training loss collapses toward zero, a classic overfitting pattern; this suggests the headline evaluation loss above reflects the best checkpoint rather than the final step.
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
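To match the training environment, a small sketch comparing installed versions against the pins above (assuming all five packages are importable; compatible newer versions will likely also work):

```python
# Compare the installed environment against the training-time version pins.
import datasets, peft, tokenizers, torch, transformers

for mod, expected in [
    (peft, "0.15.1"),
    (transformers, "4.51.3"),
    (torch, "2.6.0+cu124"),
    (datasets, "3.5.0"),
    (tokenizers, "0.21.1"),
]:
    print(f"{mod.__name__}: installed {mod.__version__}, trained with {expected}")
```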