segformer-b1-GFB

This model is a fine-tuned version of nvidia/mit-b1 on the segments/GFB dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4646
  • Mean IoU: 0.6980
  • Mean Accuracy: 0.7985
  • Overall Accuracy: 0.9199
  • Accuracy Unlabeled: 0.9630
  • Accuracy GBM: 0.8245
  • Accuracy Podo: 0.7482
  • Accuracy Endo: 0.6582
  • IoU Unlabeled: 0.9177
  • IoU GBM: 0.7081
  • IoU Podo: 0.6223
  • IoU Endo: 0.5438
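The per-class scores above follow the standard confusion-matrix definitions: per-class accuracy is TP divided by the ground-truth total for that class, and IoU is TP / (TP + FP + FN). A minimal pure-Python sketch of how such metrics are derived (the counts below are illustrative, not the actual evaluation data):

```python
# Confusion matrix: rows = ground truth, cols = prediction.
# Class order mirrors the card (Unlabeled, GBM, Podo, Endo);
# the counts are made up for illustration only.
cm = [
    [900,  40,  30,  30],
    [ 50, 800,  80,  70],
    [ 40,  90, 750, 120],
    [ 60,  70, 110, 660],
]

n = len(cm)
tp = [cm[i][i] for i in range(n)]
row = [sum(cm[i]) for i in range(n)]                       # ground-truth pixels per class
col = [sum(cm[i][j] for i in range(n)) for j in range(n)]  # predicted pixels per class

per_class_acc = [tp[i] / row[i] for i in range(n)]                     # recall per class
per_class_iou = [tp[i] / (row[i] + col[i] - tp[i]) for i in range(n)]  # TP/(TP+FP+FN)

mean_acc = sum(per_class_acc) / n
mean_iou = sum(per_class_iou) / n
overall_acc = sum(tp) / sum(row)
```

Note that IoU for a class is always at most its accuracy, which is why the mean IoU (0.6980) sits below the mean accuracy (0.7985).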

Model description

A SegFormer semantic-segmentation model with a MiT-B1 encoder, fine-tuned on segments/GFB to predict four classes per pixel (Unlabeled, GBM, Podo, Endo). Further details are not yet documented.

Intended uses & limitations

More information needed
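In the absence of documented usage, here is a minimal inference sketch. It assumes the transformers library and the checkpoint id from this page; the input image path is hypothetical:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "luoyun75579/segformer-b1-GFB"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample to the input size
# and take the per-pixel argmax to get a class-index map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```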

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 100
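The cosine schedule with 10 warmup steps above ramps the learning rate linearly to its peak, then decays it along a half cosine. A pure-Python sketch, assuming decay to zero (the default behavior of transformers' get_cosine_schedule_with_warmup); the 9100 total steps is taken from the results table below:

```python
import math

def lr_at(step, *, base_lr=5e-4, warmup_steps=10, total_steps=9100):
    """Learning rate under linear warmup followed by cosine decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # linear ramp from 0 to base_lr
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))  # half cosine to 0
```

With only 10 warmup steps out of 9100, warmup is effectively instantaneous and nearly the entire run follows the cosine decay.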

Training results

Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy GBM | Accuracy Podo | Accuracy Endo | IoU Unlabeled | IoU GBM | IoU Podo | IoU Endo
0.2783 1.0989 100 0.3191 0.5500 0.6348 0.8806 0.9690 0.6768 0.5333 0.3602 0.8818 0.5838 0.4362 0.2981
0.2116 2.1978 200 0.3145 0.5560 0.6580 0.8799 0.9503 0.8048 0.5330 0.3436 0.8833 0.5831 0.4525 0.3052
0.2333 3.2967 300 0.2593 0.6270 0.7281 0.9011 0.9627 0.7598 0.6617 0.5283 0.9021 0.6547 0.5276 0.4237
0.2253 4.3956 400 0.2582 0.6409 0.7774 0.9008 0.9374 0.8742 0.7192 0.5788 0.9022 0.6586 0.5714 0.4313
0.1657 5.4945 500 0.3006 0.6376 0.7280 0.9030 0.9710 0.6997 0.6728 0.5683 0.9004 0.6281 0.5650 0.4570
0.1814 6.5934 600 0.2519 0.6613 0.7745 0.9075 0.9518 0.8269 0.7246 0.5946 0.9056 0.6695 0.5892 0.4809
0.284 7.6923 700 0.2642 0.6567 0.7587 0.9082 0.9649 0.7735 0.6733 0.6230 0.9070 0.6609 0.5749 0.4842
0.1474 8.7912 800 0.2504 0.6612 0.7811 0.9085 0.9481 0.8704 0.7133 0.5927 0.9097 0.6758 0.5816 0.4776
0.1292 9.8901 900 0.2652 0.6621 0.7727 0.9084 0.9558 0.8163 0.7057 0.6132 0.9073 0.6753 0.5757 0.4899
0.1569 10.9890 1000 0.2582 0.6681 0.8105 0.9072 0.9406 0.8722 0.7224 0.7070 0.9063 0.6834 0.5882 0.4947
0.1352 12.0879 1100 0.2709 0.6750 0.7844 0.9133 0.9568 0.8251 0.7424 0.6131 0.9120 0.6829 0.6125 0.4927
0.0956 13.1868 1200 0.2897 0.6627 0.7500 0.9134 0.9670 0.8266 0.6794 0.5269 0.9127 0.6863 0.5835 0.4684
0.0922 14.2857 1300 0.2885 0.6842 0.7919 0.9160 0.9595 0.8328 0.7270 0.6482 0.9143 0.7005 0.6082 0.5139
0.1208 15.3846 1400 0.2828 0.6777 0.7899 0.9134 0.9587 0.8136 0.7226 0.6646 0.9117 0.6878 0.6032 0.5082
0.0951 16.4835 1500 0.3012 0.6705 0.7758 0.9114 0.9571 0.8140 0.7412 0.5908 0.9100 0.6753 0.6014 0.4953
0.1305 17.5824 1600 0.3082 0.6750 0.7871 0.9109 0.9564 0.7949 0.7425 0.6546 0.9080 0.6808 0.5958 0.5155
0.0844 18.6813 1700 0.2803 0.6841 0.7985 0.9144 0.9533 0.8252 0.7789 0.6365 0.9121 0.6971 0.6141 0.5130
0.0838 19.7802 1800 0.2975 0.6812 0.7853 0.9148 0.9585 0.8173 0.7563 0.6091 0.9125 0.6905 0.6150 0.5067
0.0883 20.8791 1900 0.3026 0.6821 0.7983 0.9135 0.9570 0.8001 0.7487 0.6874 0.9123 0.6876 0.5991 0.5295
0.0831 21.9780 2000 0.3038 0.6779 0.8143 0.9081 0.9384 0.8633 0.7737 0.6819 0.9044 0.6866 0.6012 0.5192
0.0771 23.0769 2100 0.2994 0.6818 0.7862 0.9152 0.9596 0.8046 0.7661 0.6146 0.9134 0.6917 0.6132 0.5090
0.0801 24.1758 2200 0.3132 0.6868 0.7923 0.9171 0.9579 0.8515 0.7399 0.6200 0.9156 0.7053 0.6123 0.5142
0.1058 25.2747 2300 0.3076 0.6868 0.8040 0.9143 0.9529 0.8166 0.7835 0.6631 0.9123 0.6933 0.6135 0.5280
0.0904 26.3736 2400 0.3152 0.6804 0.8201 0.9089 0.9380 0.8586 0.7875 0.6961 0.9056 0.6929 0.6000 0.5232
0.1197 27.4725 2500 0.3155 0.6866 0.7851 0.9171 0.9639 0.8077 0.7406 0.6283 0.9155 0.6966 0.6113 0.5230
0.0624 28.5714 2600 0.3164 0.6911 0.7993 0.9177 0.9628 0.8108 0.7286 0.6951 0.9159 0.6997 0.6148 0.5339
0.0782 29.6703 2700 0.3248 0.6941 0.7915 0.9191 0.9657 0.8091 0.7348 0.6564 0.9171 0.7056 0.6124 0.5415
0.0659 30.7692 2800 0.3203 0.6848 0.7859 0.9163 0.9635 0.8224 0.7055 0.6523 0.9141 0.6986 0.6017 0.5248
0.081 31.8681 2900 0.3249 0.6819 0.8141 0.9105 0.9402 0.8811 0.7664 0.6688 0.9081 0.6858 0.6093 0.5243
0.0851 32.9670 3000 0.3988 0.6659 0.7493 0.9119 0.9736 0.7579 0.6657 0.6002 0.9089 0.6686 0.5810 0.5053
0.0779 34.0659 3100 0.3361 0.6907 0.7972 0.9162 0.9584 0.8045 0.7768 0.6490 0.9135 0.6946 0.6192 0.5355
0.0626 35.1648 3200 0.3443 0.6860 0.7813 0.9173 0.9652 0.8134 0.7256 0.6210 0.9151 0.6949 0.6139 0.5203
0.0709 36.2637 3300 0.3250 0.6963 0.8080 0.9184 0.9560 0.8429 0.7673 0.6658 0.9166 0.7064 0.6243 0.5381
0.0661 37.3626 3400 0.3290 0.6919 0.7925 0.9187 0.9626 0.8309 0.7357 0.6407 0.9170 0.7068 0.6143 0.5295
0.0825 38.4615 3500 0.3347 0.6951 0.8019 0.9184 0.9568 0.8501 0.7592 0.6416 0.9163 0.7086 0.6203 0.5351
0.0668 39.5604 3600 0.3523 0.6860 0.8081 0.9133 0.9496 0.8334 0.7746 0.6750 0.9105 0.6947 0.6115 0.5272
0.0604 40.6593 3700 0.4000 0.6831 0.7693 0.9167 0.9704 0.7860 0.7129 0.6079 0.9137 0.6886 0.6102 0.5201
0.0913 41.7582 3800 0.3509 0.6967 0.8120 0.9179 0.9564 0.8277 0.7661 0.6977 0.9157 0.7064 0.6206 0.5442
0.045 42.8571 3900 0.3374 0.6989 0.8258 0.9170 0.9476 0.8647 0.7833 0.7077 0.9143 0.7103 0.6251 0.5460
0.0543 43.9560 4000 0.3796 0.6868 0.7833 0.9165 0.9672 0.7774 0.7326 0.6558 0.9137 0.6891 0.6110 0.5335
0.0621 45.0549 4100 0.3530 0.7017 0.8086 0.9206 0.9600 0.8345 0.7633 0.6768 0.9185 0.7138 0.6284 0.5460
0.0583 46.1538 4200 0.3672 0.6901 0.8045 0.9157 0.9561 0.8251 0.7518 0.6849 0.9136 0.6983 0.6111 0.5373
0.0464 47.2527 4300 0.3885 0.6952 0.7966 0.9197 0.9629 0.8376 0.7309 0.6550 0.9181 0.7113 0.6149 0.5364
0.056 48.3516 4400 0.3900 0.6950 0.7988 0.9186 0.9618 0.8124 0.7571 0.6638 0.9161 0.7044 0.6217 0.5380
0.0673 49.4505 4500 0.3864 0.6972 0.8041 0.9191 0.9590 0.8442 0.7472 0.6658 0.9172 0.7100 0.6195 0.5420
0.0402 50.5495 4600 0.3809 0.6814 0.7824 0.9148 0.9626 0.8002 0.7315 0.6352 0.9124 0.6897 0.6041 0.5192
0.0657 51.6484 4700 0.3789 0.6942 0.7968 0.9184 0.9614 0.8354 0.7313 0.6591 0.9164 0.7042 0.6145 0.5418
0.053 52.7473 4800 0.3832 0.6972 0.8079 0.9185 0.9570 0.8385 0.7625 0.6736 0.9163 0.7077 0.6225 0.5424
0.052 53.8462 4900 0.4184 0.6940 0.7909 0.9191 0.9689 0.7788 0.7386 0.6775 0.9171 0.6958 0.6198 0.5433
0.0572 54.9451 5000 0.3882 0.6993 0.7984 0.9204 0.9635 0.8270 0.7469 0.6563 0.9182 0.7085 0.6256 0.5451
0.0364 56.0440 5100 0.4099 0.6938 0.7945 0.9183 0.9643 0.8079 0.7345 0.6715 0.9157 0.6994 0.6171 0.5429
0.0443 57.1429 5200 0.3932 0.6974 0.7955 0.9200 0.9651 0.8207 0.7340 0.6622 0.9179 0.7090 0.6181 0.5446
0.0392 58.2418 5300 0.4024 0.6981 0.8006 0.9195 0.9617 0.8292 0.7479 0.6638 0.9173 0.7089 0.6205 0.5456
0.0378 59.3407 5400 0.4137 0.6993 0.7984 0.9205 0.9641 0.8240 0.7460 0.6595 0.9181 0.7122 0.6234 0.5433
0.0603 60.4396 5500 0.4196 0.6980 0.7950 0.9204 0.9656 0.8190 0.7379 0.6577 0.9181 0.7107 0.6200 0.5433
0.0511 61.5385 5600 0.4441 0.6891 0.7772 0.9179 0.9703 0.7892 0.7124 0.6368 0.9151 0.6931 0.6085 0.5397
0.059 62.6374 5700 0.4338 0.6955 0.7906 0.9197 0.9665 0.8110 0.7354 0.6493 0.9177 0.7035 0.6185 0.5423
0.0401 63.7363 5800 0.4535 0.6923 0.7847 0.9190 0.9689 0.7952 0.7266 0.6481 0.9168 0.7007 0.6126 0.5391
0.0334 64.8352 5900 0.4412 0.6958 0.7930 0.9196 0.9657 0.8180 0.7293 0.6589 0.9176 0.7044 0.6178 0.5432
0.0603 65.9341 6000 0.4229 0.6967 0.7923 0.9199 0.9658 0.8163 0.7361 0.6507 0.9176 0.7072 0.6183 0.5437
0.0453 67.0330 6100 0.4471 0.6972 0.7939 0.9201 0.9671 0.8057 0.7344 0.6682 0.9180 0.7054 0.6183 0.5473
0.0469 68.1319 6200 0.4214 0.6989 0.8029 0.9201 0.9622 0.8315 0.7421 0.6759 0.9182 0.7093 0.6230 0.5453
0.0416 69.2308 6300 0.4360 0.6972 0.7947 0.9203 0.9640 0.8321 0.7399 0.6428 0.9183 0.7107 0.6214 0.5383
0.0658 70.3297 6400 0.4419 0.6966 0.7984 0.9192 0.9626 0.8207 0.7464 0.6641 0.9169 0.7055 0.6205 0.5436
0.0572 71.4286 6500 0.4392 0.6981 0.7993 0.9197 0.9627 0.8271 0.7430 0.6644 0.9175 0.7084 0.6206 0.5457
0.0434 72.5275 6600 0.4521 0.6969 0.7982 0.9193 0.9639 0.8106 0.7464 0.6719 0.9171 0.7034 0.6209 0.5460
0.0396 73.6264 6700 0.4341 0.6983 0.7983 0.9199 0.9628 0.8280 0.7454 0.6569 0.9176 0.7072 0.6231 0.5453
0.0483 74.7253 6800 0.4324 0.6958 0.8038 0.9181 0.9571 0.8443 0.7553 0.6586 0.9157 0.7058 0.6211 0.5408
0.0411 75.8242 6900 0.4290 0.6980 0.8069 0.9186 0.9571 0.8391 0.7655 0.6660 0.9161 0.7079 0.6243 0.5435
0.0522 76.9231 7000 0.4583 0.6962 0.7916 0.9200 0.9652 0.8235 0.7378 0.6400 0.9177 0.7074 0.6222 0.5376
0.0452 78.0220 7100 0.4529 0.6979 0.7992 0.9198 0.9624 0.8266 0.7499 0.6578 0.9176 0.7081 0.6239 0.5420
0.0412 79.1209 7200 0.4497 0.6984 0.7964 0.9203 0.9650 0.8146 0.7471 0.6590 0.9181 0.7074 0.6235 0.5446
0.0421 80.2198 7300 0.4573 0.6975 0.7946 0.9203 0.9651 0.8209 0.7407 0.6518 0.9181 0.7086 0.6215 0.5418
0.0504 81.3187 7400 0.4639 0.6973 0.7952 0.9201 0.9641 0.8224 0.7478 0.6466 0.9181 0.7083 0.6232 0.5398
0.0318 82.4176 7500 0.4477 0.6996 0.8017 0.9202 0.9625 0.8255 0.7511 0.6675 0.9182 0.7080 0.6245 0.5478
0.0497 83.5165 7600 0.4687 0.6962 0.7914 0.9197 0.9664 0.8093 0.7372 0.6529 0.9175 0.7024 0.6205 0.5442
0.0779 84.6154 7700 0.4793 0.6955 0.7895 0.9199 0.9667 0.8153 0.7313 0.6445 0.9177 0.7051 0.6187 0.5405
0.035 85.7143 7800 0.4690 0.6982 0.7966 0.9202 0.9648 0.8173 0.7441 0.6600 0.9181 0.7064 0.6231 0.5451
0.0503 86.8132 7900 0.4654 0.6975 0.7981 0.9198 0.9628 0.8272 0.7459 0.6566 0.9177 0.7074 0.6222 0.5427
0.0491 87.9121 8000 0.4481 0.6986 0.8019 0.9195 0.9608 0.8266 0.7607 0.6596 0.9172 0.7076 0.6241 0.5454
0.0347 89.0110 8100 0.4530 0.6988 0.8005 0.9199 0.9620 0.8283 0.7528 0.6588 0.9177 0.7087 0.6238 0.5449
0.0445 90.1099 8200 0.4635 0.6976 0.7962 0.9201 0.9636 0.8271 0.7450 0.6490 0.9180 0.7091 0.6227 0.5407
0.0351 91.2088 8300 0.4566 0.6983 0.8010 0.9198 0.9617 0.8299 0.7498 0.6626 0.9176 0.7089 0.6225 0.5444
0.0379 92.3077 8400 0.4639 0.6977 0.7976 0.9199 0.9634 0.8245 0.7456 0.6570 0.9178 0.7079 0.6222 0.5431
0.0432 93.4066 8500 0.4643 0.6974 0.7961 0.9200 0.9637 0.8256 0.7440 0.6514 0.9178 0.7082 0.6218 0.5417
0.0396 94.5055 8600 0.4699 0.6972 0.7951 0.9200 0.9643 0.8214 0.7431 0.6517 0.9178 0.7074 0.6215 0.5420
0.0427 95.6044 8700 0.4709 0.6973 0.7957 0.9199 0.9643 0.8199 0.7439 0.6548 0.9177 0.7075 0.6214 0.5427
0.0469 96.7033 8800 0.4671 0.6976 0.7971 0.9199 0.9635 0.8251 0.7436 0.6564 0.9177 0.7081 0.6213 0.5433
0.0421 97.8022 8900 0.4606 0.6983 0.7991 0.9199 0.9628 0.8259 0.7483 0.6595 0.9178 0.7084 0.6224 0.5444
0.0579 98.9011 9000 0.4639 0.6981 0.7987 0.9199 0.9630 0.8249 0.7472 0.6598 0.9177 0.7082 0.6221 0.5443
0.0322 100.0 9100 0.4646 0.6980 0.7985 0.9199 0.9630 0.8245 0.7482 0.6582 0.9177 0.7081 0.6223 0.5438

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.9.1+cu130
  • Datasets 4.4.1
  • Tokenizers 0.22.1
Model size: 13.7M parameters (Safetensors, F32)

Base model: nvidia/mit-b1