---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: microsoft/beit-base-patch16-384
model-index:
- name: BeitEAU-base-patch16-384-2025_11_07_78282-bs32_freeze
  results: []
---
|
# BeitEAU-base-patch16-384-2025_11_07_78282-bs32_freeze

BeitEAU is a fine-tuned version of [microsoft/beit-base-patch16-384](https://huggingface.co/microsoft/beit-base-patch16-384). It achieves the following results on the test set:
- Loss: 0.1647
- F1 Micro: 0.7446
- F1 Macro: 0.6015
- Accuracy: 0.2171
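
These headline scores can be recomputed from saved predictions with scikit-learn. The sketch below is illustrative only: the 0.5 decision threshold and the use of exact-match (subset) accuracy are assumptions about the evaluation code, not settings confirmed by this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy multilabel arrays standing in for the real test-set targets and
# thresholded model outputs: shape (n_samples, n_classes) with 31 classes.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=(8, 31))
y_prob = rng.random(size=(8, 31))        # per-label sigmoid outputs
y_pred = (y_prob >= 0.5).astype(int)     # assumed decision threshold

print("F1 micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 macro:", f1_score(y_true, y_pred, average="macro"))
# On multilabel input, accuracy_score is subset (exact-match) accuracy:
# a sample counts as correct only if all 31 labels match.
print("Accuracy:", accuracy_score(y_true, y_pred))
```

If the reported accuracy is indeed exact-match over 31 labels, that would explain why it (0.2171) sits far below both F1 scores. Per-class F1 scores on the test set are listed below.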

| Class | F1 per class |
|----------|-------|
| Acropore_branched | 0.7982 |
| Acropore_digitised | 0.4713 |
| Acropore_sub_massive | 0.2880 |
| Acropore_tabular | 0.8900 |
| Algae_assembly | 0.7410 |
| Algae_drawn_up | 0.3889 |
| Algae_limestone | 0.6890 |
| Algae_sodding | 0.8126 |
| Atra/Leucospilota | 0.6085 |
| Bleached_coral | 0.6994 |
| Blurred | 0.3471 |
| Dead_coral | 0.6977 |
| Fish | 0.6206 |
| Homo_sapiens | 0.5546 |
| Human_object | 0.7174 |
| Living_coral | 0.6376 |
| Millepore | 0.6636 |
| No_acropore_encrusting | 0.5906 |
| No_acropore_foliaceous | 0.7253 |
| No_acropore_massive | 0.5968 |
| No_acropore_solitary | 0.4364 |
| No_acropore_sub_massive | 0.6084 |
| Rock | 0.8513 |
| Rubble | 0.7116 |
| Sand | 0.8955 |
| Sea_cucumber | 0.6009 |
| Sea_urchins | 0.5445 |
| Sponge | 0.3689 |
| Syringodium_isoetifolium | 0.9401 |
| Thalassodendron_ciliatum | 0.9547 |
| Useless | 0.9686 |

---

# Model description

BeitEAU is a model built on top of the [microsoft/beit-base-patch16-384](https://huggingface.co/microsoft/beit-base-patch16-384) backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
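
The card does not publish the exact layer sizes, so the PyTorch sketch below shows one plausible head of this shape: the 768-dim input matches the BEiT-base hidden size and the 31 outputs match the class tables, while the 512-unit hidden layer and the 0.2 dropout rate are illustrative assumptions.

```python
import torch.nn as nn

# Hypothetical linear + ReLU + batch-norm + dropout head for 31 labels on a
# 768-dim BEiT-base feature; hidden width and dropout rate are assumptions.
classification_head = nn.Sequential(
    nn.Linear(768, 512),   # 768 = BEiT-base hidden size
    nn.ReLU(),
    nn.BatchNorm1d(512),
    nn.Dropout(p=0.2),
    nn.Linear(512, 31),    # one logit per label
)
# For multilabel prediction the logits go through a per-class sigmoid
# (not a softmax), so each label is decided independently.
```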

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
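
A minimal inference sketch with the transformers library, assuming the checkpoint loads through AutoModelForImageClassification and that per-label sigmoid probabilities are thresholded at 0.5; the repo id and image path are placeholders.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id; replace with the actual Hub id of this checkpoint.
ckpt = "lombardata/BeitEAU-base-patch16-384-2025_11_07_78282-bs32_freeze"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)
model.eval()

image = Image.open("reef_photo.jpg")  # any RGB underwater image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel decision: per-class sigmoid with an assumed 0.5 threshold.
probs = torch.sigmoid(logits)[0]
labels = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(labels)
```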

---

# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | test | val | Total |
|:-------------------------|--------:|-------:|------:|--------:|
| Acropore_branched | 1480 | 469 | 459 | 2408 |
| Acropore_digitised | 571 | 156 | 161 | 888 |
| Acropore_sub_massive | 150 | 52 | 41 | 243 |
| Acropore_tabular | 999 | 292 | 298 | 1589 |
| Algae_assembly | 2554 | 842 | 842 | 4238 |
| Algae_drawn_up | 367 | 130 | 123 | 620 |
| Algae_limestone | 1651 | 562 | 559 | 2772 |
| Algae_sodding | 3142 | 994 | 981 | 5117 |
| Atra/Leucospilota | 1084 | 349 | 359 | 1792 |
| Bleached_coral | 219 | 69 | 72 | 360 |
| Blurred | 191 | 68 | 61 | 320 |
| Dead_coral | 1980 | 648 | 636 | 3264 |
| Fish | 2018 | 661 | 642 | 3321 |
| Homo_sapiens | 161 | 63 | 58 | 282 |
| Human_object | 156 | 55 | 59 | 270 |
| Living_coral | 397 | 151 | 153 | 701 |
| Millepore | 386 | 127 | 124 | 637 |
| No_acropore_encrusting | 442 | 141 | 142 | 725 |
| No_acropore_foliaceous | 204 | 47 | 35 | 286 |
| No_acropore_massive | 1030 | 341 | 334 | 1705 |
| No_acropore_solitary | 202 | 55 | 46 | 303 |
| No_acropore_sub_massive | 1402 | 428 | 426 | 2256 |
| Rock | 4481 | 1495 | 1481 | 7457 |
| Rubble | 3092 | 1015 | 1016 | 5123 |
| Sand | 5839 | 1945 | 1935 | 9719 |
| Sea_cucumber | 1407 | 437 | 450 | 2294 |
| Sea_urchins | 328 | 110 | 107 | 545 |
| Sponge | 267 | 98 | 105 | 470 |
| Syringodium_isoetifolium | 1213 | 392 | 390 | 1995 |
| Thalassodendron_ciliatum | 781 | 262 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:
- **Number of Epochs**: 61
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
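
A minimal sketch of an optimizer and scheduler configured with these values, assuming a standard PyTorch loop that steps the scheduler on the validation loss; the stand-in model and the toy loop are placeholders so the snippet runs on its own.

```python
import torch
import torch.nn as nn

# Stand-in model so the sketch is self-contained; in practice this is the
# frozen BEiT backbone plus the trainable classification head.
model = nn.Sequential(nn.Linear(768, 512), nn.ReLU(), nn.Linear(512, 31))

# With "Freeze Encoder: Yes", backbone parameters would be excluded, e.g.
# by setting p.requires_grad = False on them before building the optimizer.
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# After each validation pass, the scheduler watches the validation loss and
# cuts the LR by the factor 0.1 once it plateaus for 5 epochs, matching the
# trajectory in the results table: 0.001 -> 0.0001 -> 1e-05 -> 1e-06.
for epoch in range(3):               # toy loop
    val_loss = 1.0 / (epoch + 1)     # placeholder metric
    scheduler.step(val_loss)
print(optimizer.param_groups[0]["lr"])
```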

## Data Augmentation

Data were augmented using the following transformations:
Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
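
The transform names (notably ColorJiggle) match Kornia's augmentation API, so the train pipeline can plausibly be reconstructed as below. Only the probabilities come from this card; the 384x384 resize target, the jitter and perspective magnitudes, and the normalization statistics (BEiT's 0.5 mean/std defaults) are assumptions.

```python
import torch
import kornia.augmentation as K

# Plausible Kornia reconstruction of the train transforms listed above;
# magnitudes and normalization statistics are illustrative assumptions.
train_transforms = torch.nn.Sequential(
    K.Resize((384, 384)),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.5, 0.5, 0.5]), std=torch.tensor([0.5, 0.5, 0.5])),
)

# Kornia augmentations apply to batched tensors of shape (B, C, H, W) in [0, 1].
batch = torch.rand(4, 3, 400, 400)
augmented = train_transforms(batch)
print(augmented.shape)  # torch.Size([4, 3, 384, 384])
```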

## Training results

Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
1 | 0.2007 | 0.1609 | 0.6701 | 0.4414 | 0.001
2 | 0.1895 | 0.1825 | 0.7090 | 0.5352 | 0.001
3 | 0.1867 | 0.1763 | 0.7166 | 0.5614 | 0.001
4 | 0.1814 | 0.1798 | 0.7215 | 0.5564 | 0.001
5 | 0.1815 | 0.1923 | 0.7243 | 0.5929 | 0.001
6 | 0.1793 | 0.1962 | 0.7348 | 0.5779 | 0.001
7 | 0.1755 | 0.1997 | 0.7318 | 0.5828 | 0.001
8 | 0.1758 | 0.2059 | 0.7272 | 0.5792 | 0.001
9 | 0.1740 | 0.1937 | 0.7299 | 0.5864 | 0.001
10 | 0.1748 | 0.1874 | 0.7276 | 0.5768 | 0.001
11 | 0.1735 | 0.1979 | 0.7381 | 0.6032 | 0.001
12 | 0.1722 | 0.2112 | 0.7353 | 0.5857 | 0.001
13 | 0.1720 | 0.2161 | 0.7369 | 0.5801 | 0.001
14 | 0.1754 | 0.1923 | 0.7266 | 0.5724 | 0.001
15 | 0.1706 | 0.1986 | 0.7413 | 0.5973 | 0.001
16 | 0.1739 | 0.2059 | 0.7282 | 0.6014 | 0.001
17 | 0.1716 | 0.1997 | 0.7463 | 0.6158 | 0.001
18 | 0.1716 | 0.2042 | 0.7337 | 0.5882 | 0.001
19 | 0.1741 | 0.2094 | 0.7237 | 0.5876 | 0.001
20 | 0.1718 | 0.2094 | 0.7294 | 0.5808 | 0.001
21 | 0.1722 | 0.1986 | 0.7369 | 0.5918 | 0.001
22 | 0.1666 | 0.2115 | 0.7486 | 0.6174 | 0.0001
23 | 0.1657 | 0.2059 | 0.7474 | 0.6164 | 0.0001
24 | 0.1660 | 0.2077 | 0.7473 | 0.6207 | 0.0001
25 | 0.1654 | 0.2119 | 0.7474 | 0.6136 | 0.0001
26 | 0.1654 | 0.2098 | 0.7498 | 0.6169 | 0.0001
27 | 0.1656 | 0.2115 | 0.7497 | 0.6185 | 0.0001
28 | 0.1657 | 0.2101 | 0.7455 | 0.6091 | 0.0001
29 | 0.1653 | 0.2077 | 0.7467 | 0.6130 | 0.0001
30 | 0.1656 | 0.2119 | 0.7484 | 0.6168 | 0.0001
31 | 0.1655 | 0.2115 | 0.7498 | 0.6121 | 0.0001
32 | 0.1654 | 0.2073 | 0.7466 | 0.6116 | 0.0001
33 | 0.1652 | 0.2147 | 0.7470 | 0.6166 | 0.0001
34 | 0.1652 | 0.2188 | 0.7467 | 0.6174 | 0.0001
35 | 0.1650 | 0.2115 | 0.7503 | 0.6175 | 0.0001
36 | 0.1652 | 0.2105 | 0.7479 | 0.6171 | 0.0001
37 | 0.1650 | 0.2154 | 0.7489 | 0.6121 | 0.0001
38 | 0.1653 | 0.2164 | 0.7489 | 0.6105 | 0.0001
39 | 0.1652 | 0.2164 | 0.7506 | 0.6167 | 0.0001
40 | 0.1651 | 0.2140 | 0.7511 | 0.6164 | 0.0001
41 | 0.1650 | 0.2129 | 0.7508 | 0.6149 | 0.0001
42 | 0.1648 | 0.2136 | 0.7512 | 0.6145 | 1e-05
43 | 0.1647 | 0.2122 | 0.7490 | 0.6136 | 1e-05
44 | 0.1646 | 0.2126 | 0.7502 | 0.6148 | 1e-05
45 | 0.1647 | 0.2140 | 0.7512 | 0.6161 | 1e-05
46 | 0.1647 | 0.2126 | 0.7511 | 0.6148 | 1e-05
47 | 0.1647 | 0.2136 | 0.7510 | 0.6149 | 1e-05
48 | 0.1646 | 0.2126 | 0.7500 | 0.6143 | 1e-05
49 | 0.1647 | 0.2126 | 0.7508 | 0.6153 | 1e-05
50 | 0.1646 | 0.2126 | 0.7494 | 0.6139 | 1e-05
51 | 0.1645 | 0.2129 | 0.7498 | 0.6143 | 1e-05
52 | 0.1646 | 0.2133 | 0.7504 | 0.6142 | 1e-05
53 | 0.1646 | 0.2122 | 0.7505 | 0.6146 | 1e-05
54 | 0.1646 | 0.2129 | 0.7504 | 0.6158 | 1e-05
55 | 0.1646 | 0.2119 | 0.7505 | 0.6153 | 1e-05
56 | 0.1646 | 0.2115 | 0.7504 | 0.6145 | 1e-05
57 | 0.1646 | 0.2115 | 0.7501 | 0.6140 | 1e-05
58 | 0.1646 | 0.2115 | 0.7500 | 0.6138 | 1e-06
59 | 0.1646 | 0.2126 | 0.7502 | 0.6139 | 1e-06
60 | 0.1646 | 0.2126 | 0.7502 | 0.6139 | 1e-06
61 | 0.1646 | 0.2126 | 0.7505 | 0.6143 | 1e-06

---

# Framework Versions

- **Transformers**: 4.56.0.dev0
- **Pytorch**: 2.6.0+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.21.0