---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: dinov2-large-finetuned-galaxy10-decals
  results: []
---
# dinov2-large-finetuned-galaxy10-decals

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the Galaxy10 DECaLS dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5869
- Accuracy: 0.8737
- Precision: 0.8722
- Recall: 0.8737
- F1: 0.8722
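
The snippet below is a minimal inference sketch for this checkpoint. The Hub repo id is inferred from the model name above and the image path is a placeholder, so adjust both to your setup.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id assumed from the model name; replace if the checkpoint lives elsewhere.
model_id = "matthieulel/dinov2-large-finetuned-galaxy10-decals"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("galaxy_cutout.jpg").convert("RGB")  # placeholder image path

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # predicted morphology class name
```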
## Model description

[facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) is a self-supervised DINOv2 vision transformer; this checkpoint adds a classification head and fine-tunes the model to classify galaxy images into the ten Galaxy10 DECaLS morphology classes.
## Intended uses & limitations

The model is intended for galaxy morphology classification on DECaLS-like imagery. It has been evaluated only on the Galaxy10 DECaLS evaluation split reported below, so accuracy on images from other surveys, depths, or resolutions may differ.
## Training and evaluation data

The model was fine-tuned and evaluated on [Galaxy10 DECaLS](https://astronn.readthedocs.io/en/latest/galaxy10.html), a set of roughly 17.7k color galaxy images (256x256) from the DESI Legacy Imaging Surveys, labelled with ten morphology classes via Galaxy Zoo.
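
Galaxy10 DECaLS is officially distributed through the astroNN project; mirrors also exist on the Hugging Face Hub. A minimal loading sketch, assuming a Hub mirror at `matthieulel/galaxy10_decals` (this repo id is an assumption, not confirmed by this card):

```python
from datasets import load_dataset

# The repo id below is an assumption; the dataset is officially
# distributed via astroNN as an HDF5 file.
ds = load_dataset("matthieulel/galaxy10_decals")
print(ds)  # inspect the available splits and features
```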
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
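
The original training script is not included in this card; the sketch below maps the listed hyperparameters onto `transformers.TrainingArguments` (matching the Transformers 4.37.2 API listed under framework versions). `output_dir` and the evaluation strategy are assumptions, not taken from the run.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dinov2-large-finetuned-galaxy10-decals",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 * 4 = total train batch size of 256 on one device
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumed: the results table reports one eval per epoch
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so no flags are needed.
)
```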
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.7564 | 0.99 | 62 | 0.6187 | 0.7976 | 0.8171 | 0.7976 | 0.7990 |
| 0.7766 | 2.0 | 125 | 0.6102 | 0.7852 | 0.8052 | 0.7852 | 0.7782 |
| 0.7103 | 2.99 | 187 | 0.5744 | 0.8089 | 0.8140 | 0.8089 | 0.8032 |
| 0.6704 | 4.0 | 250 | 0.6859 | 0.7745 | 0.7899 | 0.7745 | 0.7663 |
| 0.599 | 4.99 | 312 | 0.4729 | 0.8377 | 0.8412 | 0.8377 | 0.8359 |
| 0.565 | 6.0 | 375 | 0.4465 | 0.8517 | 0.8542 | 0.8517 | 0.8507 |
| 0.5576 | 6.99 | 437 | 0.4479 | 0.8484 | 0.8565 | 0.8484 | 0.8452 |
| 0.4966 | 8.0 | 500 | 0.4870 | 0.8388 | 0.8399 | 0.8388 | 0.8363 |
| 0.4667 | 8.99 | 562 | 0.4763 | 0.8444 | 0.8496 | 0.8444 | 0.8443 |
| 0.4264 | 10.0 | 625 | 0.4802 | 0.8377 | 0.8378 | 0.8377 | 0.8324 |
| 0.445 | 10.99 | 687 | 0.5246 | 0.8377 | 0.8383 | 0.8377 | 0.8343 |
| 0.3935 | 12.0 | 750 | 0.4883 | 0.8439 | 0.8519 | 0.8439 | 0.8434 |
| 0.374 | 12.99 | 812 | 0.4511 | 0.8568 | 0.8603 | 0.8568 | 0.8569 |
| 0.3551 | 14.0 | 875 | 0.5153 | 0.8546 | 0.8517 | 0.8546 | 0.8496 |
| 0.3573 | 14.99 | 937 | 0.4705 | 0.8579 | 0.8554 | 0.8579 | 0.8559 |
| 0.3385 | 16.0 | 1000 | 0.4547 | 0.8517 | 0.8535 | 0.8517 | 0.8517 |
| 0.2764 | 16.99 | 1062 | 0.5189 | 0.8529 | 0.8544 | 0.8529 | 0.8513 |
| 0.2895 | 18.0 | 1125 | 0.5393 | 0.8602 | 0.8587 | 0.8602 | 0.8586 |
| 0.2738 | 18.99 | 1187 | 0.5554 | 0.8405 | 0.8436 | 0.8405 | 0.8381 |
| 0.2563 | 20.0 | 1250 | 0.5478 | 0.8608 | 0.8573 | 0.8608 | 0.8574 |
| 0.2375 | 20.99 | 1312 | 0.5512 | 0.8664 | 0.8651 | 0.8664 | 0.8622 |
| 0.2599 | 22.0 | 1375 | 0.5317 | 0.8625 | 0.8607 | 0.8625 | 0.8599 |
| 0.2146 | 22.99 | 1437 | 0.5972 | 0.8568 | 0.8567 | 0.8568 | 0.8559 |
| 0.2132 | 24.0 | 1500 | 0.5934 | 0.8636 | 0.8617 | 0.8636 | 0.8606 |
| 0.2036 | 24.99 | 1562 | 0.5923 | 0.8664 | 0.8662 | 0.8664 | 0.8658 |
| 0.1971 | 26.0 | 1625 | 0.5839 | 0.8630 | 0.8621 | 0.8630 | 0.8621 |
| 0.1878 | 26.99 | 1687 | 0.5907 | 0.8625 | 0.8669 | 0.8625 | 0.8640 |
| 0.1922 | 28.0 | 1750 | 0.6058 | 0.8692 | 0.8684 | 0.8692 | 0.8680 |
| 0.1854 | 28.99 | 1812 | 0.6014 | 0.8670 | 0.8653 | 0.8670 | 0.8655 |
| 0.1688 | 29.76 | 1860 | 0.5869 | 0.8737 | 0.8722 | 0.8737 | 0.8722 |
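
The card does not include the metric computation, but recall equaling accuracy in every row is consistent with weighted averaging. A plausible `compute_metrics` sketch for the Trainer (the averaging scheme is an assumption):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is assumed; it implies recall == accuracy,
    # which matches every row of the table above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```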
### Framework versions

- Transformers 4.37.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1