parlange committed
Commit c82c0c4 · verified · 1 parent: 8a196a3

Upload CvT model from experiment c2

Files changed (50). This view is limited to 50 files because the commit contains too many changes.
  1. .gitattributes +2 -0
  2. README.md +166 -0
  3. config.json +76 -0
  4. confusion_matrices/CvT_Confusion_Matrix_a.png +0 -0
  5. confusion_matrices/CvT_Confusion_Matrix_b.png +0 -0
  6. confusion_matrices/CvT_Confusion_Matrix_c.png +0 -0
  7. confusion_matrices/CvT_Confusion_Matrix_d.png +0 -0
  8. confusion_matrices/CvT_Confusion_Matrix_e.png +0 -0
  9. confusion_matrices/CvT_Confusion_Matrix_f.png +0 -0
  10. confusion_matrices/CvT_Confusion_Matrix_g.png +0 -0
  11. confusion_matrices/CvT_Confusion_Matrix_h.png +0 -0
  12. confusion_matrices/CvT_Confusion_Matrix_i.png +0 -0
  13. confusion_matrices/CvT_Confusion_Matrix_j.png +0 -0
  14. confusion_matrices/CvT_Confusion_Matrix_k.png +0 -0
  15. confusion_matrices/CvT_Confusion_Matrix_l.png +0 -0
  16. cvt-gravit-c2.pth +3 -0
  17. evaluation_results.csv +133 -0
  18. model.safetensors +3 -0
  19. pytorch_model.bin +3 -0
  20. roc_confusion_matrix/CvT_roc_confusion_matrix_a.png +0 -0
  21. roc_confusion_matrix/CvT_roc_confusion_matrix_b.png +0 -0
  22. roc_confusion_matrix/CvT_roc_confusion_matrix_c.png +0 -0
  23. roc_confusion_matrix/CvT_roc_confusion_matrix_d.png +0 -0
  24. roc_confusion_matrix/CvT_roc_confusion_matrix_e.png +0 -0
  25. roc_confusion_matrix/CvT_roc_confusion_matrix_f.png +0 -0
  26. roc_confusion_matrix/CvT_roc_confusion_matrix_g.png +0 -0
  27. roc_confusion_matrix/CvT_roc_confusion_matrix_h.png +0 -0
  28. roc_confusion_matrix/CvT_roc_confusion_matrix_i.png +0 -0
  29. roc_confusion_matrix/CvT_roc_confusion_matrix_j.png +0 -0
  30. roc_confusion_matrix/CvT_roc_confusion_matrix_k.png +0 -0
  31. roc_confusion_matrix/CvT_roc_confusion_matrix_l.png +0 -0
  32. roc_curves/CvT_ROC_a.png +0 -0
  33. roc_curves/CvT_ROC_b.png +0 -0
  34. roc_curves/CvT_ROC_c.png +0 -0
  35. roc_curves/CvT_ROC_d.png +0 -0
  36. roc_curves/CvT_ROC_e.png +0 -0
  37. roc_curves/CvT_ROC_f.png +0 -0
  38. roc_curves/CvT_ROC_g.png +0 -0
  39. roc_curves/CvT_ROC_h.png +0 -0
  40. roc_curves/CvT_ROC_i.png +0 -0
  41. roc_curves/CvT_ROC_j.png +0 -0
  42. roc_curves/CvT_ROC_k.png +0 -0
  43. roc_curves/CvT_ROC_l.png +0 -0
  44. training_curves/CvT_accuracy.png +0 -0
  45. training_curves/CvT_auc.png +0 -0
  46. training_curves/CvT_combined_metrics.png +3 -0
  47. training_curves/CvT_f1.png +0 -0
  48. training_curves/CvT_loss.png +0 -0
  49. training_curves/CvT_metrics.csv +47 -0
  50. training_metrics.csv +47 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+training_curves/CvT_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+training_notebook_c2.ipynb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,166 @@
---
license: apache-2.0
tags:
- image-classification
- pytorch
- timm
- cvt
- vision-transformer
- transformer
- gravitational-lensing
- strong-lensing
- astronomy
- astrophysics
datasets:
- parlange/gravit-c21-j24
metrics:
- accuracy
- auc
- f1
paper:
- title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
  url: "https://arxiv.org/abs/2509.00226"
  authors: "Parlange et al."
model-index:
- name: CvT-c2
  results:
  - task:
      type: image-classification
      name: Strong Gravitational Lens Discovery
    dataset:
      type: common-test-sample
      name: Common Test Sample (More et al. 2024)
    metrics:
    - type: accuracy
      value: 0.6983
      name: Average Accuracy
    - type: auc
      value: 0.7470
      name: Average AUC-ROC
    - type: f1
      value: 0.4396
      name: Average F1-Score
---

# 🌌 cvt-gravit-c2

🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery.

🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)

## 🛰️ Model Details

- **🤖 Model Type**: CvT
- **🧪 Experiment**: C2 - C21+J24-half
- **🌌 Dataset**: C21+J24
- **🪐 Fine-tuning Strategy**: half

## 💻 Quick Start

```python
import torch
import timm

# Load the model directly from the Hub
model = timm.create_model(
    'hf-hub:parlange/cvt-gravit-c2',
    pretrained=True
)
model.eval()

# Example inference
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    output = model(dummy_input)

predictions = torch.softmax(output, dim=1)
print(f"Lens probability: {predictions[0][1]:.4f}")
```
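
The dummy tensor above only checks that the forward pass works. For real survey cutouts you would normally apply the preprocessing stored with the model; below is a minimal sketch using timm's data-config helpers (the image path `cutout.png` is only a placeholder, and the exact transform depends on how the Hub config resolves):

```python
from PIL import Image
import timm
import torch

model = timm.create_model('hf-hub:parlange/cvt-gravit-c2', pretrained=True)
model.eval()

# Build the evaluation transform (resize, center crop, normalization)
# from the model's pretrained configuration.
data_cfg = timm.data.resolve_data_config({}, model=model)
transform = timm.data.create_transform(**data_cfg, is_training=False)

img = Image.open('cutout.png').convert('RGB')  # placeholder path, not part of this repo
x = transform(img).unsqueeze(0)                # (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(x), dim=1)
print(f"Lens probability: {probs[0, 1]:.4f}")
```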

## ⚡️ Training Configuration

**Training Dataset:** C21+J24 (Cañameras et al. 2021 + Jaelani et al. 2024)
**Fine-tuning Strategy:** half

| 🔧 Parameter | 📝 Value |
|--------------|----------|
| Batch Size | 192 |
| Learning Rate Schedule | AdamW with ReduceLROnPlateau |
| Epochs | 100 |
| Patience | 10 |
| Optimizer | AdamW |
| Scheduler | ReduceLROnPlateau |
| Image Size | 224x224 |
| Fine-Tune Mode | half |
| Stochastic Depth Probability | 0.1 |
+
99
+
100
+ ## 📈 Training Curves
101
+
102
+ ![Combined Training Metrics](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/training_curves/CvT_combined_metrics.png)
103
+
104
+
105
+ ## 🏁 Final Epoch Training Metrics
106
+
107
+ | Metric | Training | Validation |
108
+ |:---------:|:-----------:|:-------------:|
109
+ | 📉 Loss | 0.3623 | 0.3242 |
110
+ | 🎯 Accuracy | 0.8036 | 0.8513 |
111
+ | 📊 AUC-ROC | 0.9105 | 0.9527 |
112
+ | ⚖️ F1 Score | 0.8057 | 0.8623 |
113
+
114
+
115
+ ## ☑️ Evaluation Results
116
+
117
+ ### ROC Curves and Confusion Matrices
118
+
119
+ Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):
120
+
121
+ ![ROC + Confusion Matrix - Dataset A](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_a.png)
122
+ ![ROC + Confusion Matrix - Dataset B](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_b.png)
123
+ ![ROC + Confusion Matrix - Dataset C](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_c.png)
124
+ ![ROC + Confusion Matrix - Dataset D](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_d.png)
125
+ ![ROC + Confusion Matrix - Dataset E](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_e.png)
126
+ ![ROC + Confusion Matrix - Dataset F](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_f.png)
127
+ ![ROC + Confusion Matrix - Dataset G](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_g.png)
128
+ ![ROC + Confusion Matrix - Dataset H](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_h.png)
129
+ ![ROC + Confusion Matrix - Dataset I](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_i.png)
130
+ ![ROC + Confusion Matrix - Dataset J](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_j.png)
131
+ ![ROC + Confusion Matrix - Dataset K](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_k.png)
132
+ ![ROC + Confusion Matrix - Dataset L](https://huggingface.co/parlange/cvt-gravit-c2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_l.png)
133
+
134
+ ### 📋 Performance Summary
135
+
136
+ Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):
137
+
138
+ | Metric | Value |
139
+ |-----------|----------|
140
+ | 🎯 Average Accuracy | 0.6983 |
141
+ | 📈 Average AUC-ROC | 0.7470 |
142
+ | ⚖️ Average F1-Score | 0.4396 |
143
+
144
+
145
+ ## 📘 Citation
146
+
147
+ If you use this model in your research, please cite:
148
+
149
+ ```bibtex
150
+ @misc{parlange2025gravit,
151
+ title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
152
+ author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
153
+ year={2025},
154
+ eprint={2509.00226},
155
+ archivePrefix={arXiv},
156
+ primaryClass={cs.CV},
157
+ url={https://arxiv.org/abs/2509.00226},
158
+ }
159
+ ```
160
+
161
+ ---
162
+
163
+
164
+ ## Model Card Contact
165
+
166
+ For questions about this model, please contact the author through: https://github.com/parlange/
config.json ADDED
@@ -0,0 +1,76 @@
{
  "architecture": "cvt_13_224",
  "num_classes": 2,
  "num_features": 1000,
  "global_pool": "avg",
  "crop_pct": 0.875,
  "interpolation": "bicubic",
  "mean": [
    0.485,
    0.456,
    0.406
  ],
  "std": [
    0.229,
    0.224,
    0.225
  ],
  "first_conv": "conv1",
  "classifier": "fc",
  "input_size": [
    3,
    224,
    224
  ],
  "pool_size": [
    7,
    7
  ],
  "pretrained_cfg": {
    "tag": "gravit_c2",
    "custom_load": false,
    "input_size": [
      3,
      224,
      224
    ],
    "fixed_input_size": true,
    "interpolation": "bicubic",
    "crop_pct": 0.875,
    "crop_mode": "center",
    "mean": [
      0.485,
      0.456,
      0.406
    ],
    "std": [
      0.229,
      0.224,
      0.225
    ],
    "num_classes": 2,
    "pool_size": [
      7,
      7
    ],
    "first_conv": "conv1",
    "classifier": "fc"
  },
  "model_name": "cvt_gravit_c2",
  "experiment": "c2",
  "training_strategy": "half",
  "dataset": "C21+J24",
  "hyperparameters": {
    "batch_size": "192",
    "learning_rate": "AdamW with ReduceLROnPlateau",
    "epochs": "100",
    "patience": "10",
    "optimizer": "AdamW",
    "scheduler": "ReduceLROnPlateau",
    "image_size": "224x224",
    "fine_tune_mode": "half",
    "stochastic_depth_probability": "0.1"
  },
  "hf_hub_id": "parlange/cvt-gravit-c2",
  "license": "apache-2.0"
}
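
The `pretrained_cfg` block above is what drives timm's preprocessing. As a rough sketch of what it implies, the equivalent evaluation transform can be rebuilt by hand from `config.json` (local file path assumed; `timm.data.create_transform` in the Quick Start does this automatically):

```python
import json

from torchvision import transforms

# config.json as uploaded in this commit (local path assumed).
with open("config.json") as f:
    cfg = json.load(f)

pcfg = cfg["pretrained_cfg"]
_, h, w = pcfg["input_size"]

# Resize with the stated crop_pct (0.875 -> resize to 256), center crop to 224,
# then apply the ImageNet-style normalization from the config.
resize = int(round(h / pcfg["crop_pct"]))
preprocess = transforms.Compose([
    transforms.Resize(resize, interpolation=transforms.InterpolationMode.BICUBIC),
    transforms.CenterCrop((h, w)),
    transforms.ToTensor(),
    transforms.Normalize(mean=pcfg["mean"], std=pcfg["std"]),
])
```
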
confusion_matrices/CvT_Confusion_Matrix_a.png ADDED
confusion_matrices/CvT_Confusion_Matrix_b.png ADDED
confusion_matrices/CvT_Confusion_Matrix_c.png ADDED
confusion_matrices/CvT_Confusion_Matrix_d.png ADDED
confusion_matrices/CvT_Confusion_Matrix_e.png ADDED
confusion_matrices/CvT_Confusion_Matrix_f.png ADDED
confusion_matrices/CvT_Confusion_Matrix_g.png ADDED
confusion_matrices/CvT_Confusion_Matrix_h.png ADDED
confusion_matrices/CvT_Confusion_Matrix_i.png ADDED
confusion_matrices/CvT_Confusion_Matrix_j.png ADDED
confusion_matrices/CvT_Confusion_Matrix_k.png ADDED
confusion_matrices/CvT_Confusion_Matrix_l.png ADDED
cvt-gravit-c2.pth ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e003815d79d6f2a95583e2f46e27a357d42c225b8e9b6448e6464767bca64c9d
size 125471131
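
`cvt-gravit-c2.pth` appears to hold the same fine-tuned weights as `pytorch_model.bin` (identical SHA-256 and size). A hedged loading sketch, assuming the file stores a plain `state_dict` or a checkpoint dict wrapping one:

```python
import timm
import torch

# Assumption: the .pth file stores a state_dict (or a checkpoint dict that wraps one).
model = timm.create_model("hf-hub:parlange/cvt-gravit-c2", pretrained=False, num_classes=2)

state = torch.load("cvt-gravit-c2.pth", map_location="cpu")
if isinstance(state, dict) and "state_dict" in state:
    state = state["state_dict"]

missing, unexpected = model.load_state_dict(state, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
model.eval()
```
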
evaluation_results.csv ADDED
@@ -0,0 +1,133 @@
1
+ Model,Dataset,Loss,Accuracy,AUCROC,F1
2
+ ViT,a,0.35447569389258865,0.8949115044247787,0.9020846228498507,0.7480106100795756
3
+ ViT,b,0.2228443425890036,0.9264382269726501,0.9263609576427256,0.5465116279069767
4
+ ViT,c,0.46229862214933587,0.8349575605155611,0.8684438305709024,0.34944237918215615
5
+ ViT,d,0.11673173789463115,0.9537881169443572,0.9704917127071824,0.6573426573426573
6
+ ViT,e,0.3652098159562351,0.8825466520307355,0.920767426019829,0.7249357326478149
7
+ ViT,f,0.24608832064126923,0.9108519842016175,0.9203345361214339,0.22926829268292684
8
+ ViT,g,0.10483654439449311,0.9635,0.997473,0.9644999189495866
9
+ ViT,h,0.23178722894191742,0.915,0.9939590555555555,0.9210526315789473
10
+ ViT,i,0.04857917896906535,0.978,0.9990572222222223,0.9782966129562644
11
+ ViT,j,2.494326035181681,0.6106666666666667,0.5831323333333334,0.42349457058242845
12
+ ViT,k,2.4380686638752618,0.6251666666666666,0.7805802777777779,0.43278688524590164
13
+ ViT,l,1.0272723743838732,0.8127329565949261,0.7993805230717175,0.7184308053873272
14
+ MLP-Mixer,a,1.230455079964832,0.6227876106194691,0.8958911227772556,0.49028400597907323
15
+ MLP-Mixer,b,1.0728926989350893,0.7004086765168186,0.9182900552486188,0.25604996096799376
16
+ MLP-Mixer,c,1.374837134027586,0.5576862621817039,0.8979152854511969,0.18904899135446687
17
+ MLP-Mixer,d,0.09552026474693218,0.9603898145237346,0.9868913443830571,0.7224669603524229
18
+ MLP-Mixer,e,0.9593323631422711,0.7069154774972558,0.9188677817301143,0.5512605042016807
19
+ MLP-Mixer,f,0.9257462782946794,0.7154410381794245,0.9306221006103087,0.09779367918902802
20
+ MLP-Mixer,g,0.5643243643840155,0.8425,0.991425611111111,0.8635773061931572
21
+ MLP-Mixer,h,0.7244052359660467,0.7668333333333334,0.9891666111111111,0.8104592873594364
22
+ MLP-Mixer,i,0.04615406060218811,0.9803333333333333,0.9994367777777778,0.980655737704918
23
+ MLP-Mixer,j,3.0292422666549683,0.45216666666666666,0.392282,0.28309705561613957
24
+ MLP-Mixer,k,2.5110719747940697,0.59,0.7661271111111111,0.3453964874933475
25
+ MLP-Mixer,l,1.4846716919555334,0.6762053625105207,0.7295511702036557,0.5855010004617516
26
+ CvT,a,0.7465745627352621,0.6493362831858407,0.7317079694031161,0.4389380530973451
27
+ CvT,b,0.7336456650122649,0.6765168186104998,0.7552670349907918,0.1942051683633516
28
+ CvT,c,0.8642418710588097,0.5919522162841874,0.6964806629834255,0.16041397153945666
29
+ CvT,d,0.06205783033066015,0.9761081420936812,0.9876427255985267,0.7654320987654321
30
+ CvT,e,0.6019917449757506,0.7178924259055982,0.7936123514720351,0.4910891089108911
31
+ CvT,f,0.5685286294680824,0.7414895617829603,0.8061353821076506,0.08274941608274941
32
+ CvT,g,0.4509977758725484,0.8055,0.9201512777777776,0.8277999114652501
33
+ CvT,h,0.5202355206807454,0.7606666666666667,0.9072719444444444,0.7961964235026966
34
+ CvT,i,0.09494428576032321,0.9643333333333334,0.9977035555555557,0.9632554945054945
35
+ CvT,j,2.988422914981842,0.3456666666666667,0.14668444444444442,0.022896963663514187
36
+ CvT,k,2.6323694267769655,0.5045,0.6181494444444444,0.0300163132137031
37
+ CvT,l,1.337245315202257,0.645425033064807,0.6032419706344807,0.5021944632005402
38
+ Swin,a,0.47572549887463056,0.8407079646017699,0.905882487792577,0.6742081447963801
39
+ Swin,b,0.24361524523634911,0.9163784973278843,0.9362615101289135,0.5283687943262412
40
+ Swin,c,0.4370936370240709,0.8535051870480981,0.9087605893186003,0.3900523560209424
41
+ Swin,d,0.038348094671021904,0.9880540710468406,0.9911620626151013,0.8869047619047619
42
+ Swin,e,0.3579506372581067,0.8781558726673985,0.9260273972602739,0.7286063569682152
43
+ Swin,f,0.24650774364781286,0.9156479217603912,0.9413092437445593,0.24937238493723848
44
+ Swin,g,0.11494702147444089,0.9593333333333334,0.9989898888888888,0.9607969151670951
45
+ Swin,h,0.2175228010714054,0.926,0.9979807777777777,0.9308841843088418
46
+ Swin,i,0.006121216081082821,0.9973333333333333,0.9999798888888889,0.9973315543695798
47
+ Swin,j,2.5422419211069744,0.5825,0.4893003333333333,0.3679031037093111
48
+ Swin,k,2.433416116627554,0.6205,0.7913794999999999,0.39036144578313253
49
+ Swin,l,1.035569912688268,0.8089455332451605,0.7797953948083542,0.7088143668682426
50
+ CaiT,a,0.3509529214517205,0.9081858407079646,0.8966973093999068,0.7726027397260274
51
+ CaiT,b,0.1907231829655279,0.9380697893744105,0.9234548802946593,0.5887265135699373
52
+ CaiT,c,0.3048490960337163,0.90883370009431,0.8791160220994475,0.493006993006993
53
+ CaiT,d,0.06549901952829443,0.9849104055328513,0.969243093922652,0.8545454545454545
54
+ CaiT,e,0.31167979835318943,0.9187705817782656,0.9264058124574283,0.7921348314606742
55
+ CaiT,f,0.1541684599891403,0.9499717886025955,0.9222261921687871,0.3464373464373464
56
+ CaiT,g,0.07805611325552066,0.9708333333333333,0.9986172777777778,0.9714937286202965
57
+ CaiT,h,0.13856186520308256,0.9553333333333334,0.997130611111111,0.9569961489088575
58
+ CaiT,i,0.011666435472667217,0.9956666666666667,0.9999013333333333,0.9956594323873121
59
+ CaiT,j,1.8389671653707822,0.6116666666666667,0.7423962222222222,0.4151606425702811
60
+ CaiT,k,1.7725774958133698,0.6365,0.8888650555555555,0.4312907431551499
61
+ CaiT,l,0.7395369254032035,0.8362991463268006,0.8693810723675515,0.7436693965922997
62
+ DeiT,a,0.48058320357736234,0.8263274336283186,0.8941450218931248,0.6594360086767896
63
+ DeiT,b,0.23002449519573911,0.9251807607670544,0.9313581952117864,0.5608856088560885
64
+ DeiT,c,0.49494195908204974,0.8154668343288274,0.8907605893186004,0.34118967452300786
65
+ DeiT,d,0.05036040664735698,0.9849104055328513,0.9769023941068141,0.8636363636363636
66
+ DeiT,e,0.338863200106291,0.8792535675082327,0.9161961704382048,0.7342995169082126
67
+ DeiT,f,0.26403015722496653,0.9037050968591311,0.9291450866890099,0.2289156626506024
68
+ DeiT,g,0.10851164469867945,0.9641666666666666,0.9990410000000001,0.9653393519264872
69
+ DeiT,h,0.2489620513096452,0.906,0.9981344444444444,0.9139194139194139
70
+ DeiT,i,0.013259729760388533,0.9958333333333333,0.9998315555555556,0.9958423415932147
71
+ DeiT,j,1.2026229511300723,0.7143333333333334,0.7246498888888889,0.6356292517006803
72
+ DeiT,k,1.1073710439900557,0.746,0.8698901111111111,0.6623836951705804
73
+ DeiT,l,0.5658274294531473,0.8476012985451485,0.867833726587774,0.7854785478547854
74
+ DeiT3,a,0.39277621998196155,0.8661504424778761,0.9195532732705195,0.7125890736342043
75
+ DeiT3,b,0.338128161960636,0.8824269097767997,0.9331012891344382,0.44510385756676557
76
+ DeiT3,c,0.323060417608134,0.8883998742533794,0.922292817679558,0.4580152671755725
77
+ DeiT3,d,0.12409640010358478,0.9553599497013517,0.9608121546961326,0.6787330316742082
78
+ DeiT3,e,0.24973662732461413,0.9209659714599341,0.9483084840687203,0.8064516129032258
79
+ DeiT3,f,0.2540075041596123,0.9116042881324055,0.9380772021883802,0.24193548387096775
80
+ DeiT3,g,0.1656125110021482,0.9416666666666667,0.9990236666666666,0.944760101010101
81
+ DeiT3,h,0.15762409150910875,0.9448333333333333,0.9990646111111111,0.9476017096723128
82
+ DeiT3,i,0.05214000094247361,0.9803333333333333,0.9997376666666667,0.9806684141546527
83
+ DeiT3,j,1.1591287109454473,0.696,0.7744774999999999,0.6248457424928013
84
+ DeiT3,k,1.0456561943689981,0.7346666666666667,0.845634,0.6561555075593952
85
+ DeiT3,l,0.5223108836063022,0.854033906456655,0.8898184372191467,0.7933968686181075
86
+ Twins_SVT,a,0.4211153812640536,0.8307522123893806,0.8825833123189902,0.6433566433566433
87
+ Twins_SVT,b,0.3625493723054758,0.8550770198050928,0.8962191528545118,0.37449118046132973
88
+ Twins_SVT,c,0.47319920195681764,0.7868594781515247,0.8548139963167587,0.2893081761006289
89
+ Twins_SVT,d,0.1203458983801289,0.9783087079534738,0.9818324125230202,0.8
90
+ Twins_SVT,e,0.5213294555274637,0.7486278814489572,0.8316203738742148,0.5465346534653466
91
+ Twins_SVT,f,0.3335461875583885,0.8666541282678202,0.9034523383543173,0.16292798110979928
92
+ Twins_SVT,g,0.2639119902451833,0.9085,0.9744078888888889,0.912676952441546
93
+ Twins_SVT,h,0.32257486327489215,0.8723333333333333,0.9662636666666669,0.8822263222632226
94
+ Twins_SVT,i,0.13550377811988196,0.9738333333333333,0.9972788888888889,0.9733672603901612
95
+ Twins_SVT,j,1.2430085968176523,0.49,0.43771377777777776,0.1896186440677966
96
+ Twins_SVT,k,1.1146003757913907,0.5553333333333333,0.7234002222222222,0.2115839243498818
97
+ Twins_SVT,l,0.6286477774643219,0.7480461704941685,0.7275090480198628,0.6162439337057046
98
+ Twins_PCPVT,a,0.45601994748664115,0.7699115044247787,0.8394007473464615,0.5458515283842795
99
+ Twins_PCPVT,b,0.3125818614145001,0.8773970449544168,0.9010699815837937,0.390625
100
+ Twins_PCPVT,c,0.5049686531944119,0.7500785916378497,0.8135911602209945,0.23923444976076555
101
+ Twins_PCPVT,d,0.3149096430453517,0.8918579063187677,0.9015690607734806,0.4208754208754209
102
+ Twins_PCPVT,e,0.42039827045572575,0.8079034028540066,0.8655339438431847,0.5882352941176471
103
+ Twins_PCPVT,f,0.3770137148085496,0.8412638706037239,0.8693597175042401,0.12899896800825594
104
+ Twins_PCPVT,g,0.2785677030881246,0.9015,0.9626754444444443,0.9027480664801711
105
+ Twins_PCPVT,h,0.3805647597312927,0.834,0.928301,0.8463437210737427
106
+ Twins_PCPVT,i,0.2798018006483714,0.9091666666666667,0.9656723333333334,0.9096335599403084
107
+ Twins_PCPVT,j,0.614702238559723,0.6835,0.7995154444444446,0.6018033130635353
108
+ Twins_PCPVT,k,0.6159363424777985,0.6911666666666667,0.7903985,0.6076646199449502
109
+ Twins_PCPVT,l,0.45535326129802217,0.7889864133702056,0.8498913163479216,0.7103004291845494
110
+ PiT,a,0.3937257931823224,0.8296460176991151,0.8874127904755356,0.641860465116279
111
+ PiT,b,0.2796248870145521,0.8777114115058158,0.91848802946593,0.4150375939849624
112
+ PiT,c,0.5313189482209218,0.7613957874882112,0.8498581952117863,0.26666666666666666
113
+ PiT,d,0.049343678185640734,0.9798805407104684,0.9911620626151012,0.8117647058823529
114
+ PiT,e,0.3259278782832505,0.8518111964873765,0.9145841216983274,0.6715328467153284
115
+ PiT,f,0.2841162405192056,0.8750235094978371,0.9172267022129574,0.17196261682242991
116
+ PiT,g,0.1590204114516576,0.9338333333333333,0.9916004444444445,0.9369340746624305
117
+ PiT,h,0.2924602138201396,0.8721666666666666,0.981646111111111,0.8849212303075769
118
+ PiT,i,0.03693298858900865,0.988,0.999485,0.9879396984924623
119
+ PiT,j,2.9977854507366817,0.461,0.277717,0.06477732793522267
120
+ PiT,k,2.8756980224698783,0.5151666666666667,0.7229978888888889,0.07149696776252792
121
+ PiT,l,1.2244331041709067,0.7434170975111218,0.6790239785353327,0.599849990624414
122
+ Ensemble,a,,0.9070796460176991,0.941851401847734,0.79
123
+ Ensemble,b,,0.9374410562716127,0.9600349907918969,0.6135922330097088
124
+ Ensemble,c,,0.895001571832757,0.9307624309392265,0.48615384615384616
125
+ Ensemble,d,,0.9911977365608299,0.9944677716390424,0.9186046511627907
126
+ Ensemble,e,,0.9264544456641054,0.955384848255506,0.825065274151436
127
+ Ensemble,f,,0.941696445363927,0.9599335198386041,0.33760683760683763
128
+ Ensemble,g,,0.9701666666666666,0.9990522222222222,0.9710027539283979
129
+ Ensemble,h,,0.9476666666666667,0.9979163333333333,0.9502219403931516
130
+ Ensemble,i,,0.9986666666666667,0.9999886666666667,0.9986671109630123
131
+ Ensemble,j,,0.5698333333333333,0.6426453333333333,0.31556616282153277
132
+ Ensemble,k,,0.5983333333333334,0.8897323333333333,0.33055555555555555
133
+ Ensemble,l,,0.8179632078874595,0.832089495815299,0.712386018237082
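
`evaluation_results.csv` holds per-dataset metrics for every GraViT backbone plus the ensemble. A small pandas sketch (local path assumed) that reproduces the CvT averages quoted in the model card:

```python
import pandas as pd

# evaluation_results.csv from this commit (local path assumed).
df = pd.read_csv("evaluation_results.csv")

# Average each metric over the 12 test sets (a-l) for CvT;
# this matches the README summary (~0.6983 / 0.7470 / 0.4396).
cvt = df[df["Model"] == "CvT"]
print(cvt[["Accuracy", "AUCROC", "F1"]].mean().round(4))

# Side-by-side comparison of all models:
print(df.groupby("Model")[["Accuracy", "AUCROC", "F1"]].mean().round(4))
```
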
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:be02cc6e045512c352a07c5c5d91d58e636462c47757069251e2db386985c120
size 125239576
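
`model.safetensors` is the safetensors export of the fine-tuned weights. A minimal loading sketch, assuming the tensor names match the timm module (using `strict=False` in case the classifier head is named differently):

```python
import timm
from safetensors.torch import load_file

# Assumption: model.safetensors stores the same fine-tuned weights in safetensors format.
model = timm.create_model("hf-hub:parlange/cvt-gravit-c2", pretrained=False, num_classes=2)
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict, strict=False)
model.eval()
```
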
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e003815d79d6f2a95583e2f46e27a357d42c225b8e9b6448e6464767bca64c9d
size 125471131
roc_confusion_matrix/CvT_roc_confusion_matrix_a.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_b.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_c.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_d.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_e.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_f.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_g.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_h.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_i.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_j.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_k.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_l.png ADDED
roc_curves/CvT_ROC_a.png ADDED
roc_curves/CvT_ROC_b.png ADDED
roc_curves/CvT_ROC_c.png ADDED
roc_curves/CvT_ROC_d.png ADDED
roc_curves/CvT_ROC_e.png ADDED
roc_curves/CvT_ROC_f.png ADDED
roc_curves/CvT_ROC_g.png ADDED
roc_curves/CvT_ROC_h.png ADDED
roc_curves/CvT_ROC_i.png ADDED
roc_curves/CvT_ROC_j.png ADDED
roc_curves/CvT_ROC_k.png ADDED
roc_curves/CvT_ROC_l.png ADDED
training_curves/CvT_accuracy.png ADDED
training_curves/CvT_auc.png ADDED
training_curves/CvT_combined_metrics.png ADDED

Git LFS Details

  • SHA256: f9cddad3d9a2305e35dabb091fd81d5f3f50bad7d285fdf097de841427120b40
  • Pointer size: 131 Bytes
  • Size of remote file: 167 kB
training_curves/CvT_f1.png ADDED
training_curves/CvT_loss.png ADDED
training_curves/CvT_metrics.csv ADDED
@@ -0,0 +1,47 @@
1
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
2
+ 1,0.5454545855617974,0.41878439229709413,0.6942059718849403,0.793002915451895,0.7856951154682903,0.8944710537276137,0.6973220935743788,0.8035961272475796
3
+ 2,0.4716189381318856,0.37400962720807024,0.7424923897800731,0.8294460641399417,0.846225792479242,0.9113454003008951,0.7393520802499589,0.8294460641399417
4
+ 3,0.4498043883640691,0.36516355677526824,0.7571399254369463,0.8250728862973761,0.8614838160290162,0.9176437963773598,0.7559042937192754,0.8288159771754636
5
+ 4,0.43645898957468926,0.3376608351055457,0.7647073913192188,0.8534985422740525,0.870118987584132,0.928606490492907,0.7661530878466232,0.85360524399126
6
+ 5,0.4270626576820909,0.33928940812283287,0.7684184423846496,0.8411078717201166,0.8753239019916431,0.9327937339033906,0.7710631535346876,0.8481894150417827
7
+ 6,0.4177690687775918,0.3458701694498257,0.7748486506823545,0.8483965014577259,0.8810562526193122,0.9344427066953395,0.7753270134901064,0.8567493112947658
8
+ 7,0.41739760828490347,0.3428083369752756,0.7742329924410849,0.8447521865889213,0.8811244116648936,0.9428214434461831,0.7766942666001336,0.8561782579338285
9
+ 8,0.4087680892544022,0.38979807643779163,0.7779782467421418,0.8126822157434402,0.8860292670244734,0.9330072928796674,0.7794660981679506,0.8305866842452209
10
+ 9,0.40660411412927855,0.3142256221079618,0.7802613127201833,0.8520408163265306,0.8873070539085166,0.9437426157468402,0.7814610085891658,0.8579426172148356
11
+ 10,0.4018868911731959,0.3325835931231607,0.7824759722269726,0.8520408163265306,0.889890965812465,0.9431497505291162,0.7827750215611098,0.8602890571231934
12
+ 11,0.4010628289264208,0.2987588767407587,0.7822279987686835,0.869533527696793,0.8902885009233055,0.9464413212181999,0.785134565089007,0.8725978647686833
13
+ 12,0.3981729524995175,0.3306502518257664,0.7838954065054554,0.847667638483965,0.8916990875405698,0.9435046196737754,0.783278308965399,0.8569472963723477
14
+ 13,0.3931052854340262,0.3015700927976269,0.7860245579231795,0.8651603498542274,0.8943008290085421,0.9444746661680081,0.7864153906556733,0.8666186012977649
15
+ 14,0.3904137644843763,0.2880233984560035,0.7882050141943427,0.8709912536443148,0.8961008655325486,0.951043995274078,0.7891730859258629,0.8731182795698925
16
+ 15,0.3862014959101255,0.3049151019000451,0.7903256148031603,0.8658892128279884,0.8981321271951521,0.9465645691846084,0.7905959914260583,0.8698727015558698
17
+ 16,0.3882054155839306,0.3170544022895157,0.7883247255190341,0.8564139941690962,0.8968897856469992,0.9471149351035707,0.7883735840991665,0.8638562543192813
18
+ 17,0.3851278321350853,0.28406817198842327,0.7916253377569518,0.8760932944606414,0.8989055089189882,0.9520544161021343,0.7955157627986205,0.8782234957020058
19
+ 18,0.38282828246470324,0.3047257037534658,0.7919759209221192,0.8673469387755102,0.8998374235231965,0.9480743567731132,0.7940539076256264,0.8727272727272727
20
+ 19,0.3831823689526514,0.3018142296864757,0.7938742004993672,0.8651603498542274,0.900099528175156,0.9501557599299613,0.7941874562437033,0.8714384989576095
21
+ 20,0.3803508949026174,0.2997558918882042,0.7936005746143585,0.8673469387755102,0.9014052887832577,0.9500909484993498,0.7942269658323672,0.8714689265536724
22
+ 21,0.37627306820393247,0.28655936141055804,0.7946950781543934,0.8753644314868805,0.9031548813255292,0.9528491529889757,0.797055144199885,0.8788093550673282
23
+ 22,0.3791275146530498,0.31103754304240816,0.7934039060095085,0.8534985422740525,0.9016801477021054,0.9532358966077061,0.7981857516350788,0.8624229979466119
24
+ 23,0.37644750438942753,0.31795066668931665,0.7943701474159456,0.8615160349854227,0.9028640619885934,0.9489806543192038,0.7965378953246358,0.8695054945054945
25
+ 24,0.37052251767511996,0.2799088840234384,0.7974911926668263,0.8717201166180758,0.9060147108728265,0.9552376135793758,0.7974548222395169,0.875177304964539
26
+ 25,0.36707203877373,0.2850041752436766,0.8003557136505114,0.8826530612244898,0.9082701132862941,0.951667672483404,0.8052840511058478,0.8837545126353791
27
+ 26,0.36778277345887245,0.2616846166094955,0.8007233984334918,0.8848396501457726,0.9079216477163619,0.9584622478729101,0.8032353661316605,0.886002886002886
28
+ 27,0.36552157687697256,0.264149998306533,0.8028354482334029,0.8848396501457726,0.9094252403564035,0.9586109954185756,0.8056014568509089,0.8868194842406877
29
+ 28,0.36577115871370763,0.26402455794220414,0.8010483291719397,0.8892128279883382,0.9086845302272508,0.9578290083213628,0.8036059457588777,0.8904899135446686
30
+ 29,0.36358578548951015,0.2691162500541342,0.8014844204261723,0.8855685131195336,0.9094048793586998,0.9570927079703184,0.8067298246782438,0.8874551971326164
31
+ 30,0.3637443440624353,0.3012346520194507,0.8024848650682355,0.8651603498542274,0.9096946158427448,0.9518047327219101,0.8085152241132048,0.8710801393728222
32
+ 31,0.36424110488199757,0.27704282674080427,0.803262988678729,0.8797376093294461,0.9098558322387691,0.9565019677175326,0.8082091294054883,0.8838845883180858
33
+ 32,0.36254328096247684,0.2780654456455576,0.8035964702260834,0.8790087463556852,0.9103594376525266,0.9561332863007762,0.8075185827655847,0.882768361581921
34
+ 33,0.3634739719800414,0.28042872528119267,0.8026131272018333,0.8760932944606414,0.9098308456093774,0.9572223308315412,0.8052246110229843,0.8812849162011173
35
+ 34,0.3610940596292188,0.2735784603364266,0.8035793686082704,0.8790087463556852,0.9111481958874158,0.9572212683490723,0.8046401265489059,0.882768361581921
36
+ 35,0.3633829878393609,0.28090626314152084,0.8042805349386052,0.8753644314868805,0.9104869063276791,0.954181506005151,0.8048329197895616,0.8789808917197452
37
+ 36,0.3625948407028851,0.29401117797843224,0.8001675958545679,0.8760932944606414,0.9095737659505166,0.9544566889646321,0.8021403051289432,0.8816155988857939
38
+ 37,0.36652841117079754,0.28202002423845296,0.8010397783630332,0.8790087463556852,0.9084846772598678,0.9563064709432294,0.8041743814172698,0.8830985915492958
39
+ 38,0.36131294218203874,0.2753060121230412,0.8036135718438965,0.8797376093294461,0.9110813176937551,0.9563139083205127,0.8076030593182713,0.8822269807280514
40
+ 39,0.3631269225699266,0.29128056512629674,0.8038615453021856,0.8753644314868805,0.9104374160753913,0.9542070055844079,0.8048162014976175,0.8806699232379623
41
+ 40,0.36260165742986744,0.27745224704895355,0.8035109621370181,0.880466472303207,0.9103856698917665,0.9557199806203198,0.8046983231202033,0.8836879432624114
42
+ 41,0.365013342035874,0.2783045386433949,0.8008773129938093,0.8775510204081632,0.9087424374855558,0.9552854252904827,0.802844685264361,0.8801711840228246
43
+ 42,0.36325351914849174,0.2887538745347682,0.8024848650682355,0.8746355685131195,0.9100745935285524,0.9538043247286421,0.8047405303510596,0.8788732394366198
44
+ 43,0.36252303772709515,0.2807743189633761,0.8017067414577419,0.8753644314868805,0.9098011751476967,0.9547541840559632,0.8035977437877941,0.8786373314407381
45
+ 44,0.3638359864524954,0.2731195484062673,0.8016383349864897,0.8833819241982507,0.9094158889236511,0.9554660473102196,0.8033167720821393,0.8848920863309353
46
+ 45,0.36243859979491244,0.27358127413616234,0.8031774805896638,0.8811953352769679,0.9104034476894268,0.9562862837763176,0.805581363920469,0.8836545324768023
47
+ 46,0.3623487997137416,0.3242336612401134,0.8035879194171769,0.8513119533527697,0.9104723615880296,0.9527120927504696,0.8057472430823355,0.8623481781376519
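
`CvT_metrics.csv` records per-epoch loss, accuracy, AUC, and F1 for both splits, i.e. the data behind the training-curve PNGs above. A plotting sketch (local path and output filename are placeholders):

```python
import pandas as pd
import matplotlib.pyplot as plt

# training_curves/CvT_metrics.csv from this commit (local path assumed).
m = pd.read_csv("training_curves/CvT_metrics.csv")

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

axes[0].plot(m["epoch"], m["train_loss"], label="train")
axes[0].plot(m["epoch"], m["val_loss"], label="validation")
axes[0].set_xlabel("epoch")
axes[0].set_ylabel("loss")
axes[0].legend()

axes[1].plot(m["epoch"], m["train_auc"], label="train")
axes[1].plot(m["epoch"], m["val_auc"], label="validation")
axes[1].set_xlabel("epoch")
axes[1].set_ylabel("AUC-ROC")
axes[1].legend()

fig.tight_layout()
fig.savefig("cvt_c2_training_curves.png", dpi=150)  # placeholder output name
```
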
training_metrics.csv ADDED
@@ -0,0 +1,47 @@
1
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
2
+ 1,0.5454545855617974,0.41878439229709413,0.6942059718849403,0.793002915451895,0.7856951154682903,0.8944710537276137,0.6973220935743788,0.8035961272475796
3
+ 2,0.4716189381318856,0.37400962720807024,0.7424923897800731,0.8294460641399417,0.846225792479242,0.9113454003008951,0.7393520802499589,0.8294460641399417
4
+ 3,0.4498043883640691,0.36516355677526824,0.7571399254369463,0.8250728862973761,0.8614838160290162,0.9176437963773598,0.7559042937192754,0.8288159771754636
5
+ 4,0.43645898957468926,0.3376608351055457,0.7647073913192188,0.8534985422740525,0.870118987584132,0.928606490492907,0.7661530878466232,0.85360524399126
6
+ 5,0.4270626576820909,0.33928940812283287,0.7684184423846496,0.8411078717201166,0.8753239019916431,0.9327937339033906,0.7710631535346876,0.8481894150417827
7
+ 6,0.4177690687775918,0.3458701694498257,0.7748486506823545,0.8483965014577259,0.8810562526193122,0.9344427066953395,0.7753270134901064,0.8567493112947658
8
+ 7,0.41739760828490347,0.3428083369752756,0.7742329924410849,0.8447521865889213,0.8811244116648936,0.9428214434461831,0.7766942666001336,0.8561782579338285
9
+ 8,0.4087680892544022,0.38979807643779163,0.7779782467421418,0.8126822157434402,0.8860292670244734,0.9330072928796674,0.7794660981679506,0.8305866842452209
10
+ 9,0.40660411412927855,0.3142256221079618,0.7802613127201833,0.8520408163265306,0.8873070539085166,0.9437426157468402,0.7814610085891658,0.8579426172148356
11
+ 10,0.4018868911731959,0.3325835931231607,0.7824759722269726,0.8520408163265306,0.889890965812465,0.9431497505291162,0.7827750215611098,0.8602890571231934
12
+ 11,0.4010628289264208,0.2987588767407587,0.7822279987686835,0.869533527696793,0.8902885009233055,0.9464413212181999,0.785134565089007,0.8725978647686833
13
+ 12,0.3981729524995175,0.3306502518257664,0.7838954065054554,0.847667638483965,0.8916990875405698,0.9435046196737754,0.783278308965399,0.8569472963723477
14
+ 13,0.3931052854340262,0.3015700927976269,0.7860245579231795,0.8651603498542274,0.8943008290085421,0.9444746661680081,0.7864153906556733,0.8666186012977649
15
+ 14,0.3904137644843763,0.2880233984560035,0.7882050141943427,0.8709912536443148,0.8961008655325486,0.951043995274078,0.7891730859258629,0.8731182795698925
16
+ 15,0.3862014959101255,0.3049151019000451,0.7903256148031603,0.8658892128279884,0.8981321271951521,0.9465645691846084,0.7905959914260583,0.8698727015558698
17
+ 16,0.3882054155839306,0.3170544022895157,0.7883247255190341,0.8564139941690962,0.8968897856469992,0.9471149351035707,0.7883735840991665,0.8638562543192813
18
+ 17,0.3851278321350853,0.28406817198842327,0.7916253377569518,0.8760932944606414,0.8989055089189882,0.9520544161021343,0.7955157627986205,0.8782234957020058
19
+ 18,0.38282828246470324,0.3047257037534658,0.7919759209221192,0.8673469387755102,0.8998374235231965,0.9480743567731132,0.7940539076256264,0.8727272727272727
20
+ 19,0.3831823689526514,0.3018142296864757,0.7938742004993672,0.8651603498542274,0.900099528175156,0.9501557599299613,0.7941874562437033,0.8714384989576095
21
+ 20,0.3803508949026174,0.2997558918882042,0.7936005746143585,0.8673469387755102,0.9014052887832577,0.9500909484993498,0.7942269658323672,0.8714689265536724
22
+ 21,0.37627306820393247,0.28655936141055804,0.7946950781543934,0.8753644314868805,0.9031548813255292,0.9528491529889757,0.797055144199885,0.8788093550673282
23
+ 22,0.3791275146530498,0.31103754304240816,0.7934039060095085,0.8534985422740525,0.9016801477021054,0.9532358966077061,0.7981857516350788,0.8624229979466119
24
+ 23,0.37644750438942753,0.31795066668931665,0.7943701474159456,0.8615160349854227,0.9028640619885934,0.9489806543192038,0.7965378953246358,0.8695054945054945
25
+ 24,0.37052251767511996,0.2799088840234384,0.7974911926668263,0.8717201166180758,0.9060147108728265,0.9552376135793758,0.7974548222395169,0.875177304964539
26
+ 25,0.36707203877373,0.2850041752436766,0.8003557136505114,0.8826530612244898,0.9082701132862941,0.951667672483404,0.8052840511058478,0.8837545126353791
27
+ 26,0.36778277345887245,0.2616846166094955,0.8007233984334918,0.8848396501457726,0.9079216477163619,0.9584622478729101,0.8032353661316605,0.886002886002886
28
+ 27,0.36552157687697256,0.264149998306533,0.8028354482334029,0.8848396501457726,0.9094252403564035,0.9586109954185756,0.8056014568509089,0.8868194842406877
29
+ 28,0.36577115871370763,0.26402455794220414,0.8010483291719397,0.8892128279883382,0.9086845302272508,0.9578290083213628,0.8036059457588777,0.8904899135446686
30
+ 29,0.36358578548951015,0.2691162500541342,0.8014844204261723,0.8855685131195336,0.9094048793586998,0.9570927079703184,0.8067298246782438,0.8874551971326164
31
+ 30,0.3637443440624353,0.3012346520194507,0.8024848650682355,0.8651603498542274,0.9096946158427448,0.9518047327219101,0.8085152241132048,0.8710801393728222
32
+ 31,0.36424110488199757,0.27704282674080427,0.803262988678729,0.8797376093294461,0.9098558322387691,0.9565019677175326,0.8082091294054883,0.8838845883180858
33
+ 32,0.36254328096247684,0.2780654456455576,0.8035964702260834,0.8790087463556852,0.9103594376525266,0.9561332863007762,0.8075185827655847,0.882768361581921
34
+ 33,0.3634739719800414,0.28042872528119267,0.8026131272018333,0.8760932944606414,0.9098308456093774,0.9572223308315412,0.8052246110229843,0.8812849162011173
35
+ 34,0.3610940596292188,0.2735784603364266,0.8035793686082704,0.8790087463556852,0.9111481958874158,0.9572212683490723,0.8046401265489059,0.882768361581921
36
+ 35,0.3633829878393609,0.28090626314152084,0.8042805349386052,0.8753644314868805,0.9104869063276791,0.954181506005151,0.8048329197895616,0.8789808917197452
37
+ 36,0.3625948407028851,0.29401117797843224,0.8001675958545679,0.8760932944606414,0.9095737659505166,0.9544566889646321,0.8021403051289432,0.8816155988857939
38
+ 37,0.36652841117079754,0.28202002423845296,0.8010397783630332,0.8790087463556852,0.9084846772598678,0.9563064709432294,0.8041743814172698,0.8830985915492958
39
+ 38,0.36131294218203874,0.2753060121230412,0.8036135718438965,0.8797376093294461,0.9110813176937551,0.9563139083205127,0.8076030593182713,0.8822269807280514
40
+ 39,0.3631269225699266,0.29128056512629674,0.8038615453021856,0.8753644314868805,0.9104374160753913,0.9542070055844079,0.8048162014976175,0.8806699232379623
41
+ 40,0.36260165742986744,0.27745224704895355,0.8035109621370181,0.880466472303207,0.9103856698917665,0.9557199806203198,0.8046983231202033,0.8836879432624114
42
+ 41,0.365013342035874,0.2783045386433949,0.8008773129938093,0.8775510204081632,0.9087424374855558,0.9552854252904827,0.802844685264361,0.8801711840228246
43
+ 42,0.36325351914849174,0.2887538745347682,0.8024848650682355,0.8746355685131195,0.9100745935285524,0.9538043247286421,0.8047405303510596,0.8788732394366198
44
+ 43,0.36252303772709515,0.2807743189633761,0.8017067414577419,0.8753644314868805,0.9098011751476967,0.9547541840559632,0.8035977437877941,0.8786373314407381
45
+ 44,0.3638359864524954,0.2731195484062673,0.8016383349864897,0.8833819241982507,0.9094158889236511,0.9554660473102196,0.8033167720821393,0.8848920863309353
46
+ 45,0.36243859979491244,0.27358127413616234,0.8031774805896638,0.8811953352769679,0.9104034476894268,0.9562862837763176,0.805581363920469,0.8836545324768023
47
+ 46,0.3623487997137416,0.3242336612401134,0.8035879194171769,0.8513119533527697,0.9104723615880296,0.9527120927504696,0.8057472430823355,0.8623481781376519