# glpn-nyu-finetuned-diode-221214-081122

This model is a fine-tuned version of [vinvino02/glpn-nyu](https://huggingface.co/vinvino02/glpn-nyu) on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.3242
- Mae: 0.2603
- Rmse: 0.3997
- Abs Rel: 0.3010
- Log Mae: 0.1073
- Log Rmse: 0.1624
- Delta1: 0.6187
- Delta2: 0.8455
- Delta3: 0.9378
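
The card itself ships no usage snippet, so here is a minimal inference sketch using the GLPN classes documented in Transformers; the repo id is assumed from the card title, and the input image path is illustrative:

```python
# Minimal inference sketch (not part of the original card).
import torch
from PIL import Image
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation

repo_id = "glpn-nyu-finetuned-diode-221214-081122"  # assumed repo id from the card title
feature_extractor = GLPNFeatureExtractor.from_pretrained(repo_id)
model = GLPNForDepthEstimation.from_pretrained(repo_id)

image = Image.open("example.jpg")  # illustrative input image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
    predicted_depth = outputs.predicted_depth  # shape: (batch, height, width)

# Resize the prediction back to the original image size for visualization.
depth = torch.nn.functional.interpolate(
    predicted_depth.unsqueeze(1),
    size=image.size[::-1],  # PIL size is (width, height)
    mode="bicubic",
    align_corners=False,
).squeeze()
```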
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 25
- mixed_precision_training: Native AMP
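
As a sketch only, the hyperparameters above correspond to a `TrainingArguments` configuration like the following; the output directory name is illustrative, and the Adam betas/epsilon listed above are the Transformers defaults:

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters
# (not the original training script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode-221214-081122",  # illustrative
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    # Adam with betas=(0.9, 0.999), epsilon=1e-08 is the default optimizer.
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=25,
    fp16=True,  # native AMP mixed-precision training
)
```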
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mae    | Rmse   | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|:-------:|:--------:|:------:|:------:|:------:|
| 0.6896        | 1.0   | 72   | 0.4753          | 0.4670 | 0.6226 | 0.5658  | 0.1791  | 0.2313   | 0.2950 | 0.6310 | 0.8562 |
| 0.3628        | 2.0   | 144  | 0.3565          | 0.2956 | 0.4307 | 0.3421  | 0.1226  | 0.1737   | 0.5285 | 0.8079 | 0.9245 |
| 0.3168        | 3.0   | 216  | 0.3486          | 0.2774 | 0.3963 | 0.3464  | 0.1172  | 0.1710   | 0.5561 | 0.8285 | 0.9358 |
| 0.2734        | 4.0   | 288  | 0.3368          | 0.2669 | 0.3962 | 0.3260  | 0.1122  | 0.1671   | 0.5787 | 0.8453 | 0.9383 |
| 0.2678        | 5.0   | 360  | 0.3419          | 0.2700 | 0.4136 | 0.3118  | 0.1130  | 0.1689   | 0.5869 | 0.8399 | 0.9289 |
| 0.2271        | 6.0   | 432  | 0.3270          | 0.2611 | 0.3890 | 0.3050  | 0.1091  | 0.1608   | 0.5899 | 0.8500 | 0.9453 |
| 0.1905        | 7.0   | 504  | 0.3331          | 0.2651 | 0.3996 | 0.3086  | 0.1110  | 0.1645   | 0.5925 | 0.8391 | 0.9405 |
| 0.1436        | 8.0   | 576  | 0.3323          | 0.2648 | 0.4039 | 0.3087  | 0.1097  | 0.1653   | 0.6019 | 0.8475 | 0.9345 |
| 0.1687        | 9.0   | 648  | 0.3274          | 0.2620 | 0.3887 | 0.3129  | 0.1092  | 0.1622   | 0.5954 | 0.8464 | 0.9422 |
| 0.1407        | 10.0  | 720  | 0.3344          | 0.2689 | 0.4134 | 0.3079  | 0.1107  | 0.1667   | 0.6039 | 0.8423 | 0.9287 |
| 0.1159        | 11.0  | 792  | 0.3302          | 0.2675 | 0.4081 | 0.3032  | 0.1103  | 0.1646   | 0.6035 | 0.8406 | 0.9334 |
| 0.16          | 12.0  | 864  | 0.3262          | 0.2599 | 0.3989 | 0.2986  | 0.1074  | 0.1621   | 0.6177 | 0.8460 | 0.9371 |
| 0.1385        | 13.0  | 936  | 0.3287          | 0.2616 | 0.3976 | 0.3114  | 0.1085  | 0.1643   | 0.6095 | 0.8472 | 0.9363 |
| 0.156         | 14.0  | 1008 | 0.3291          | 0.2690 | 0.4147 | 0.3048  | 0.1101  | 0.1654   | 0.6082 | 0.8390 | 0.9305 |
| 0.1534        | 15.0  | 1080 | 0.3267          | 0.2651 | 0.3994 | 0.3084  | 0.1096  | 0.1632   | 0.6030 | 0.8406 | 0.9376 |
| 0.1196        | 16.0  | 1152 | 0.3248          | 0.2588 | 0.4028 | 0.2908  | 0.1065  | 0.1615   | 0.6265 | 0.8467 | 0.9357 |
| 0.0983        | 17.0  | 1224 | 0.3249          | 0.2612 | 0.4046 | 0.2940  | 0.1075  | 0.1620   | 0.6198 | 0.8440 | 0.9346 |
| 0.1347        | 18.0  | 1296 | 0.3209          | 0.2608 | 0.4012 | 0.2971  | 0.1069  | 0.1614   | 0.6201 | 0.8473 | 0.9376 |
| 0.107         | 19.0  | 1368 | 0.3249          | 0.2624 | 0.4026 | 0.3013  | 0.1079  | 0.1628   | 0.6149 | 0.8453 | 0.9362 |
| 0.1214        | 20.0  | 1440 | 0.3213          | 0.2586 | 0.3976 | 0.2962  | 0.1065  | 0.1609   | 0.6219 | 0.8464 | 0.9382 |
| 0.0921        | 21.0  | 1512 | 0.3240          | 0.2600 | 0.3971 | 0.3028  | 0.1074  | 0.1624   | 0.6179 | 0.8457 | 0.9383 |
| 0.0906        | 22.0  | 1584 | 0.3239          | 0.2602 | 0.4025 | 0.2968  | 0.1069  | 0.1622   | 0.6227 | 0.8461 | 0.9365 |
| 0.0978        | 23.0  | 1656 | 0.3230          | 0.2588 | 0.3990 | 0.2969  | 0.1066  | 0.1617   | 0.6234 | 0.8462 | 0.9371 |
| 0.1377        | 24.0  | 1728 | 0.3244          | 0.2612 | 0.4013 | 0.3005  | 0.1076  | 0.1626   | 0.6180 | 0.8447 | 0.9359 |
| 0.1253        | 25.0  | 1800 | 0.3242          | 0.2603 | 0.3997 | 0.3010  | 0.1073  | 0.1624   | 0.6187 | 0.8455 | 0.9378 |
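
For reference, the metrics in the table follow the standard depth-estimation definitions (the card does not include its evaluation code, so this is a sketch of the conventional formulas, with Delta thresholds at 1.25, 1.25², and 1.25³):

```python
# Sketch of the standard depth-estimation metrics (assumed definitions,
# not necessarily the exact evaluation code behind this card).
import torch

def depth_metrics(pred: torch.Tensor, target: torch.Tensor) -> dict:
    # Evaluate only on valid (positive) ground-truth depths.
    mask = target > 0
    pred, target = pred[mask], target[mask]

    diff = pred - target
    log_diff = torch.log(pred) - torch.log(target)  # some variants use log10
    ratio = torch.max(pred / target, target / pred)

    return {
        "mae": diff.abs().mean().item(),
        "rmse": diff.pow(2).mean().sqrt().item(),
        "abs_rel": (diff.abs() / target).mean().item(),
        "log_mae": log_diff.abs().mean().item(),
        "log_rmse": log_diff.pow(2).mean().sqrt().item(),
        "delta1": (ratio < 1.25).float().mean().item(),
        "delta2": (ratio < 1.25 ** 2).float().mean().item(),
        "delta3": (ratio < 1.25 ** 3).float().mean().item(),
    }
```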
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu116
- Tokenizers 0.13.2