# glpn-kitti-finetuned-diode-221214-123047

This model is a fine-tuned version of [vinvino02/glpn-kitti](https://huggingface.co/vinvino02/glpn-kitti) on the diode-subset dataset (a minimal inference sketch follows the metric list below). It achieves the following results on the evaluation set:
- Loss: 0.3497
- Mae: 0.2847
- Rmse: 0.3977
- Abs Rel: 0.3477
- Log Mae: 0.1203
- Log Rmse: 0.1726
- Delta1: 0.5217
- Delta2: 0.8246
- Delta3: 0.9436
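
The checkpoint can be loaded with the standard GLPN classes from 🤗 Transformers. Below is a minimal inference sketch; the hub repo id is assumed from the model name above (substitute the actual `namespace/model` path), and `example.jpg` is a placeholder input image.

```python
import numpy as np
import torch
from PIL import Image
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation

# Assumed repo id, taken from the model name above; replace with the real hub path.
checkpoint = "glpn-kitti-finetuned-diode-221214-123047"

feature_extractor = GLPNFeatureExtractor.from_pretrained(checkpoint)
model = GLPNForDepthEstimation.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder input
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
    predicted_depth = outputs.predicted_depth  # shape: (batch, height, width)

# Resize the prediction back to the input resolution for visualization.
prediction = torch.nn.functional.interpolate(
    predicted_depth.unsqueeze(1),
    size=image.size[::-1],  # PIL size is (width, height)
    mode="bicubic",
    align_corners=False,
).squeeze()

# Normalize to 0-255 and save as a grayscale depth map.
depth = (prediction / prediction.max() * 255).cpu().numpy().astype(np.uint8)
Image.fromarray(depth).save("depth.png")
```

With the pinned Transformers 4.24.0 the preprocessing class is `GLPNFeatureExtractor`; newer releases also expose an equivalent `GLPNImageProcessor`.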
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing them as `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 25
- mixed_precision_training: Native AMP
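
The actual training script is not included in this card; the sketch below is a hypothetical reconstruction of the list above as `transformers.TrainingArguments`, assuming a standard `Trainer` run. The listed Adam betas and epsilon match the `Trainer` defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="glpn-kitti-finetuned-diode-221214-123047",
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=25,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)
```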
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mae    | Rmse   | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|---------------|-------|------|-----------------|--------|--------|---------|---------|----------|--------|--------|--------|
| 0.6103        | 1.0   | 72   | 0.4449          | 0.3914 | 0.5513 | 0.4625  | 0.1615  | 0.2186   | 0.3918 | 0.6910 | 0.8549 |
| 0.3762        | 2.0   | 144  | 0.4095          | 0.3583 | 0.4876 | 0.4281  | 0.1505  | 0.2015   | 0.4065 | 0.7121 | 0.8901 |
| 0.341         | 3.0   | 216  | 0.3768          | 0.3046 | 0.4061 | 0.4016  | 0.1313  | 0.1840   | 0.4757 | 0.7938 | 0.9309 |
| 0.291         | 4.0   | 288  | 0.3853          | 0.3227 | 0.4495 | 0.3724  | 0.1360  | 0.1869   | 0.4646 | 0.7680 | 0.9127 |
| 0.2861        | 5.0   | 360  | 0.3786          | 0.3151 | 0.4257 | 0.4065  | 0.1344  | 0.1876   | 0.4597 | 0.7785 | 0.9329 |
| 0.2539        | 6.0   | 432  | 0.3687          | 0.3158 | 0.4546 | 0.3329  | 0.1316  | 0.1821   | 0.4732 | 0.7869 | 0.9138 |
| 0.2199        | 7.0   | 504  | 0.3705          | 0.3122 | 0.4479 | 0.3378  | 0.1312  | 0.1820   | 0.4784 | 0.7888 | 0.9189 |
| 0.1728        | 8.0   | 576  | 0.3578          | 0.2895 | 0.4008 | 0.3675  | 0.1235  | 0.1766   | 0.5101 | 0.8178 | 0.9420 |
| 0.1877        | 9.0   | 648  | 0.3589          | 0.2846 | 0.3846 | 0.3721  | 0.1235  | 0.1764   | 0.5144 | 0.8170 | 0.9403 |
| 0.1541        | 10.0  | 720  | 0.3521          | 0.2831 | 0.3997 | 0.3283  | 0.1201  | 0.1712   | 0.5241 | 0.8260 | 0.9422 |
| 0.1414        | 11.0  | 792  | 0.3460          | 0.2735 | 0.3772 | 0.3419  | 0.1173  | 0.1691   | 0.5409 | 0.8360 | 0.9469 |
| 0.1643        | 12.0  | 864  | 0.3530          | 0.2878 | 0.4100 | 0.3313  | 0.1214  | 0.1736   | 0.5249 | 0.8214 | 0.9344 |
| 0.1724        | 13.0  | 936  | 0.3606          | 0.2995 | 0.4249 | 0.3459  | 0.1255  | 0.1775   | 0.5057 | 0.8069 | 0.9323 |
| 0.1514        | 14.0  | 1008 | 0.3477          | 0.2832 | 0.3881 | 0.3596  | 0.1206  | 0.1726   | 0.5174 | 0.8253 | 0.9437 |
| 0.1535        | 15.0  | 1080 | 0.3535          | 0.2961 | 0.4242 | 0.3412  | 0.1231  | 0.1753   | 0.5186 | 0.8080 | 0.9332 |
| 0.1233        | 16.0  | 1152 | 0.3508          | 0.2896 | 0.4104 | 0.3391  | 0.1213  | 0.1727   | 0.5225 | 0.8165 | 0.9398 |
| 0.116         | 17.0  | 1224 | 0.3519          | 0.2874 | 0.3989 | 0.3533  | 0.1215  | 0.1731   | 0.5200 | 0.8179 | 0.9407 |
| 0.1532        | 18.0  | 1296 | 0.3532          | 0.2965 | 0.4200 | 0.3459  | 0.1236  | 0.1747   | 0.5147 | 0.8035 | 0.9353 |
| 0.1179        | 19.0  | 1368 | 0.3497          | 0.2828 | 0.3896 | 0.3557  | 0.1204  | 0.1728   | 0.5200 | 0.8260 | 0.9457 |
| 0.1326        | 20.0  | 1440 | 0.3467          | 0.2787 | 0.3848 | 0.3475  | 0.1185  | 0.1704   | 0.5257 | 0.8330 | 0.9479 |
| 0.1069        | 21.0  | 1512 | 0.3471          | 0.2807 | 0.3922 | 0.3418  | 0.1187  | 0.1707   | 0.5288 | 0.8297 | 0.9452 |
| 0.1049        | 22.0  | 1584 | 0.3474          | 0.2864 | 0.4048 | 0.3387  | 0.1199  | 0.1717   | 0.5227 | 0.8251 | 0.9428 |
| 0.103         | 23.0  | 1656 | 0.3483          | 0.2840 | 0.3991 | 0.3416  | 0.1196  | 0.1717   | 0.5254 | 0.8269 | 0.9431 |
| 0.1184        | 24.0  | 1728 | 0.3473          | 0.2839 | 0.3960 | 0.3450  | 0.1198  | 0.1717   | 0.5223 | 0.8251 | 0.9443 |
| 0.1258        | 25.0  | 1800 | 0.3497          | 0.2847 | 0.3977 | 0.3477  | 0.1203  | 0.1726   | 0.5217 | 0.8246 | 0.9436 |
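
For reference, the sketch below gives the conventional definitions of the metrics in the table; Delta1/2/3 are the standard threshold accuracies at 1.25, 1.25², and 1.25³. The exact validity mask, depth clipping, and log base used for this card's evaluation are not documented, so the function follows common monocular-depth practice (log10 here) and should be read as an assumption.

```python
import numpy as np

def depth_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    """Conventional monocular-depth metrics over valid pixels (gt > 0).

    A reference sketch of the standard definitions; predictions are assumed
    positive so the log metrics are well defined.
    """
    mask = gt > 0
    pred, gt = pred[mask], gt[mask]
    ratio = np.maximum(pred / gt, gt / pred)  # per-pixel max(pred/gt, gt/pred)
    return {
        "mae": np.mean(np.abs(pred - gt)),
        "rmse": np.sqrt(np.mean((pred - gt) ** 2)),
        "abs_rel": np.mean(np.abs(pred - gt) / gt),
        "log_mae": np.mean(np.abs(np.log10(pred) - np.log10(gt))),
        "log_rmse": np.sqrt(np.mean((np.log10(pred) - np.log10(gt)) ** 2)),
        "delta1": np.mean(ratio < 1.25),
        "delta2": np.mean(ratio < 1.25 ** 2),
        "delta3": np.mean(ratio < 1.25 ** 3),
    }
```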
### Framework versions

- Transformers 4.24.0
- Pytorch 1.12.1+cu116
- Tokenizers 0.13.2