# glpn-nyu-finetuned-diode-221122-030603
This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.3597
- Mae: 0.3054
- Rmse: 0.4481
- Abs Rel: 0.3462
- Log Mae: 0.1256
- Log Rmse: 0.1798
- Delta1: 0.5278
- Delta2: 0.8055
- Delta3: 0.9191
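For reference, the metrics above follow the standard monocular-depth-estimation definitions. The sketch below is an illustrative NumPy implementation of those definitions, not the exact evaluation code used to produce this card; in particular, whether the log metrics use log10 or natural log is an assumption here (log10 is used below).

```python
import numpy as np

def depth_metrics(pred, target):
    """Illustrative depth-estimation metrics (assumed definitions,
    not the exact evaluation script behind this model card)."""
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    err = pred - target
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    abs_rel = np.mean(np.abs(err) / target)          # relative absolute error
    log_err = np.log10(pred) - np.log10(target)      # log10 assumed
    log_mae = np.mean(np.abs(log_err))
    log_rmse = np.sqrt(np.mean(log_err ** 2))
    # delta_n: fraction of pixels whose prediction/target ratio is within 1.25^n
    ratio = np.maximum(pred / target, target / pred)
    return {
        "mae": mae, "rmse": rmse, "abs_rel": abs_rel,
        "log_mae": log_mae, "log_rmse": log_rmse,
        "delta1": np.mean(ratio < 1.25),
        "delta2": np.mean(ratio < 1.25 ** 2),
        "delta3": np.mean(ratio < 1.25 ** 3),
    }
```

For example, a Delta1 of 0.5278 means roughly 53% of evaluated pixels have a predicted depth within a factor of 1.25 of the ground truth.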
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 15
- mixed_precision_training: Native AMP
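The linear scheduler with a 0.2 warmup ratio implies 216 warmup steps out of the 1080 total optimizer steps (15 epochs × 72 steps per epoch, per the results table). A minimal sketch of that schedule, mirroring the behavior of `get_linear_schedule_with_warmup` in Transformers rather than reproducing the exact training code:

```python
# Hyperparameters taken from the list above; steps per epoch from the results table.
BASE_LR = 5e-5
TOTAL_STEPS = 15 * 72                   # num_epochs * steps_per_epoch = 1080
WARMUP_STEPS = int(0.2 * TOTAL_STEPS)   # lr_scheduler_warmup_ratio 0.2 -> 216

def lr_at(step):
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS                      # ramp up from 0
    return BASE_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)  # decay to 0
```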
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.3722 | 1.0 | 72 | 0.9950 | 2.2271 | 2.3100 | 3.3726 | 0.5488 | 0.6025 | 0.0176 | 0.0929 | 0.1687 |
| 0.4936 | 2.0 | 144 | 0.4493 | 0.3961 | 0.5488 | 0.4918 | 0.1648 | 0.2234 | 0.3827 | 0.6877 | 0.8553 |
| 0.4087 | 3.0 | 216 | 0.3971 | 0.3434 | 0.4737 | 0.4250 | 0.1437 | 0.1980 | 0.4393 | 0.7431 | 0.8979 |
| 0.3696 | 4.0 | 288 | 0.3895 | 0.3355 | 0.4652 | 0.4334 | 0.1405 | 0.1968 | 0.4412 | 0.7566 | 0.9174 |
| 0.3621 | 5.0 | 360 | 0.3926 | 0.3432 | 0.4766 | 0.4198 | 0.1416 | 0.1967 | 0.4372 | 0.7591 | 0.9095 |
| 0.3104 | 6.0 | 432 | 0.3615 | 0.3104 | 0.4447 | 0.3630 | 0.1281 | 0.1813 | 0.5032 | 0.7923 | 0.9237 |
| 0.2835 | 7.0 | 504 | 0.3744 | 0.3140 | 0.4506 | 0.3429 | 0.1326 | 0.1824 | 0.4801 | 0.7837 | 0.9135 |
| 0.2308 | 8.0 | 576 | 0.3615 | 0.3013 | 0.4458 | 0.3459 | 0.1243 | 0.1803 | 0.5334 | 0.8139 | 0.9173 |
| 0.2433 | 9.0 | 648 | 0.3609 | 0.3042 | 0.4374 | 0.3570 | 0.1258 | 0.1798 | 0.5223 | 0.8053 | 0.9229 |
| 0.2097 | 10.0 | 720 | 0.3628 | 0.3066 | 0.4312 | 0.3792 | 0.1276 | 0.1815 | 0.4973 | 0.8075 | 0.9254 |
| 0.1761 | 11.0 | 792 | 0.3637 | 0.3111 | 0.4413 | 0.3726 | 0.1286 | 0.1817 | 0.4982 | 0.7979 | 0.9223 |
| 0.22 | 12.0 | 864 | 0.3584 | 0.3077 | 0.4619 | 0.3345 | 0.1258 | 0.1808 | 0.5360 | 0.8007 | 0.9119 |
| 0.2087 | 13.0 | 936 | 0.3614 | 0.3078 | 0.4502 | 0.3513 | 0.1265 | 0.1808 | 0.5211 | 0.8055 | 0.9177 |
| 0.2036 | 14.0 | 1008 | 0.3601 | 0.3086 | 0.4486 | 0.3530 | 0.1267 | 0.1801 | 0.5184 | 0.8023 | 0.9189 |
| 0.2123 | 15.0 | 1080 | 0.3597 | 0.3054 | 0.4481 | 0.3462 | 0.1256 | 0.1798 | 0.5278 | 0.8055 | 0.9191 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu116
- Tokenizers 0.13.2