# glpn-nyu-finetuned-diode-221228-113625
This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.3996
- Mae: 0.4013
- Rmse: 0.6161
- Abs Rel: 0.3535
- Log Mae: 0.1568
- Log Rmse: 0.2121
- Delta1: 0.4381
- Delta2: 0.7025
- Delta3: 0.8196
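
The card itself does not include a usage snippet, so the following is a minimal inference sketch assuming the standard `transformers` GLPN depth-estimation API; the checkpoint id and the image path `example.jpg` are placeholders.

```python
# Minimal inference sketch (not part of the original card). It assumes the
# checkpoint is available under this repository id and that "example.jpg" is a
# local RGB image; both are placeholders.
import torch
from PIL import Image
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation

checkpoint = "glpn-nyu-finetuned-diode-221228-113625"  # hypothetical repo id
feature_extractor = GLPNFeatureExtractor.from_pretrained(checkpoint)
model = GLPNForDepthEstimation.from_pretrained(checkpoint)

image = Image.open("example.jpg")
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    predicted_depth = model(**inputs).predicted_depth  # shape: (batch, height, width)

# Upsample the prediction back to the input resolution for visualization.
depth = torch.nn.functional.interpolate(
    predicted_depth.unsqueeze(1),
    size=image.size[::-1],  # PIL gives (width, height); interpolate expects (height, width)
    mode="bicubic",
    align_corners=False,
).squeeze()
```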
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 75
- mixed_precision_training: Native AMP
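
The training script is not included in this card, but these values map directly onto the standard `transformers.TrainingArguments`; the sketch below is an approximation (the Adam betas and epsilon listed above are the Trainer defaults, so they are not set explicitly).

```python
# Approximate reconstruction of the hyperparameters above using the Trainer API.
# The output_dir is a placeholder; the actual script may have differed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode-221228-113625",
    learning_rate=3e-4,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=75,
    fp16=True,  # "Native AMP" mixed-precision training
)
```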
### Training results
Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
---|---|---|---|---|---|---|---|---|---|---|---|
1.0075 | 1.0 | 72 | 0.4809 | 0.4610 | 0.6461 | 0.5165 | 0.1901 | 0.2446 | 0.3157 | 0.5632 | 0.8017 |
0.4692 | 2.0 | 144 | 0.4432 | 0.4491 | 0.6531 | 0.3950 | 0.1821 | 0.2318 | 0.3347 | 0.6198 | 0.7910 |
0.4635 | 3.0 | 216 | 0.4361 | 0.4278 | 0.6252 | 0.4165 | 0.1715 | 0.2230 | 0.3780 | 0.6285 | 0.8090 |
0.4364 | 4.0 | 288 | 0.4255 | 0.4200 | 0.6222 | 0.3930 | 0.1673 | 0.2198 | 0.3824 | 0.6639 | 0.8206 |
0.4632 | 5.0 | 360 | 0.4376 | 0.4267 | 0.6241 | 0.4144 | 0.1708 | 0.2235 | 0.3806 | 0.6337 | 0.8122 |
0.4703 | 6.0 | 432 | 0.4340 | 0.4315 | 0.6354 | 0.3799 | 0.1740 | 0.2262 | 0.3788 | 0.6275 | 0.7945 |
0.4136 | 7.0 | 504 | 0.4453 | 0.4291 | 0.6368 | 0.4144 | 0.1726 | 0.2306 | 0.3965 | 0.6458 | 0.7965 |
0.394 | 8.0 | 576 | 0.4620 | 0.4440 | 0.6297 | 0.4728 | 0.1808 | 0.2336 | 0.3606 | 0.5832 | 0.7826 |
0.4073 | 9.0 | 648 | 0.4485 | 0.4372 | 0.6244 | 0.4439 | 0.1769 | 0.2266 | 0.3511 | 0.6010 | 0.8002 |
0.3967 | 10.0 | 720 | 0.4523 | 0.4320 | 0.6250 | 0.4606 | 0.1750 | 0.2307 | 0.3676 | 0.6255 | 0.8146 |
0.3797 | 11.0 | 792 | 0.4413 | 0.4360 | 0.6332 | 0.4047 | 0.1769 | 0.2258 | 0.3426 | 0.6277 | 0.8025 |
0.439 | 12.0 | 864 | 0.4544 | 0.4365 | 0.6356 | 0.4215 | 0.1768 | 0.2299 | 0.3561 | 0.6282 | 0.8050 |
0.4666 | 13.0 | 936 | 0.4349 | 0.4278 | 0.6267 | 0.3893 | 0.1729 | 0.2227 | 0.3615 | 0.6375 | 0.8053 |
0.4071 | 14.0 | 1008 | 0.4337 | 0.4220 | 0.6235 | 0.3822 | 0.1692 | 0.2202 | 0.3909 | 0.6376 | 0.8044 |
0.4359 | 15.0 | 1080 | 0.4259 | 0.4193 | 0.6266 | 0.3855 | 0.1669 | 0.2217 | 0.4022 | 0.6601 | 0.8100 |
0.39 | 16.0 | 1152 | 0.4268 | 0.4075 | 0.6161 | 0.3981 | 0.1605 | 0.2184 | 0.4214 | 0.6838 | 0.8205 |
0.3654 | 17.0 | 1224 | 0.4503 | 0.4461 | 0.6615 | 0.3791 | 0.1840 | 0.2417 | 0.3783 | 0.6161 | 0.7636 |
0.4256 | 18.0 | 1296 | 0.4743 | 0.4529 | 0.6319 | 0.5162 | 0.1852 | 0.2398 | 0.3461 | 0.5736 | 0.7490 |
0.372 | 19.0 | 1368 | 0.4462 | 0.4326 | 0.6443 | 0.4068 | 0.1752 | 0.2331 | 0.3875 | 0.6410 | 0.7922 |
0.41 | 20.0 | 1440 | 0.4351 | 0.4500 | 0.6579 | 0.3735 | 0.1849 | 0.2365 | 0.3460 | 0.6021 | 0.7751 |
0.3683 | 21.0 | 1512 | 0.4060 | 0.4084 | 0.6177 | 0.3495 | 0.1605 | 0.2107 | 0.4168 | 0.6702 | 0.8235 |
0.36 | 22.0 | 1584 | 0.4447 | 0.4517 | 0.6667 | 0.3788 | 0.1852 | 0.2414 | 0.3676 | 0.6122 | 0.7572 |
0.4257 | 23.0 | 1656 | 0.4297 | 0.4141 | 0.6180 | 0.4066 | 0.1646 | 0.2201 | 0.4134 | 0.6586 | 0.8105 |
0.4344 | 24.0 | 1728 | 0.4545 | 0.4312 | 0.6237 | 0.4587 | 0.1742 | 0.2296 | 0.3769 | 0.6137 | 0.8008 |
0.4057 | 25.0 | 1800 | 0.4161 | 0.4099 | 0.6175 | 0.3744 | 0.1619 | 0.2144 | 0.4100 | 0.6701 | 0.8231 |
0.3569 | 26.0 | 1872 | 0.4199 | 0.4120 | 0.6181 | 0.3840 | 0.1634 | 0.2177 | 0.4039 | 0.6765 | 0.8165 |
0.3479 | 27.0 | 1944 | 0.4327 | 0.4180 | 0.6174 | 0.4138 | 0.1668 | 0.2205 | 0.3912 | 0.6481 | 0.8230 |
0.3732 | 28.0 | 2016 | 0.4426 | 0.4291 | 0.6236 | 0.4296 | 0.1715 | 0.2237 | 0.3866 | 0.6186 | 0.7911 |
0.3554 | 29.0 | 2088 | 0.4112 | 0.4073 | 0.6180 | 0.3598 | 0.1607 | 0.2146 | 0.4281 | 0.6800 | 0.8189 |
0.3679 | 30.0 | 2160 | 0.4139 | 0.4078 | 0.6190 | 0.3702 | 0.1609 | 0.2165 | 0.4249 | 0.6823 | 0.8110 |
0.3703 | 31.0 | 2232 | 0.4143 | 0.4097 | 0.6176 | 0.3730 | 0.1618 | 0.2156 | 0.4153 | 0.6782 | 0.8162 |
0.3605 | 32.0 | 2304 | 0.4179 | 0.4177 | 0.6303 | 0.3711 | 0.1654 | 0.2210 | 0.4062 | 0.6823 | 0.8022 |
0.3761 | 33.0 | 2376 | 0.4027 | 0.4070 | 0.6222 | 0.3441 | 0.1595 | 0.2127 | 0.4371 | 0.6834 | 0.8125 |
0.3352 | 34.0 | 2448 | 0.4077 | 0.4029 | 0.6134 | 0.3692 | 0.1581 | 0.2130 | 0.4322 | 0.6855 | 0.8273 |
0.336 | 35.0 | 2520 | 0.4212 | 0.4246 | 0.6328 | 0.3780 | 0.1696 | 0.2238 | 0.3844 | 0.6716 | 0.8005 |
0.3414 | 36.0 | 2592 | 0.4139 | 0.4132 | 0.6241 | 0.3720 | 0.1639 | 0.2184 | 0.4162 | 0.6714 | 0.8092 |
0.3416 | 37.0 | 2664 | 0.4183 | 0.4101 | 0.6149 | 0.3844 | 0.1625 | 0.2159 | 0.4157 | 0.6649 | 0.8172 |
0.3765 | 38.0 | 2736 | 0.4207 | 0.4120 | 0.6199 | 0.3926 | 0.1635 | 0.2193 | 0.4066 | 0.6767 | 0.8154 |
0.3548 | 39.0 | 2808 | 0.4096 | 0.4056 | 0.6167 | 0.3667 | 0.1593 | 0.2138 | 0.4244 | 0.6905 | 0.8213 |
0.3822 | 40.0 | 2880 | 0.4084 | 0.4061 | 0.6180 | 0.3653 | 0.1593 | 0.2134 | 0.4246 | 0.6891 | 0.8249 |
0.3505 | 41.0 | 2952 | 0.4041 | 0.4118 | 0.6271 | 0.3515 | 0.1620 | 0.2156 | 0.4279 | 0.6872 | 0.8098 |
0.3514 | 42.0 | 3024 | 0.4033 | 0.4006 | 0.6185 | 0.3558 | 0.1563 | 0.2132 | 0.4510 | 0.7030 | 0.8181 |
0.3459 | 43.0 | 3096 | 0.4061 | 0.4051 | 0.6196 | 0.3631 | 0.1587 | 0.2147 | 0.4282 | 0.7019 | 0.8206 |
0.3213 | 44.0 | 3168 | 0.4041 | 0.4093 | 0.6232 | 0.3539 | 0.1605 | 0.2148 | 0.4301 | 0.6893 | 0.8168 |
0.3346 | 45.0 | 3240 | 0.4103 | 0.4023 | 0.6151 | 0.3705 | 0.1578 | 0.2141 | 0.4339 | 0.6907 | 0.8219 |
0.3585 | 46.0 | 3312 | 0.4054 | 0.3953 | 0.6096 | 0.3627 | 0.1542 | 0.2113 | 0.4524 | 0.7052 | 0.8251 |
0.3799 | 47.0 | 3384 | 0.4063 | 0.4100 | 0.6230 | 0.3574 | 0.1616 | 0.2165 | 0.4263 | 0.6821 | 0.8113 |
0.3235 | 48.0 | 3456 | 0.4051 | 0.4004 | 0.6117 | 0.3692 | 0.1571 | 0.2123 | 0.4364 | 0.6928 | 0.8268 |
0.3628 | 49.0 | 3528 | 0.4051 | 0.3985 | 0.6115 | 0.3622 | 0.1560 | 0.2111 | 0.4486 | 0.6932 | 0.8234 |
0.3399 | 50.0 | 3600 | 0.4145 | 0.4059 | 0.6184 | 0.3789 | 0.1598 | 0.2169 | 0.4260 | 0.6977 | 0.8194 |
0.3288 | 51.0 | 3672 | 0.4089 | 0.4057 | 0.6172 | 0.3692 | 0.1597 | 0.2153 | 0.4300 | 0.6939 | 0.8198 |
0.3231 | 52.0 | 3744 | 0.4104 | 0.4126 | 0.6261 | 0.3643 | 0.1628 | 0.2185 | 0.4296 | 0.6826 | 0.8104 |
0.3238 | 53.0 | 3816 | 0.4107 | 0.4023 | 0.6170 | 0.3745 | 0.1580 | 0.2167 | 0.4362 | 0.7031 | 0.8216 |
0.3253 | 54.0 | 3888 | 0.4056 | 0.4006 | 0.6135 | 0.3673 | 0.1570 | 0.2134 | 0.4400 | 0.7034 | 0.8221 |
0.3383 | 55.0 | 3960 | 0.4053 | 0.4060 | 0.6187 | 0.3598 | 0.1593 | 0.2141 | 0.4310 | 0.6938 | 0.8187 |
0.3279 | 56.0 | 4032 | 0.4118 | 0.4003 | 0.6130 | 0.3797 | 0.1569 | 0.2153 | 0.4388 | 0.7040 | 0.8212 |
0.32 | 57.0 | 4104 | 0.4042 | 0.4001 | 0.6185 | 0.3566 | 0.1560 | 0.2123 | 0.4470 | 0.7070 | 0.8227 |
0.3282 | 58.0 | 4176 | 0.4035 | 0.4010 | 0.6173 | 0.3533 | 0.1568 | 0.2126 | 0.4438 | 0.7037 | 0.8208 |
0.3271 | 59.0 | 4248 | 0.4015 | 0.4018 | 0.6168 | 0.3551 | 0.1570 | 0.2123 | 0.4334 | 0.7095 | 0.8201 |
0.3127 | 60.0 | 4320 | 0.4029 | 0.3975 | 0.6142 | 0.3590 | 0.1549 | 0.2113 | 0.4420 | 0.7082 | 0.8245 |
0.3142 | 61.0 | 4392 | 0.4044 | 0.4031 | 0.6163 | 0.3585 | 0.1577 | 0.2126 | 0.4273 | 0.7034 | 0.8214 |
0.3059 | 62.0 | 4464 | 0.4034 | 0.4033 | 0.6151 | 0.3624 | 0.1580 | 0.2127 | 0.4256 | 0.7038 | 0.8223 |
0.3133 | 63.0 | 4536 | 0.4028 | 0.4066 | 0.6205 | 0.3554 | 0.1594 | 0.2137 | 0.4235 | 0.6991 | 0.8187 |
0.3086 | 64.0 | 4608 | 0.4023 | 0.3982 | 0.6117 | 0.3588 | 0.1556 | 0.2108 | 0.4381 | 0.7002 | 0.8248 |
0.3143 | 65.0 | 4680 | 0.4036 | 0.4084 | 0.6250 | 0.3566 | 0.1600 | 0.2157 | 0.4323 | 0.6946 | 0.8094 |
0.3031 | 66.0 | 4752 | 0.4012 | 0.3999 | 0.6170 | 0.3551 | 0.1559 | 0.2122 | 0.4458 | 0.7044 | 0.8200 |
0.3279 | 67.0 | 4824 | 0.4031 | 0.4001 | 0.6160 | 0.3609 | 0.1562 | 0.2129 | 0.4421 | 0.7042 | 0.8205 |
0.3173 | 68.0 | 4896 | 0.4000 | 0.3989 | 0.6141 | 0.3569 | 0.1557 | 0.2120 | 0.4456 | 0.7040 | 0.8226 |
0.3203 | 69.0 | 4968 | 0.3989 | 0.3995 | 0.6153 | 0.3545 | 0.1556 | 0.2114 | 0.4421 | 0.7069 | 0.8215 |
0.3165 | 70.0 | 5040 | 0.3984 | 0.3993 | 0.6144 | 0.3513 | 0.1558 | 0.2111 | 0.4450 | 0.7027 | 0.8222 |
0.3278 | 71.0 | 5112 | 0.3993 | 0.4032 | 0.6191 | 0.3509 | 0.1574 | 0.2124 | 0.4386 | 0.7007 | 0.8184 |
0.3232 | 72.0 | 5184 | 0.3990 | 0.4000 | 0.6149 | 0.3534 | 0.1561 | 0.2112 | 0.4396 | 0.7018 | 0.8223 |
0.3089 | 73.0 | 5256 | 0.3996 | 0.4022 | 0.6172 | 0.3526 | 0.1571 | 0.2121 | 0.4370 | 0.7011 | 0.8197 |
0.3118 | 74.0 | 5328 | 0.3994 | 0.4016 | 0.6164 | 0.3530 | 0.1570 | 0.2121 | 0.4375 | 0.7026 | 0.8195 |
0.3161 | 75.0 | 5400 | 0.3996 | 0.4013 | 0.6161 | 0.3535 | 0.1568 | 0.2121 | 0.4381 | 0.7025 | 0.8196 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
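
To confirm a local environment matches the versions listed above, a purely illustrative check:

```python
# Illustrative only: print installed versions to compare against those listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```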