glpn-nyu-finetuned-diode-221223-094145

This model is a fine-tuned version of vinvino02/glpn-nyu on the diode-subset dataset. It achieves the following results on the evaluation set (Delta1/2/3 are threshold accuracies; see the sketch after the list):

  • Loss: 0.4077
  • MAE: 0.4032
  • RMSE: 0.6201
  • Abs Rel: 0.3554
  • Log MAE: 0.1594
  • Log RMSE: 0.2173
  • Delta1: 0.4530
  • Delta2: 0.6868
  • Delta3: 0.8071
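
Delta1, Delta2, and Delta3 are the standard depth-estimation threshold accuracies: the fraction of pixels whose predicted-to-ground-truth ratio (in either direction) falls under 1.25, 1.25², and 1.25³. A minimal PyTorch sketch (the function name is illustrative, not part of this repository):

```python
import torch

def threshold_accuracy(pred: torch.Tensor, target: torch.Tensor, threshold: float) -> float:
    """Fraction of pixels where max(pred/target, target/pred) < threshold."""
    ratio = torch.max(pred / target, target / pred)
    return (ratio < threshold).float().mean().item()

# Delta1, Delta2, Delta3 use thresholds 1.25, 1.25**2, and 1.25**3:
# delta1 = threshold_accuracy(predicted_depth, ground_truth, 1.25)
```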

Model description

GLPN (Global-Local Path Networks) is a monocular depth estimation architecture that pairs a hierarchical transformer encoder with a lightweight decoder. This checkpoint continues training from the NYU Depth v2 checkpoint vinvino02/glpn-nyu; no further details were provided.

Intended uses & limitations

The model is intended for monocular (single-image) depth estimation: it predicts a dense depth map from one RGB image. Limitations were not documented; as with most depth models, accuracy is best on scenes resembling the fine-tuning data.
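
A minimal inference sketch using the Transformers GLPN classes; the checkpoint id and image path below are illustrative, so adjust them to wherever the weights are hosted:

```python
import torch
from PIL import Image
from transformers import GLPNFeatureExtractor, GLPNForDepthEstimation

# Illustrative checkpoint id; point this at the actual repository for this fine-tune.
checkpoint = "glpn-nyu-finetuned-diode-221223-094145"

feature_extractor = GLPNFeatureExtractor.from_pretrained(checkpoint)
model = GLPNForDepthEstimation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg")  # any RGB image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    predicted_depth = model(**inputs).predicted_depth  # (batch, height, width)

# Upsample the prediction back to the input resolution for visualization.
depth = torch.nn.functional.interpolate(
    predicted_depth.unsqueeze(1),  # add a channel dim for interpolate
    size=image.size[::-1],         # PIL size is (width, height)
    mode="bicubic",
    align_corners=False,
).squeeze()
```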

Training and evaluation data

Fine-tuning and evaluation used the diode-subset dataset, a subset of DIODE (Dense Indoor and Outdoor DEpth), which pairs RGB images with dense depth maps. The exact split was not documented; a loading sketch follows.
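
A hedged loading sketch with the datasets library; the Hub id below is hypothetical, so substitute the actual dataset path:

```python
from datasets import load_dataset

# Hypothetical Hub id for the DIODE subset; replace with the real dataset path.
dataset = load_dataset("diode-subset")
print(dataset)  # inspect the available splits and columns
```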

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 24
  • eval_batch_size: 48
  • seed: 2022
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.15
  • num_epochs: 50
  • mixed_precision_training: Native AMP
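
A minimal sketch of how these values map onto transformers.TrainingArguments, assuming the standard Trainer API was used (output_dir is illustrative):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="glpn-nyu-finetuned-diode",  # illustrative output path
    learning_rate=1e-4,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=48,
    seed=2022,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.15,
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed-precision training
)
```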

Training results

| Training Loss | Epoch | Step | Validation Loss | MAE | RMSE | Abs Rel | Log MAE | Log RMSE | Delta1 | Delta2 | Delta3 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0433 | 1.0 | 72 | 0.5885 | 0.5648 | 0.7732 | 0.4665 | 0.2691 | 0.3222 | 0.2134 | 0.4070 | 0.5668 |
| 0.4529 | 2.0 | 144 | 0.4284 | 0.4217 | 0.6232 | 0.3846 | 0.1702 | 0.2214 | 0.3935 | 0.6428 | 0.7958 |
| 0.415 | 3.0 | 216 | 0.4221 | 0.4049 | 0.6164 | 0.3800 | 0.1603 | 0.2180 | 0.4499 | 0.6735 | 0.8070 |
| 0.3643 | 4.0 | 288 | 0.4430 | 0.4172 | 0.6176 | 0.4419 | 0.1671 | 0.2265 | 0.4208 | 0.6489 | 0.8077 |
| 0.3927 | 5.0 | 360 | 0.4186 | 0.4072 | 0.6199 | 0.3646 | 0.1623 | 0.2199 | 0.4362 | 0.6675 | 0.8077 |
| 0.389 | 6.0 | 432 | 0.4093 | 0.4018 | 0.6168 | 0.3515 | 0.1592 | 0.2155 | 0.4499 | 0.6753 | 0.8111 |
| 0.3521 | 7.0 | 504 | 0.4320 | 0.4112 | 0.6165 | 0.4061 | 0.1646 | 0.2226 | 0.4358 | 0.6569 | 0.8062 |
| 0.3324 | 8.0 | 576 | 0.4056 | 0.3977 | 0.6132 | 0.3570 | 0.1566 | 0.2148 | 0.4556 | 0.7006 | 0.8157 |
| 0.3183 | 9.0 | 648 | 0.4187 | 0.4036 | 0.6151 | 0.3667 | 0.1607 | 0.2172 | 0.4472 | 0.6664 | 0.8095 |
| 0.3052 | 10.0 | 720 | 0.4149 | 0.4031 | 0.6171 | 0.3683 | 0.1601 | 0.2191 | 0.4469 | 0.6815 | 0.8073 |
| 0.3071 | 11.0 | 792 | 0.4168 | 0.4111 | 0.6252 | 0.3587 | 0.1647 | 0.2218 | 0.4322 | 0.6643 | 0.8019 |
| 0.3358 | 12.0 | 864 | 0.4161 | 0.4029 | 0.6171 | 0.3650 | 0.1600 | 0.2189 | 0.4507 | 0.6789 | 0.8092 |
| 0.3385 | 13.0 | 936 | 0.4116 | 0.4051 | 0.6215 | 0.3565 | 0.1609 | 0.2190 | 0.4478 | 0.6770 | 0.8053 |
| 0.316 | 14.0 | 1008 | 0.4092 | 0.3982 | 0.6138 | 0.3618 | 0.1569 | 0.2157 | 0.4577 | 0.6951 | 0.8109 |
| 0.3301 | 15.0 | 1080 | 0.4159 | 0.4056 | 0.6199 | 0.3654 | 0.1619 | 0.2204 | 0.4462 | 0.6743 | 0.8056 |
| 0.3076 | 16.0 | 1152 | 0.4130 | 0.4051 | 0.6200 | 0.3612 | 0.1612 | 0.2195 | 0.4470 | 0.6787 | 0.8076 |
| 0.3001 | 17.0 | 1224 | 0.4134 | 0.4071 | 0.6244 | 0.3579 | 0.1621 | 0.2210 | 0.4487 | 0.6771 | 0.8045 |
| 0.3293 | 18.0 | 1296 | 0.4091 | 0.4031 | 0.6182 | 0.3552 | 0.1601 | 0.2174 | 0.4501 | 0.6786 | 0.8065 |
| 0.3023 | 19.0 | 1368 | 0.4089 | 0.3990 | 0.6143 | 0.3633 | 0.1573 | 0.2160 | 0.4518 | 0.6966 | 0.8137 |
| 0.3288 | 20.0 | 1440 | 0.4067 | 0.4006 | 0.6166 | 0.3538 | 0.1580 | 0.2155 | 0.4529 | 0.6895 | 0.8122 |
| 0.2988 | 21.0 | 1512 | 0.4061 | 0.4060 | 0.6221 | 0.3491 | 0.1614 | 0.2183 | 0.4480 | 0.6777 | 0.8059 |
| 0.3037 | 22.0 | 1584 | 0.4081 | 0.4025 | 0.6204 | 0.3582 | 0.1587 | 0.2174 | 0.4523 | 0.6905 | 0.8093 |
| 0.3284 | 23.0 | 1656 | 0.4080 | 0.4062 | 0.6209 | 0.3545 | 0.1615 | 0.2184 | 0.4409 | 0.6794 | 0.8060 |
| 0.3261 | 24.0 | 1728 | 0.4092 | 0.4044 | 0.6208 | 0.3562 | 0.1602 | 0.2183 | 0.4512 | 0.6807 | 0.8061 |
| 0.3039 | 25.0 | 1800 | 0.4079 | 0.4005 | 0.6159 | 0.3576 | 0.1585 | 0.2167 | 0.4611 | 0.6827 | 0.8095 |
| 0.2843 | 26.0 | 1872 | 0.4072 | 0.4045 | 0.6212 | 0.3548 | 0.1603 | 0.2182 | 0.4502 | 0.6856 | 0.8079 |
| 0.2828 | 27.0 | 1944 | 0.4110 | 0.4089 | 0.6248 | 0.3578 | 0.1631 | 0.2211 | 0.4419 | 0.6756 | 0.8031 |
| 0.3212 | 28.0 | 2016 | 0.4063 | 0.3981 | 0.6148 | 0.3547 | 0.1569 | 0.2157 | 0.4651 | 0.6891 | 0.8102 |
| 0.2936 | 29.0 | 2088 | 0.4087 | 0.4099 | 0.6243 | 0.3547 | 0.1638 | 0.2202 | 0.4366 | 0.6711 | 0.8038 |
| 0.2999 | 30.0 | 2160 | 0.4067 | 0.3996 | 0.6161 | 0.3547 | 0.1581 | 0.2166 | 0.4624 | 0.6880 | 0.8082 |
| 0.3052 | 31.0 | 2232 | 0.4044 | 0.3983 | 0.6149 | 0.3517 | 0.1571 | 0.2149 | 0.4591 | 0.6923 | 0.8124 |
| 0.3082 | 32.0 | 2304 | 0.4069 | 0.4044 | 0.6224 | 0.3530 | 0.1597 | 0.2179 | 0.4533 | 0.6872 | 0.8058 |
| 0.3077 | 33.0 | 2376 | 0.4072 | 0.4061 | 0.6218 | 0.3545 | 0.1612 | 0.2189 | 0.4462 | 0.6821 | 0.8057 |
| 0.3043 | 34.0 | 2448 | 0.4063 | 0.4002 | 0.6170 | 0.3551 | 0.1579 | 0.2166 | 0.4575 | 0.6932 | 0.8101 |
| 0.2933 | 35.0 | 2520 | 0.4097 | 0.4054 | 0.6228 | 0.3562 | 0.1606 | 0.2188 | 0.4485 | 0.6857 | 0.8051 |
| 0.2996 | 36.0 | 2592 | 0.4059 | 0.4025 | 0.6194 | 0.3544 | 0.1590 | 0.2171 | 0.4522 | 0.6902 | 0.8087 |
| 0.3123 | 37.0 | 2664 | 0.4058 | 0.4024 | 0.6207 | 0.3538 | 0.1588 | 0.2171 | 0.4573 | 0.6893 | 0.8079 |
| 0.318 | 38.0 | 2736 | 0.4069 | 0.4028 | 0.6187 | 0.3555 | 0.1594 | 0.2172 | 0.4528 | 0.6876 | 0.8075 |
| 0.2938 | 39.0 | 2808 | 0.4065 | 0.4031 | 0.6228 | 0.3557 | 0.1584 | 0.2167 | 0.4545 | 0.6902 | 0.8096 |
| 0.294 | 40.0 | 2880 | 0.4059 | 0.4003 | 0.6170 | 0.3570 | 0.1577 | 0.2162 | 0.4576 | 0.6940 | 0.8098 |
| 0.3139 | 41.0 | 2952 | 0.4072 | 0.4048 | 0.6202 | 0.3556 | 0.1605 | 0.2181 | 0.4484 | 0.6847 | 0.8075 |
| 0.2953 | 42.0 | 3024 | 0.4080 | 0.4042 | 0.6208 | 0.3560 | 0.1598 | 0.2176 | 0.4514 | 0.6855 | 0.8067 |
| 0.3093 | 43.0 | 3096 | 0.4076 | 0.4040 | 0.6216 | 0.3553 | 0.1596 | 0.2180 | 0.4532 | 0.6871 | 0.8076 |
| 0.2843 | 44.0 | 3168 | 0.4073 | 0.4058 | 0.6225 | 0.3547 | 0.1609 | 0.2183 | 0.4482 | 0.6816 | 0.8070 |
| 0.3064 | 45.0 | 3240 | 0.4069 | 0.4047 | 0.6215 | 0.3545 | 0.1601 | 0.2179 | 0.4512 | 0.6856 | 0.8076 |
| 0.3027 | 46.0 | 3312 | 0.4073 | 0.4042 | 0.6228 | 0.3557 | 0.1596 | 0.2179 | 0.4542 | 0.6880 | 0.8075 |
| 0.304 | 47.0 | 3384 | 0.4069 | 0.4063 | 0.6239 | 0.3546 | 0.1609 | 0.2186 | 0.4481 | 0.6829 | 0.8059 |
| 0.297 | 48.0 | 3456 | 0.4063 | 0.4032 | 0.6202 | 0.3550 | 0.1590 | 0.2171 | 0.4543 | 0.6879 | 0.8089 |
| 0.3036 | 49.0 | 3528 | 0.4057 | 0.4031 | 0.6217 | 0.3545 | 0.1588 | 0.2170 | 0.4551 | 0.6896 | 0.8093 |
| 0.2949 | 50.0 | 3600 | 0.4077 | 0.4032 | 0.6201 | 0.3554 | 0.1594 | 0.2173 | 0.4530 | 0.6868 | 0.8071 |

Framework versions

  • Transformers 4.24.0
  • PyTorch 1.12.1+cu116
  • Datasets 2.8.0
  • Tokenizers 0.13.2