Hi, thank you for your excellent work on HRDecoder!
I have a question regarding the memory computation: is it measured with the batch size set to 1, and only during the test phase? Additionally, since the DDR and IDRiD datasets have different input resolutions and crop counts, which one is used as the standard for the memory computation?
Also, HRDecoder actually processes a 960x960 downsampled input when hr_scale is set to (960, 960). Which input size did you use for the other models in the comparisons in Figure 1a and Table 2?
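To make the concern concrete, here is a rough back-of-the-envelope estimate of how much the input size alone changes tensor memory. This is just illustrative arithmetic (the helper name and the assumption of float32 inputs and IDRiD's native 4288x2848 resolution are mine, not from the paper):

```python
def input_tensor_mb(h, w, channels=3, batch=1, bytes_per_elem=4):
    """Memory of a single float32 image tensor in MiB (assumed NCHW layout)."""
    return batch * channels * h * w * bytes_per_elem / 2**20

# Hypothetical comparison: IDRiD native resolution vs. the 960x960 input
# that HRDecoder sees when hr_scale=(960, 960).
full = input_tensor_mb(2848, 4288)
down = input_tensor_mb(960, 960)
print(f"full-res input: {full:.1f} MiB, downsampled: {down:.1f} MiB")
```

The input tensor alone differs by more than an order of magnitude, which is why it matters which input size the baseline models were given.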
Thank you for your help!