how are the number of encoder/decoder hidden units calculated? #15

@qiutiantxwd

Description

Hi authors, thanks a lot for sharing the code.
For the MNIST HQA, what do the hidden units for each layer denote?
In the code, the number of hidden units for each layer's encoder is [16, 16, 32, 64, 128]. Am I right that these are the numbers of output channels of the middle layer of each 3-layer feedforward encoder network?
For the decoders, it would be the number of output channels of the first layer after the quantization layer, which is [16, 64, 256, 512, 1024].
Is this the case?

Secondly, in Table 10 of the paper, the number of hidden units for each layer's encoder is [16, 16, 32, 48, 80], while for the decoders it's [16, 32, 48, 80, 128]. These don't match the numbers in the code — am I reading them incorrectly? Thanks!
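To make my reading concrete, here is a minimal sketch (hypothetical, not the authors' actual code) of what I understand a single layer's 3-layer feedforward encoder to look like, where `hidden` is the middle layer's output channel count and `out_ch` (the post-quantization width) is a placeholder I made up for illustration:

```python
import torch
import torch.nn as nn

def make_encoder(in_ch: int, hidden: int, out_ch: int) -> nn.Sequential:
    """Hypothetical 3-layer conv encoder; `hidden` = middle layer's output channels."""
    return nn.Sequential(
        nn.Conv2d(in_ch, hidden, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),  # middle layer: `hidden` channels
        nn.ReLU(),
        nn.Conv2d(hidden, out_ch, kernel_size=3, padding=1),
    )

# e.g. bottom HQA layer on MNIST, with hidden=16 per the list in the code;
# in_ch=1 (grayscale) and out_ch=32 are illustrative assumptions
enc = make_encoder(in_ch=1, hidden=16, out_ch=32)
x = torch.randn(1, 1, 28, 28)
print(enc(x).shape)  # torch.Size([1, 32, 28, 28])
```

Is this the structure that [16, 16, 32, 64, 128] parameterizes, one entry per HQA layer?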
