
A question about the training process #15

@HX-idiot

Description


Hello brother, this is really good work! But what confuses me is that in training step 2, when we train the weight generator, you keep training weight_base as well. It seems weight_base was already trained well in step 1 (the pretraining step), so is there a special reason for this operation? How can we ensure compatibility between weight_base and the generated parameters, and among the generated parameters themselves?
Also, can this method be used when N is very large (N-way K-shot)? In extreme cases N might even be larger than the number of weight_base entries. If possible, I hope you can give me some suggestions.
Thank you~
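
To make my question concrete, here is a minimal sketch of how I understand the two-stage training, written in PyTorch. The names `weight_base` and `weight_generator`, the simple linear generator, and the cosine scale of 10.0 are all placeholders I made up for illustration, not your actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sizes, for illustration only.
NUM_BASE_CLASSES, FEAT_DIM = 64, 128

# Stage 1 (pretraining): weight_base is learned as an ordinary
# per-class weight matrix over the base classes.
weight_base = nn.Parameter(torch.randn(NUM_BASE_CLASSES, FEAT_DIM))

# Stage 2: a weight generator maps averaged support features of a
# novel class to a classification weight vector. A plain linear
# layer stands in here for whatever the repo actually uses.
weight_generator = nn.Linear(FEAT_DIM, FEAT_DIM)

# This line is the core of my question: weight_base stays in the
# optimizer, so it keeps being updated jointly with the generator.
optimizer = torch.optim.SGD(
    list(weight_generator.parameters()) + [weight_base], lr=0.01
)

def episode_loss(support_feats, query_feats, query_labels):
    """One N-way episode: generate novel-class weights, then classify
    queries against base + generated weights via cosine similarity."""
    # support_feats: (N, K, FEAT_DIM); average over the K shots.
    proto = support_feats.mean(dim=1)                   # (N, FEAT_DIM)
    novel_w = weight_generator(proto)                   # (N, FEAT_DIM)
    all_w = torch.cat([weight_base, novel_w], dim=0)    # (base + N, FEAT_DIM)
    logits = F.normalize(query_feats, dim=-1) @ F.normalize(all_w, dim=-1).t()
    return F.cross_entropy(logits * 10.0, query_labels)  # 10.0 = assumed scale
```

If I read step 2 correctly, keeping `weight_base` in the optimizer is what lets the base weights and the generated weights stay on a comparable scale, which is exactly the compatibility I am asking about.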
