When I run the MLDG method on the PACS dataset with the default SGD optimizer, I get:

===========epoch 1===========
total_loss:0.8481
train_acc:0.7129,valid_acc:0.7092,target_acc:0.5942
total cost time: 179.1731

But after switching SGD to Adam, epoch 1 looks like this:

===========epoch 1===========
total_loss:4.0192
train_acc:0.1492,valid_acc:0.1555,target_acc:0.1401
total cost time: 203.3469
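For reference, this is a minimal sketch of the optimizer swap being compared, assuming a standard PyTorch setup where `params` is the parameter-group list built by `get_params`; the helper name `build_optimizer` and the `args.momentum` / `args.weight_decay` fields are illustrative, not the repo's actual code.

```python
import torch

# Minimal sketch of the SGD -> Adam swap described above (argument names are
# assumptions, not the exact DeepDG code). `params` is the list of parameter
# groups returned by get_params.
def build_optimizer(params, args, use_adam=False):
    if use_adam:
        # Adam adapts per-parameter step sizes and is usually run with a
        # smaller base learning rate than SGD with momentum.
        return torch.optim.Adam(params, lr=args.lr,
                                weight_decay=args.weight_decay)
    # Default used in the first log above: SGD with momentum.
    return torch.optim.SGD(params, lr=args.lr, momentum=args.momentum,
                           weight_decay=args.weight_decay, nesterov=True)
```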
Also, in the get_params method in algs/opt.py there is the expression args.lr_decay1 * initlr. What is the purpose of multiplying the initial learning rate by args.lr_decay1?
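For context, the sketch below shows the usual PyTorch per-group learning-rate pattern that a term like `args.lr_decay1 * initlr` implies; `lr_decay2`, `featurizer`, and `classifier` are assumptions based on common domain-generalization codebases, not a copy of the repo's get_params.

```python
# Hedged sketch of the kind of parameter grouping get_params appears to do
# (check algs/opt.py for the exact code): the pretrained featurizer and the
# freshly initialized classifier get different learning rates, with
# args.lr_decay1 / args.lr_decay2 scaling the base learning rate per group.
def get_params_sketch(alg, args):
    initlr = args.lr  # assumed to be the base learning rate
    return [
        # Backbone: often a smaller lr (lr_decay1 < 1) so the pretrained
        # weights are fine-tuned gently rather than overwritten.
        {'params': alg.featurizer.parameters(), 'lr': args.lr_decay1 * initlr},
        # Classifier head: typically the full (or a larger) learning rate.
        {'params': alg.classifier.parameters(), 'lr': args.lr_decay2 * initlr},
    ]
```

The returned list can be passed directly to `torch.optim.SGD(...)` or `torch.optim.Adam(...)`, and each group then trains with its own effective learning rate.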
Could the authors please help clarify this?
I have received your email and will handle it as soon as possible.
I am running into a similar problem.