Thanks for creating this repo! It helps me a lot during training.
I wanted to ask/suggest whether it would be possible to change the optimizer's initialization to be consistent with torch.optim.Optimizer. The documentation (https://pytorch.org/docs/stable/optim.html) states that an optimizer should be given an iterable of params at initialization. Because this implementation doesn't follow that convention, it is currently impossible to use it with, for example, Horovod's DistributedOptimizer.
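For reference, here is a minimal sketch of the constructor convention I mean (MyOptimizer is a hypothetical name, and the SGD-style update in step is just a placeholder for the real logic):

```python
import torch


class MyOptimizer(torch.optim.Optimizer):
    """Sketch of the requested convention: the first positional
    argument is an iterable of parameters (or parameter-group dicts),
    matching torch.optim.Optimizer."""

    def __init__(self, params, lr=1e-3):
        if lr <= 0.0:
            raise ValueError(f"Invalid learning rate: {lr}")
        defaults = dict(lr=lr)
        # Base class stores params/defaults and builds self.param_groups
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    # Plain SGD update as a placeholder
                    p.add_(p.grad, alpha=-group["lr"])
        return loss
```

With that signature, one can construct the optimizer as opt = MyOptimizer(model.parameters(), lr=0.01), and (if I understand Horovod's API correctly) wrap it via hvd.DistributedOptimizer(opt, named_parameters=model.named_parameters()).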
Thank you!