regression test #234
Conversation
Summary:
- change the tests to pass multiple outer optimizers
Summary:
- move the training loop to a separate file
- convert it into a class so that methods can be overridden without having to duplicate code
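A minimal sketch of the refactor described above: pulling the training loop into a class whose hooks tests can override, rather than duplicating the whole loop. The names (`TrainLoop`, `setup`, `step`, `MockedLoop`) are illustrative, not torchft's actual API.

```python
class TrainLoop:
    """Training loop as a class so tests can override individual steps."""

    def __init__(self, num_steps: int):
        self.num_steps = num_steps
        self.history = []

    def setup(self) -> None:
        # Hook for model/optimizer construction; override in tests.
        self.state = 0.0

    def step(self, i: int) -> float:
        # One training step; tests override this to mock updates.
        self.state += 1.0
        return self.state

    def run(self) -> list:
        self.setup()
        for i in range(self.num_steps):
            self.history.append(self.step(i))
        return self.history


class MockedLoop(TrainLoop):
    # A test subclass overrides only the step it cares about;
    # setup() and run() are inherited unchanged.
    def step(self, i: int) -> float:
        self.state += 0.5
        return self.state
```

With this shape, a mocked-update test subclasses the loop instead of copy-pasting it.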
Summary:
- since we made the simplifying assumption that there will only ever be 1 in-flight fragment, we can simplify some of the logic, in particular getting rid of the local step in the manager state
- we'll just use the manager's step to determine which fragment to sync
- this also lets us easily support heterogeneous hardware by tuning the `sync_every` setting, so that slower/faster machines perform fewer/more local steps before they sync
- we can also perform quorum right before preparing a fragment sync, which ensures that all replicas will have the same max step and sync the same fragment
- fix some numeric issues:
  - the sign of the pseudogradient
  - in-place lerp when mixing local and global parameters
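An illustrative sketch (not torchft's actual code) of three of the points above: deriving the fragment to sync from the manager's step alone, the in-place lerp used to mix local and global parameters, and the pseudogradient sign convention. All function names here are hypothetical.

```python
def fragment_to_sync(manager_step: int, num_fragments: int) -> int:
    # With only one in-flight fragment, the manager's step alone
    # determines which fragment syncs next; no separate local step needed.
    return manager_step % num_fragments


def mix_inplace(local: list, global_: list, alpha: float) -> None:
    # In-place lerp: local <- local + alpha * (global - local).
    # Mutating in place avoids allocating a new parameter buffer.
    for i in range(len(local)):
        local[i] += alpha * (global_[i] - local[i])


def pseudogradient(before: list, after: list) -> list:
    # The pseudogradient points from the updated weights back toward the
    # pre-sync weights (before - after), so an outer optimizer taking a
    # descent step moves toward the updated weights.
    return [b - a for b, a in zip(before, after)]
```

In the real code the mixing would be a tensor-level in-place op (e.g. `Tensor.lerp_`) rather than a Python loop.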
Summary:
- add a test that uses fixtures to validate against previous runs of the test
- the fixtures can be written using `WRITE_FIXTURE=true pytest -vs ./torchft/test_diloco_mocked_updates.py`
- when writing fixtures, the test also numerically validates the implementation of streaming DiLoCo
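A hedged sketch of the fixture pattern described above: when `WRITE_FIXTURE=true` is set, the test records its outputs to a JSON file; otherwise it compares the current outputs against the recorded ones. The helper name and file layout are assumptions for illustration, not the PR's actual code.

```python
import json
import os


def check_against_fixture(values: dict, path: str) -> bool:
    """Write `values` as a JSON fixture, or compare against an existing one.

    Returns True if the fixture was (re)written or the values match.
    """
    if os.environ.get("WRITE_FIXTURE", "").lower() == "true":
        # Record mode: persist the current run's values as the fixture.
        with open(path, "w") as f:
            json.dump(values, f, indent=2, sort_keys=True)
        return True
    # Check mode: compare against the previously recorded run.
    with open(path) as f:
        expected = json.load(f)
    return expected == values
```

A test would call this with a dict of per-step weights keyed by step number, matching the JSON snippet quoted in the review below.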
Should we move these to a /test/* dir? It might get real annoying to grep through torchft/ if there's a bunch of testing specific autogenned files
"0": { | ||
"layers.0.weight": [ | ||
[ | ||
1.0 |
Is it expected that all of these are whole numbers? I would expect the weights to be a lot more varied.
Summary:
`WRITE_FIXTURE=true pytest -vs ./torchft/test_diloco_mocked_updates.py`
Stack created with Sapling. Best reviewed with ReviewStack.