Initial incorporation of a general training loop (tensorflow#586)
This is the initial incorporation of a general callback-based training loop, originally designed by @sgugger and proposed as the DifferentiableStep option here. As a first step, the following models have been converted to use this new training loop in place of the previous custom loop:
LeNet-MNIST
ResNet-CIFAR10
MobileNetV1-Imagenette
MobileNetV2-Imagenette
An initial set of callbacks has been provided that draws an animated progress bar on the console during training and displays the average loss and top-1 classification accuracy. These metric updates can either be continuous during training and validation, or can appear only at the end of an epoch (this is a performance option, because continuous updates currently slow training by up to 30%). Which metrics to display, if any, is also configurable.
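To illustrate the callback-based design, here is a minimal sketch of the pattern in Swift. All names in it (`TrainingLoopEvent`, `LoopState`, `fit`, and so on) are hypothetical stand-ins for illustration, not the actual swift-models API, and the differentiable training step is elided:

```swift
// Events a callback can observe during the loop. Illustrative only;
// the real set of events in swift-models may differ.
enum TrainingLoopEvent {
  case epochStart, batchEnd, epochEnd
}

// Mutable state shared with callbacks (e.g. for drawing a progress bar).
struct LoopState {
  var epoch = 0
  var averageLoss = 0.0
}

typealias Callback = (TrainingLoopEvent, inout LoopState) -> Void

struct TrainingLoop {
  var callbacks: [Callback] = []

  mutating func handle(_ event: TrainingLoopEvent, _ state: inout LoopState) {
    for callback in callbacks { callback(event, &state) }
  }

  mutating func fit(epochs: Int, batchesPerEpoch: Int) -> LoopState {
    var state = LoopState()
    for epoch in 0..<epochs {
      state.epoch = epoch
      handle(.epochStart, &state)
      for batch in 0..<batchesPerEpoch {
        // A real loop would run a differentiable training step here;
        // this placeholder just updates the tracked loss.
        state.averageLoss = 1.0 / Double(epoch * batchesPerEpoch + batch + 1)
        handle(.batchEnd, &state)
      }
      handle(.epochEnd, &state)
    }
    return state
  }
}
```

Under this pattern, the performance trade-off described above falls out naturally: a continuously updating progress bar subscribes to `.batchEnd`, while the cheaper end-of-epoch display subscribes only to `.epochEnd`.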
By default, X10 is used where available when training models, and this loop fully supports both X10 and eager-mode devices.
As a next step, all but one or two classification examples will be reworked to use this loop, and timing functionality will be introduced so that it can become the default loop within our benchmarks.
This pull request is now ready for review.