This document provides links to step-by-step instructions for running optimized, open-source deep learning training and inference workloads with the TensorFlow framework on 4th Generation Intel® Xeon® Scalable processors, using Model Zoo Docker containers.
The tables below link to the documentation for each use case. The model scripts run on Linux.
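Each model's documentation walks through pulling a container image and launching a model script inside it. The sketch below shows the general shape of such a launch; the image name, tag, environment variables, and entry-point script here are hypothetical placeholders, and the exact image and command for each workload are given in the per-model documentation linked in the tables.

```shell
#!/bin/sh
# Minimal sketch of launching a containerized workload.
# IMAGE, the env var names, and the entry-point script are placeholders;
# use the values from the model documentation you are following.
IMAGE=intel/intel-optimized-tensorflow:latest   # placeholder image/tag
DATASET_DIR=/path/to/dataset                    # host path to the downloaded dataset
OUTPUT_DIR=/path/to/output                      # host path for logs and results

docker run --rm \
  --env DATASET_DIR="${DATASET_DIR}" \
  --env OUTPUT_DIR="${OUTPUT_DIR}" \
  --volume "${DATASET_DIR}:${DATASET_DIR}" \
  --volume "${OUTPUT_DIR}:${OUTPUT_DIR}" \
  "${IMAGE}" \
  python run_model.py   # hypothetical entry point; see the model's docs
```

The `--volume` mounts make the host's dataset and output directories visible inside the container at the same paths, which is why the same variables are also passed through with `--env`.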
## Image Recognition

Model | Model Documentation | Dataset |
---|---|---|
ResNet 50v1.5 | Training | ImageNet 2012 |
ResNet 50v1.5 | Inference | ImageNet 2012 |
MobileNet V1* | Inference | ImageNet 2012 |
## Image Segmentation

Model | Model Documentation | Dataset |
---|---|---|
3D U-Net MLPerf* | Inference | BRATS 2019 |
## Object Detection

Model | Model Documentation | Dataset |
---|---|---|
SSD-ResNet34 | Training | COCO 2017 training dataset |
SSD-ResNet34 | Inference | COCO 2017 validation dataset |
SSD-MobileNet* | Inference | COCO 2017 validation dataset |
## Language Modeling

Model | Model Documentation | Dataset |
---|---|---|
BERT large | Training | SQuAD and MRPC |
BERT large | Inference | SQuAD |
## Language Translation

Model | Model Documentation | Dataset |
---|---|---|
Transformer_LT_mlperf* | Training | WMT English-German dataset |
Transformer_LT_mlperf* | Inference | WMT English-German dataset |
## Recommendation

Model | Model Documentation | Dataset |
---|---|---|
DIEN | Training | DIEN dataset |
DIEN | Inference | DIEN dataset |