Releases: intel/ai-reference-models
Model Zoo for Intel® Architecture v2.11.0
Supported Frameworks
- Intel® Optimizations for TensorFlow v2.12.0
- Intel® Optimizations for TensorFlow v2.11.dev202242 for optimized performance on Sapphire Rapids
- Intel® Extension for TensorFlow v1.2.0
- Intel® Extension for PyTorch v2.0.0+cpu
- Intel® Extension for PyTorch v1.13.120+xpu
New models
- New precisions FP16 and BFloat16 for different workloads
New features
- Intel® Data Center GPU Flex and Max Series workloads validated with Intel® Extension for PyTorch v1.13.120+xpu and Intel® Extension for TensorFlow v1.2.0.
- Intel® Cloud Data Connector, a tool for working with cloud storage services such as AWS buckets, Google Cloud Storage, and Azure Storage, and for configuring machine learning jobs on AzureML. It provides a common interface for interacting with these cloud providers in machine learning workflows.
- Dataset Downloader command-line interface, a tool to download the supported datasets and apply the preprocessing they require.
Bug fixes:
- This release contains many bug fixes over previous versions. See the commit history: https://github.com/IntelAI/models/commits/v2.11.0
Supported Configurations
Intel Model Zoo v2.11.0 is validated on the following environment:
- Ubuntu 22.04 LTS
- Ubuntu 20.04 LTS
- Windows 11
- Windows Subsystem for Linux 2 (WSL2)
- Python 3.8, 3.9
Model Zoo for Intel® Architecture v2.7.0
Supported Frameworks
- TensorFlow v2.8.0
- PyTorch v1.11.0 and IPEX v1.11.0
New models
- N/A
New features
- Transfer Learning notebooks for NLP and Computer Vision: https://github.com/IntelAI/models/tree/v2.7.0/docs/notebooks/transfer_learning
- Consolidated TensorFlow and PyTorch benchmark tables based on use case: https://github.com/IntelAI/models/tree/v2.7.0#use-cases
- Added links to the required dataset for each use case: https://github.com/IntelAI/models/tree/v2.7.0/benchmarks
- Initial support for running several models on Windows: https://github.com/IntelAI/models/blob/master/docs/general/tensorflow/Windows.md
- Experimental support for running models on CentOS 8 Stream, Red Hat 8, and SLES 15
Bug fixes:
- This release contains many bug fixes over previous versions. See the commit history: https://github.com/IntelAI/models/commits/v2.7.0
Supported Configurations
Intel Model Zoo 2.7.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Model Zoo for Intel® Architecture v2.6.1
Features and bug fixes
- Updated the ImageNet dataset preprocessing instructions: datasets/imagenet
Supported Configurations
Intel Model Zoo 2.6.1 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Model Zoo for Intel® Architecture v2.6.0
TensorFlow Framework
- Support for TensorFlow v2.7.0
New TensorFlow models
- N/A
Other features and bug fixes for TensorFlow models
- Updates to use docker `--privileged` only when required and to check `--cpuset`
  - Except for the BERT Large and Wide and Deep models
- Updated the ImageNet download link
- Fixed `platform_util.py` for systems with only one socket or a subset of cores within a socket
- Replaced the `USE_DAAL4PY_SKLEARN` env var with `patch_sklearn`
- Added error handling for when a frozen graph isn't passed for BERT Large FP32 inference
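The `platform_util.py` fix concerns detecting CPU topology correctly when a machine has only one socket or exposes only a subset of a socket's cores. A minimal sketch of the idea, assuming `lscpu`-style key/value output (the helper names and sample output below are hypothetical, not the repository's actual code):

```python
def parse_lscpu(output: str) -> dict:
    """Parse `lscpu` "Key: value" output into a dict."""
    info = {}
    for line in output.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            info[key.strip()] = value.strip()
    return info

def physical_core_count(lscpu_output: str) -> int:
    """Return the number of physical cores, tolerating single-socket systems.

    Falls back to 1 when lscpu reports no Socket(s) or Core(s) per socket
    field, so a minimal or truncated topology never yields zero cores.
    """
    info = parse_lscpu(lscpu_output)
    sockets = int(info.get("Socket(s)", 1) or 1)
    cores_per_socket = int(info.get("Core(s) per socket", 1) or 1)
    return sockets * cores_per_socket

SAMPLE = """\
Architecture:        x86_64
CPU(s):              8
Core(s) per socket:  4
Socket(s):           1
"""
print(physical_core_count(SAMPLE))  # → 4
```

In practice the output would come from running `lscpu` via `subprocess`; the point of the fix is that the single-socket path must not be treated as an error.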
PyTorch Framework
- Support for PyTorch v1.10.0 and IPEX v1.10.0
New PyTorch models
- GoogLeNet Inference (FP32, BFloat16**)
- Inception v3 Inference (FP32, BFloat16**)
- MNASNet 0.5 Inference (FP32, BFloat16**)
- MNASNet 1.0 Inference (FP32, BFloat16**)
- ResNet 50 Inference (Int8)
- ResNet 50 Training (FP32, BFloat16**)
- ResNet 101 Inference (FP32, BFloat16**)
- ResNet 152 Inference (FP32, BFloat16**)
- ResNext 32x4d Inference (FP32, BFloat16**)
- ResNext 32x16d Inference (FP32, Int8, BFloat16**)
- VGG-11 Inference (FP32, BFloat16**)
- VGG-11 with batch normalization Inference (FP32, BFloat16**)
- Wide ResNet-50-2 Inference (FP32, BFloat16**)
- Wide ResNet-101-2 Inference (FP32, BFloat16**)
- BERT base Inference (FP32, BFloat16**)
- BERT large Inference (FP32, Int8, BFloat16**)
- BERT large Training (FP32, BFloat16**)
- DistilBERT base Inference (FP32, BFloat16**)
- RNN-T Inference (FP32, BFloat16**)
- RNN-T Training (FP32, BFloat16**)
- RoBERTa base Inference (FP32, BFloat16**)
- Faster R-CNN ResNet50 FPN Inference (FP32)
- Mask R-CNN Inference (FP32, BFloat16**)
- Mask R-CNN Training (FP32, BFloat16**)
- Mask R-CNN ResNet50 FPN Inference (FP32)
- RetinaNet ResNet-50 FPN Inference (FP32)
- SSD-ResNet34 Inference (FP32, Int8, BFloat16**)
- SSD-ResNet34 Training (FP32, BFloat16**)
- DLRM Inference (FP32, Int8, BFloat16**)
- DLRM Training (FP32)
Other features and bug fixes for PyTorch models
- DLRM and ResNet 50 documentation updates
Supported Configurations
Intel Model Zoo 2.6.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Intel Model Zoo v2.5.0
New Functionality
New Models
- ML-Perf Transformer-LT Training (FP32 and BFloat16)
- ML-Perf Transformer-LT Inference (FP32, BFloat16 and INT8)
- ML-Perf 3D-Unet Inference (FP32, BFloat16 and INT8)
- DIEN Training (FP32)
- DIEN Inference (FP32 and BFloat16)
Other features and bug fixes
- Added an IPython notebook with BERT classifier fine-tuning using IMDb
- Documentation for creating an LPOT container with Intel® Optimizations for TensorFlow
- Advanced documentation for Wide & Deep large dataset FP32 training
- Increased unit testing coverage
DL Frameworks (TensorFlow)
- Support for TensorFlow v2.6.0 and TensorFlow Serving v2.6.0
DL Frameworks (PyTorch)
- Support for PyTorch v1.9.0 and IPEX v1.9.0
Supported Configurations
Intel Model Zoo 2.5.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8
- Docker Server v19+
- Docker Client v18+
v2.4.0
New Functionality
- Added links to Intel oneContainer Portal
- Added documentation for running most workflows inside Intel® oneAPI AI Analytics Toolkit
- Experimental support for running workflows on CentOS 8
DL Frameworks (TensorFlow)
- Support for TensorFlow v2.5.0 and TensorFlow Serving v2.5.1
Supported Configurations
Intel Model Zoo 2.4 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8
- Docker Server v19+
- Docker Client v18+