Releases: intel/ai-reference-models
Model Zoo for Intel® Architecture v2.7.0
Supported Frameworks
- TensorFlow v2.8.0
- PyTorch v1.11.0 and IPEX v1.11.0
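A quick way to confirm that a local environment matches the versions validated for this release is to print each framework's version string. A minimal sketch, not taken from the repository; it only assumes the standard package names for TensorFlow, PyTorch, and Intel® Extension for PyTorch:

```python
# Minimal sketch: check installed framework versions against those
# validated for Model Zoo v2.7.0 (TensorFlow 2.8.0, PyTorch 1.11.0, IPEX 1.11.0).
import tensorflow as tf
import torch
import intel_extension_for_pytorch as ipex

print("TensorFlow:", tf.__version__)     # expected 2.8.0
print("PyTorch:   ", torch.__version__)  # expected 1.11.0
print("IPEX:      ", ipex.__version__)   # expected 1.11.0
```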
New models
- N/A
New features
- Transfer Learning notebooks for NLP and Computer Vision: https://github.com/IntelAI/models/tree/v2.7.0/docs/notebooks/transfer_learning
- Consolidated the TensorFlow and PyTorch benchmark tables based on use case: https://github.com/IntelAI/models/tree/v2.7.0#use-cases
- Added links to the required dataset for each use case: https://github.com/IntelAI/models/tree/v2.7.0/benchmarks
- Initial support for running several models on the Windows platform: https://github.com/IntelAI/models/blob/master/docs/general/tensorflow/Windows.md
- Experimental support for running models on CentOS 8 Stream, Red Hat 8, and SLES 15
Bug fixes:
- This release contains many bug fixes over previous versions. See the commit history: https://github.com/IntelAI/models/commits/v2.7.0
Supported Configurations
Intel Model Zoo 2.7.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Model Zoo for Intel® Architecture v2.6.1
Features and bug fixes
- Updated the ImageNet dataset preprocessing instructions here: datasets/imagenet
Supported Configurations
Intel Model Zoo 2.6.1 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Model Zoo for Intel® Architecture v2.6.0
TensorFlow Framework
- Support for TensorFlow v2.7.0
New TensorFlow models
- N/A
Other features and bug fixes for TensorFlow models
- Updates to only use docker --privileged when required and to check --cpuset
  - Except for the BERT Large and Wide and Deep models
- Updated the ImageNet download link
- Fix platform_util.py for systems with only one socket or a subset of cores within a socket
- Replace the USE_DAAL4PY_SKLEARN env var with patch_sklearn (see the sketch after this list)
- Add error handling for when a frozen graph isn't passed for BERT Large FP32 inference*
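For reference, patch_sklearn comes from the Intel® Extension for Scikit-learn (scikit-learn-intelex), which supersedes the daal4py-based patching that the USE_DAAL4PY_SKLEARN env var toggled. A minimal sketch of the explicit call; the KMeans workload is only an illustration, not code from the Model Zoo:

```python
# Previously: USE_DAAL4PY_SKLEARN=1 toggled the accelerated scikit-learn.
# Now the patch is applied explicitly, before importing any estimators.
from sklearnex import patch_sklearn
patch_sklearn()

from sklearn.cluster import KMeans  # resolves to the patched implementation
import numpy as np

X = np.random.rand(1000, 8)
km = KMeans(n_clusters=4, random_state=0).fit(X)
print(km.inertia_)
```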
PyTorch Framework
- Support for PyTorch v1.10.0 and IPEX v1.10.0
New PyTorch models
- GoogLeNet Inference (FP32, BFloat16**)
- Inception v3 Inference (FP32, BFloat16**)
- MNASNet 0.5 Inference (FP32, BFloat16**)
- MNASNet 1.0 Inference (FP32, BFloat16**)
- ResNet 50 Inference (Int8)
- ResNet 50 Training (FP32, BFloat16**)
- ResNet 101 Inference (FP32, BFloat16**)
- ResNet 152 Inference (FP32, BFloat16**)
- ResNext 32x4d Inference (FP32, BFloat16**)
- ResNext 32x16d Inference (FP32, Int8, BFloat16**)
- VGG-11 Inference (FP32, BFloat16**)
- VGG-11 with batch normalization Inference (FP32, BFloat16**)
- Wide ResNet-50-2 Inference (FP32, BFloat16**)
- Wide ResNet-101-2 Inference (FP32, BFloat16**)
- BERT base Inference (FP32, BFloat16**)
- BERT large Inference (FP32, Int8, BFloat16**)
- BERT large Training (FP32, BFloat16**)
- DistilBERT base Inference (FP32, BFloat16**)
- RNN-T Inference (FP32, BFloat16**)
- RNN-T Training (FP32, BFloat16**)
- RoBERTa base Inference (FP32, BFloat16**)
- Faster R-CNN ResNet50 FPN Inference (FP32)
- Mask R-CNN Inference (FP32, BFloat16**)
- Mask R-CNN Training (FP32, BFloat16**)
- Mask R-CNN ResNet50 FPN Inference (FP32)
- RetinaNet ResNet-50 FPN Inference (FP32)
- SSD-ResNet34 Inference (FP32, Int8, BFloat16**)
- SSD-ResNet34 Training (FP32, BFloat16**)
- DLRM Inference (FP32, Int8, BFloat16**)
- DLRM Training (FP32)
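Most of the models above list a BFloat16 precision. As a rough, hypothetical sketch (not the Model Zoo's launch scripts; the torchvision ResNet-50 and the input shape are placeholders), BFloat16 inference with IPEX 1.10 typically combines ipex.optimize with CPU autocast:

```python
import torch
import intel_extension_for_pytorch as ipex
import torchvision.models as models

# Hypothetical example, not the Model Zoo launch scripts: BFloat16
# inference on CPU with IPEX optimizations applied to a stock ResNet-50.
model = models.resnet50(pretrained=True).eval()
model = ipex.optimize(model, dtype=torch.bfloat16)

x = torch.randn(1, 3, 224, 224)
with torch.no_grad(), torch.cpu.amp.autocast():  # CPU autocast defaults to BFloat16
    y = model(x)

print(y.shape, y.dtype)
```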
Other features and bug fixes for PyTorch models
- DLRM and ResNet 50 documentation updates
Supported Configurations
Intel Model Zoo 2.6.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8, 3.9
- Docker Server v19+
- Docker Client v18+
Intel Model Zoo v2.5.0
New Functionality
New Models
- ML-Perf Transformer-LT Training (FP32 and BFloat16)
- ML-Perf Transformer-LT Inference (FP32, BFloat16 and INT8)
- ML-Perf 3D-Unet Inference (FP32, BFloat16 and INT8)
- DIEN Training (FP32)
- DIEN Inference (FP32 and BFloat16)
Other features and bug fixes
- Added an IPython Notebook with BERT classifier fine-tuning using IMDb
- Documentation for creating an LPOT container with Intel® Optimizations for TensorFlow
- Advanced documentation for Wide & Deep large dataset FP32 training
- Increased unit testing coverage
DL Frameworks (TensorFlow)
- Support for TensorFlow v2.6.0 and TensorFlow Serving v2.6.0
DL Frameworks (PyTorch)
- Support for PyTorch v1.9.0 and IPEX v1.9.0
Supported Configurations
Intel Model Zoo 2.5.0 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8
- Docker Server v19+
- Docker Client v18+
v2.4.0
New Functionality
- Added links to Intel oneContainer Portal
- Added documentation for running most workflows inside Intel® oneAPI AI Analytics Toolkit
- Experimental support for running workflows on CentOS 8
DL Frameworks (TensorFlow)
- Support for TensorFlow v2.5.0 and TensorFlow Serving v2.5.1
Supported Configurations
Intel Model Zoo 2.4 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.8
- Docker Server v19+
- Docker Client v18+
v2.3.0
New Functionality
Two new TF workload containers and model packages are available on the Intel® oneContainer Portal:
- 3D U-Net FP32 Inference
- SSD ResNet34 BFloat16 Training
One new PyTorch workload container and model package is available on the Intel® oneContainer Portal:
- DLRM BFloat16 Training
DL Frameworks (TensorFlow)
TensorFlow models in the 2.3 release are validated on the following TensorFlow versions:
- Intel Optimizations for TensorFlow v2.4.0 or v1.15.2 (select models)
- Intel Optimizations for TensorFlow serving v2.3
DL Frameworks (PyTorch)
PyTorch models in the 2.3 release are validated on the following PyTorch version:
- PyTorch v1.5.0-rc3
Supported Configurations
Intel Model Zoo 2.3 is validated on the following environment:
- Ubuntu 20.04 LTS
- Python 3.6
- Docker Server v19+
- Docker Client v18+
Intel Model Zoo 2.2.0 release
New Functionality
Ten new TF workload containers and model packages are available on the Intel® oneContainer Portal:
- InceptionV4 FP32 Inference
- InceptionV4 Int8 Inference
- ResNet101 FP32 Inference
- ResNet101 Int8 Inference
- Transformer-LT MLPerf BFloat16 Training
- GNMT MLPerf FP32 Inference
- DenseNet169 FP32 Inference
- ResNet50v1.5 BFloat16 Training
- Wide & Deep Large Dataset FP32 Inference
- Wide & Deep Large Dataset Int8 Inference
Two new PyTorch workload containers and model packages are available on the Intel® oneContainer Portal:
- ResNet50 FP32 Inference
- ResNet50 BFloat16 Inference
Three new TF Kubernetes packages are available on the Intel® oneContainer Portal:
- BERT Large FP32 Training
- Wide & Deep Large Dataset FP32 Training
- RFCN FP32 Inference
A new Helm chart to deploy TensorFlow Serving on a K8s cluster
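Once TensorFlow Serving is deployed with the Helm chart, clients typically query it over its REST API. A minimal, hypothetical sketch: the localhost address, port-forwarded port 8501, and the model name resnet50 are assumptions for illustration, not values defined by the chart:

```python
# Hypothetical client sketch: query a TensorFlow Serving deployment's REST
# predict endpoint. Host, port, and model name are placeholders.
import requests
import numpy as np

payload = {"instances": np.random.rand(1, 224, 224, 3).tolist()}

# e.g. after `kubectl port-forward` of the serving Service to local port 8501
url = "http://localhost:8501/v1/models/resnet50:predict"
resp = requests.post(url, json=payload)
resp.raise_for_status()
print(resp.json()["predictions"][0][:5])
```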
DL Frameworks (TensorFlow)
TensorFlow models in the 2.2 release are validated on the following TensorFlow versions:
- Intel Optimizations for TensorFlow v2.3.0 or v1.15.2 (select models)
- Intel Optimizations for TensorFlow serving v2.3
DL Frameworks (PyTorch)
PyTorch models in the 2.2 release are validated on the following PyTorch version:
- PyTorch v1.5.0-rc3
Supported Configurations
Intel Model Zoo 2.2 is validated on the following environment:
- Ubuntu 18.04 LTS
- Python 3.6
- Docker Server v19+
- Docker Client v18+
Intel Model Zoo v2.1.1
New Functionality
One new TensorFlow workload container is available on the Intel® oneContainer Portal:
- COCO Validation Dataset Preprocessing
Six new TensorFlow workload containers are available on the Intel® oneContainer Portal:
- TensorFlow Performance Comparison Jupyter Notebook
- Intel PyTorch Extension
- Intel Optimizations for TensorFlow with Scikit-learn and oneDAL
- Scikit-learn with oneDAL
- XGBoost with oneDAL
- Intel Optimized Analytics Package (OAP) for Spark Platform
One new TensorFlow K8s workflow package is available on the Intel® oneContainer Portal:
- ResNet50 v1.5 FP32 Training
DL Frameworks (TensorFlow)
TensorFlow models in this release are validated on the following TensorFlow versions:
- Intel Optimizations for TensorFlow v2.3.0 or v1.15.2 (select models)
- Intel Optimizations for TensorFlow Serving v2.3.0
DL Frameworks (PyTorch)
PyTorch models in this release are validated on the following PyTorch version:
- PyTorch v1.5.0-rc3
Supported Configurations
Intel Model Zoo 2.1.1 is validated on the following environment:
- Ubuntu 18.04 LTS
- Python 3.6
- Docker Server v19+
- Docker Client v18+