@@ -52,14 +52,15 @@ as a part of [Intel® Distribution of OpenVINO™].
 ## Build on Linux\* Systems
 
 The software was validated on:
+- Ubuntu\* 18.04 (64-bit) with default GCC\* 7.5.0
 - Ubuntu\* 16.04 (64-bit) with default GCC\* 5.4.0
 - CentOS\* 7.4 (64-bit) with default GCC\* 4.8.5
 
 ### Software Requirements
 - [CMake]\* 3.11 or higher
 - GCC\* 4.8 or higher to build the Inference Engine
-- Python 2.7 or higher for Inference Engine Python API wrapper
-- (Optional) [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 20.13.16352].
+- Python 3.5 or higher for Inference Engine Python API wrapper
+- (Optional) [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441].
 
 ### Build Steps
 1. Clone submodules:
@@ -77,7 +78,7 @@ The software was validated on:
    ```
 3. By default, the build enables the Inference Engine GPU plugin to infer models
    on your Intel® Processor Graphics. This requires you to
-   [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 20.13.16352]
+   [Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441]
    before running the build. If you don't want to use the GPU plugin, use the
    `-DENABLE_CLDNN=OFF` CMake build option and skip the installation of the
    Intel® Graphics Compute Runtime for OpenCL™ Driver.
@@ -202,7 +203,7 @@ Native compilation of the Inference Engine is the most straightforward solution.
 
 This compilation was tested on the following configuration:
 
-* Host: Ubuntu\* 16.04 (64-bit, Intel® Core™ i7-6700K CPU @ 4.00GHz × 8)
+* Host: Ubuntu\* 18.04 (64-bit, Intel® Core™ i7-6700K CPU @ 4.00GHz × 8)
 * Target: Raspbian\* Stretch (32-bit, ARMv7, Raspberry Pi\* 3)
 
 1. Install Docker\*:
@@ -337,7 +338,7 @@ The software was validated on:
 - [CMake]\* 3.11 or higher
 - Microsoft\* Visual Studio 2017, 2019 or [Intel® C++ Compiler] 18.0
 - (Optional) Intel® Graphics Driver for Windows* (26.20) [driver package].
-- Python 3.4 or higher for Inference Engine Python API wrapper
+- Python 3.5 or higher for Inference Engine Python API wrapper
 
 ### Build Steps
 
@@ -454,7 +455,7 @@ The software was validated on:
 
 - [CMake]\* 3.11 or higher
 - Clang\* compiler from Xcode\* 10.1 or higher
-- Python\* 3.4 or higher for the Inference Engine Python API wrapper
+- Python\* 3.5 or higher for the Inference Engine Python API wrapper
 
 ### Build Steps
 
@@ -574,8 +575,7 @@ This section describes how to build Inference Engine for Android x86 (64-bit) operating systems.
 
 ## Use Custom OpenCV Builds for Inference Engine
 
-> **NOTE**: The recommended and tested version of OpenCV is 4.3. The minimum
-  supported version is 3.4.0.
+> **NOTE**: The recommended and tested version of OpenCV is 4.4.0.
 
 Required versions of OpenCV packages are downloaded automatically during the
 build of the Inference Engine library. If the build script cannot find and download
@@ -691,7 +691,7 @@ This target collects all dependencies, prepares the nGraph package and copies it
 
 [Intel® Distribution of OpenVINO™]:https://software.intel.com/en-us/openvino-toolkit
 [CMake]:https://cmake.org/download/
-[Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 20.13.16352]:https://github.com/intel/compute-runtime/releases/tag/20.13.16352
+[Install Intel® Graphics Compute Runtime for OpenCL™ Driver package 19.41.14441]:https://github.com/intel/compute-runtime/releases/tag/19.41.14441
 [MKL-DNN repository]:https://github.com/intel/mkl-dnn/releases/download/v0.19/mklml_lnx_2019.0.5.20190502.tgz
 [MKL-DNN repository for Windows]:https://github.com/intel/mkl-dnn/releases/download/v0.19/mklml_win_2019.0.5.20190502.zip
 [OpenBLAS]:https://sourceforge.net/projects/openblas/files/v0.2.14/OpenBLAS-v0.2.14-Win64-int64.zip/download
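The GPU-plugin hunks above note that `-DENABLE_CLDNN=OFF` lets you build without installing the Intel® Graphics Compute Runtime for OpenCL™ Driver. As a minimal sketch of that CPU-only Linux build path (the `build` directory name, `Release` build type, and `make` invocation are conventional assumptions, not taken from this diff):

```shell
# Sketch: configure and build the Inference Engine without the GPU (clDNN)
# plugin, so the OpenCL driver package referenced above is not required.
# Assumes the repository is already cloned with its submodules checked out.
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DENABLE_CLDNN=OFF ..
make -j"$(nproc)"
```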