[OpenVINO-EP] Update to latest version: OpenVINO 2019 R3.1 (#2308)
* Updates OpenVINO EP to latest version: 2019 R3.1

* Reviews fixed

* Update Dockerfile.openvino

* Addressed PR comments and disabled model tests temporarily

* Update Dockerfile.ubuntu_openvino
psfoley authored and jywu-msft committed Nov 6, 2019
1 parent db454be commit 1510757
Showing 20 changed files with 223 additions and 386 deletions.
12 changes: 6 additions & 6 deletions BUILD.md
@@ -235,16 +235,16 @@ See more information on the OpenVINO Execution Provider [here](./docs/execution_
#### Pre-Requisites
* Install the OpenVINO release along with its dependencies: [Windows](https://software.intel.com/en-us/openvino-toolkit), [Linux](https://software.intel.com/en-us/openvino-toolkit).
* For Linux, currently supports and is validated on OpenVINO 2018 R5.0.1 and OpenVINO 2019 R1.1 (Recommended)
* For Windows, download the 2019 R1.1 Windows Installer.
* For Linux, currently supports and is validated on OpenVINO 2019 R3.1
* For Windows, download the 2019 R3.1 Windows Installer.
* Install the model optimizer prerequisites for ONNX by running:
* Windows: `<openvino_install_dir>/deployment_tools/model_optimizer/install_prerequisites/install_prerequisites_onnx.bat`
* Linux: `<openvino_install_dir>/deployment_tools/model_optimizer/install_prerequisites/install_prerequisites_onnx.sh`
* Initialize the OpenVINO environment by running the setupvars script in `<openvino_install_dir>/bin` using `setupvars.bat` (Windows) or `source setupvars.sh` (Linux)
* To configure Intel<sup>®</sup> Processor Graphics(GPU) please follow these instructions: [Windows](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_windows.html#Install-GPU), [Linux](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_linux.html#additional-GPU-steps)
* To configure Intel<sup>®</sup> Movidius<sup>TM</sup> USB, please follow this getting started guide: [Windows](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_windows.html#usb-myriad), [Linux](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_linux.html#additional-NCS-steps)
* To configure Intel<sup>®</sup> Vision Accelerator Design based on 8 Movidius<sup>TM</sup> MyriadX VPUs, please follow this configuration guide: [Windows](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_windows.html#hddl-myriad), [Linux](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_installing_openvino_linux.html#install-VPU)
* To configure Intel<sup>®</sup> Vision Accelerator Design with an Intel<sup>®</sup> Arria<sup>®</sup> 10 FPGA, please follow this configuration guide: [Linux](https://docs.openvinotoolkit.org/2019_R1.1/_docs_install_guides_VisionAcceleratorFPGA_Configure_2019R1.html)
* To configure Intel<sup>®</sup> Processor Graphics(GPU) please follow these instructions: [Windows](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_windows.html#Install-GPU), [Linux](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_linux.html#additional-GPU-steps)
* To configure Intel<sup>®</sup> Movidius<sup>TM</sup> USB, please follow this getting started guide: [Windows](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_windows.html#usb-myriad), [Linux](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_linux.html#additional-NCS-steps)
* To configure Intel<sup>®</sup> Vision Accelerator Design based on 8 Movidius<sup>TM</sup> MyriadX VPUs, please follow this configuration guide: [Windows](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_windows.html#hddl-myriad), [Linux](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_installing_openvino_linux.html#install-VPU)
* To configure Intel<sup>®</sup> Vision Accelerator Design with an Intel<sup>®</sup> Arria<sup>®</sup> 10 FPGA, please follow this configuration guide: [Linux](https://docs.openvinotoolkit.org/2019_R3.1/_docs_install_guides_VisionAcceleratorFPGA_Configure_2019R3.html)
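With the prerequisites installed and the setupvars script sourced, the provider is selected when a session is created. Below is a minimal sketch against the C API of this era; the `OrtSessionOptionsAppendExecutionProvider_OpenVINO` declaration and its `device_id` argument are assumptions made by analogy with the other provider factories, and `model.onnx` is a hypothetical path.

```cpp
// Minimal sketch, assuming the OpenVINO provider factory below exists with
// this signature; verify against the provider headers before relying on it.
// Error statuses are ignored for brevity.
#include <onnxruntime_c_api.h>

extern "C" OrtStatus* OrtSessionOptionsAppendExecutionProvider_OpenVINO(
    OrtSessionOptions* options, const char* device_id);  // assumed declaration

int main() {
  const OrtApi* ort = OrtGetApiBase()->GetApi(ORT_API_VERSION);

  OrtEnv* env = nullptr;
  ort->CreateEnv(ORT_LOGGING_LEVEL_WARNING, "ov_ep_demo", &env);

  OrtSessionOptions* so = nullptr;
  ort->CreateSessionOptions(&so);
  // "CPU_FP32" mirrors the device strings accepted by --use_openvino.
  OrtSessionOptionsAppendExecutionProvider_OpenVINO(so, "CPU_FP32");

  OrtSession* session = nullptr;
  ort->CreateSession(env, "model.onnx", so, &session);  // hypothetical model, Linux path type

  ort->ReleaseSession(session);
  ort->ReleaseSessionOptions(so);
  ort->ReleaseEnv(env);
  return 0;
}
```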
#### Build Instructions
13 changes: 1 addition & 12 deletions cmake/CMakeLists.txt
@@ -535,18 +535,7 @@ endif()

if(onnxruntime_USE_OPENVINO)

add_definitions(-DUSE_OPENVINO=1)

if (onnxruntime_USE_OPENVINO_SOURCE)
include(openvino)
list(APPEND onnxruntime_EXTERNAL_DEPENDENCIES project_openvino)
link_directories(${OPENVINO_LIB_DIR})
#TODO: set plugin path var
endif()

if (onnxruntime_USE_OPENVINO_BINARY)
#TODO: set plugin path var
endif()
add_definitions(-DUSE_OPENVINO=1)

if(onnxruntime_USE_OPENVINO_MYRIAD)
add_definitions(-DOPENVINO_CONFIG_MYRIAD=1)
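The `OPENVINO_CONFIG_*` definitions set in this block are consumed at compile time inside the provider to pick the Inference Engine device. A hedged sketch of that pattern follows; the function name and device strings are illustrative, not lifted from this commit.

```cpp
// Illustrative only: how a compile-time OPENVINO_CONFIG_* define, set by the
// cmake block above, can resolve to an Inference Engine plugin name.
#include <string>

std::string OpenVINODeviceFromBuildConfig() {
#if defined(OPENVINO_CONFIG_MYRIAD)
  return "MYRIAD";  // Movidius VPU
#elif defined(OPENVINO_CONFIG_GPU_FP32) || defined(OPENVINO_CONFIG_GPU_FP16)
  return "GPU";     // Intel Processor Graphics (clDNN plugin)
#else
  return "CPU";     // default CPU plugin
#endif
}
```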
29 changes: 0 additions & 29 deletions cmake/external/openvino.cmake

This file was deleted.

68 changes: 34 additions & 34 deletions cmake/onnxruntime_providers.cmake
@@ -285,37 +285,32 @@ if (onnxruntime_USE_NGRAPH)
endif()

if (onnxruntime_USE_OPENVINO)
file(GLOB_RECURSE onnxruntime_providers_openvino_cc_srcs
"${ONNXRUNTIME_ROOT}/core/providers/openvino/*.h"
"${ONNXRUNTIME_ROOT}/core/providers/openvino/*.cc"
)
file(GLOB_RECURSE onnxruntime_providers_openvino_py_srcs
"${ONNXRUNTIME_ROOT}/core/providers/openvino/openvino_mo/*.py"
)

# Below variables point to directories within the OpenVINO installation directory
# whose value is set in INTEL_CVSDK_DIR variable by running the setupvars.sh script
if (onnxruntime_USE_OPENVINO_BINARY)
if ($ENV{INTEL_CVSDK_DIR} MATCHES "2019.1")
message($ENV{INTEL_CVSDK_DIR})
set(OPENVINO_INCLUDE_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/include)
set(OPENVINO_TBB_INCLUDE_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/include)
if ($ENV{INTEL_CVSDK_DIR} MATCHES "2019.3")
file(GLOB_RECURSE onnxruntime_providers_openvino_cc_srcs
"${ONNXRUNTIME_ROOT}/core/providers/openvino/*.h"
"${ONNXRUNTIME_ROOT}/core/providers/openvino/*.cc"
)
file(GLOB_RECURSE onnxruntime_providers_openvino_py_srcs
"${ONNXRUNTIME_ROOT}/core/providers/openvino/openvino_mo/*.py"
)

set(OPENVINO_INCLUDE_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/include)
set(OPENVINO_EXTENSIONS_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/src/extension)
set(OPENVINO_TBB_INCLUDE_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/include)
if(WIN32)
set(OPENVINO_LIB_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64/Release)
set(OPENVINO_TBB_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64/Release)
set(OPENVINO_MKL_TINY_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/bin/intel64/Release)

set(OPENVINO_MKL_TINY_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/bin/intel64/Release)
else()
set(OPENVINO_LIB_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64/)
set(OPENVINO_TBB_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib)
set(OPENVINO_MKL_TINY_DIR $ENV{INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib)
endif()
else()
message(FATAL_ERROR "OpenVINO 2019 R3.1 must be installed with environment variables set before building ONNX Runtime")
endif()
if ($ENV{INTEL_CVSDK_DIR} MATCHES "2018.5")
set(OPENVINO_INCLUDE_DIR $ENV{INTEL_CVSDK_DIR}/deployment_tools/inference_engine/include)
set(OPENVINO_LIB_DIR $ENV{INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64/)
endif()
endif()

find_package(PythonLibs REQUIRED)
source_group(TREE ${ONNXRUNTIME_ROOT}/core FILES ${onnxruntime_providers_openvino_cc_srcs})
@@ -326,31 +321,36 @@ endif()
install(DIRECTORY ${PROJECT_SOURCE_DIR}/../include/onnxruntime/core/providers/openvino DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}/onnxruntime/core/providers)
set_target_properties(onnxruntime_providers_openvino PROPERTIES LINKER_LANGUAGE CXX)
if (WIN32)
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${eigen_INCLUDE_DIRS} ${OPENVINO_INCLUDE_DIR} ${OPENVINO_TBB_INCLUDE_DIR} ${PYTHON_INCLUDE_DIRS} ${PYTHONPATH})
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${eigen_INCLUDE_DIRS} ${OPENVINO_INCLUDE_DIR} ${OPENVINO_EXTENSIONS_DIR} ${OPENVINO_TBB_INCLUDE_DIR} ${PYTHON_INCLUDE_DIRS} ${PYTHONPATH})
#${pybind11_INCLUDE_DIRS}
else()
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${eigen_INCLUDE_DIRS} ${OPENVINO_INCLUDE_DIR} ${OPENVINO_TBB_INCLUDE_DIR} ${PYTHON_INCLUDE_DIRS})
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${eigen_INCLUDE_DIRS} ${OPENVINO_INCLUDE_DIR} ${OPENVINO_EXTENSIONS_DIR} ${OPENVINO_LIB_DIR} ${OPENVINO_TBB_INCLUDE_DIR} ${PYTHON_INCLUDE_DIRS})
endif()

if ($ENV{INTEL_CVSDK_DIR} MATCHES "2019.1")
if (WIN32)

if (WIN32)
string(REPLACE "include" "libs" PYTHON_LIB ${PYTHON_INCLUDE_DIRS})
find_package(InferenceEngine 2.1 REQUIRED)
set(PYTHON_LIBRARIES ${PYTHON_LIB})
set(OPENVINO_CPU_EXTENSION_DIR ${onnxruntime_BINARY_DIR}/ie_cpu_extension/${CMAKE_BUILD_TYPE})
set(OPENVINO_CPU_EXTENSION_LIB cpu_extension.dll)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /wd4996 /wd4244 /wd4267 /wd4099 /wd4551 /wd4505 /wd4515 /wd4706 /wd4456 /w")
set_target_properties(ie_cpu_extension PROPERTIES COMPILE_FLAGS "/wd4244 /wd4456 /wd4458 /wd4701")
link_directories(onnxruntime_providers_openvino -linference_engine ${PYTHON_LIBRARIES} ${OPENVINO_LIB_DIR} ${OPENVINO_TBB_DIR} ${OPENVINO_MKL_TINY_DIR} ${PYTHONPATH})
target_link_libraries(onnxruntime_providers_openvino $ENV{INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/intel64/Release/inference_engine.lib)
target_link_libraries(onnxruntime_providers_openvino $ENV{INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/intel64/Release/inference_engine.lib IE::ie_cpu_extension ${PYTHON_LIBRARIES})
file(COPY ${onnxruntime_providers_openvino_py_srcs} DESTINATION ${onnxruntime_BINARY_DIR}/${CMAKE_BUILD_TYPE})
else()
link_directories(onnxruntime_providers_openvino ${OPENVINO_LIB_DIR} ${OPENVINO_TBB_DIR} ${OPENVINO_MKL_TINY_DIR})
target_link_libraries(onnxruntime_providers_openvino -linference_engine -ltbb ${PYTHON_LIBRARIES})
file(COPY ${onnxruntime_providers_openvino_py_srcs} DESTINATION ${onnxruntime_BINARY_DIR})
find_package(InferenceEngine 2.1 REQUIRED)
set(OPENVINO_CPU_EXTENSION_LIB libcpu_extension.so)
link_directories(onnxruntime_providers_openvino ${OPENVINO_LIB_DIR} ${OPENVINO_TBB_DIR} ${OPENVINO_MKL_TINY_DIR})
if ($ENV{INTEL_CVSDK_DIR} MATCHES "dldt")
set(OPENVINO_CPU_EXTENSION_DIR $ENV{INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/intel64)
else()
set(OPENVINO_CPU_EXTENSION_DIR ${onnxruntime_BINARY_DIR}/ie_cpu_extension)
target_compile_options(ie_cpu_extension PRIVATE -Wno-unused-parameter)
endif()
target_link_libraries(onnxruntime_providers_openvino PRIVATE -linference_engine IE::ie_cpu_extension -ltbb ${PYTHON_LIBRARIES})
file(COPY ${onnxruntime_providers_openvino_py_srcs} DESTINATION ${onnxruntime_BINARY_DIR})
endif()
endif()
if ($ENV{INTEL_CVSDK_DIR} MATCHES "2018.5")
link_directories(onnxruntime_providers_openvino ${OPENVINO_LIB_DIR})
target_link_libraries(onnxruntime_providers_openvino -linference_engine ${PYTHON_LIBRARIES})
endif()
file(COPY ${onnxruntime_providers_openvino_py_srcs} DESTINATION ${onnxruntime_BINARY_DIR})
endif()
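The `ie_cpu_extension` target linked above is the Inference Engine CPU extensions library; it has to be registered with the `Core` before networks that lower onto extended layers can be loaded. A minimal sketch against the 2019 R3 C++ API, assuming the sample-style `CpuExtensions` class from `ext_list.hpp`:

```cpp
// Sketch, assuming the OpenVINO 2019 R3 sample APIs: register the CPU
// extensions with the Core, then load a (hypothetical) IR pair on "CPU".
#include <inference_engine.hpp>
#include <ext_list.hpp>  // CpuExtensions, shipped with ie_cpu_extension

#include <memory>

int main() {
  InferenceEngine::Core core;

  // Without this registration the CPU plugin cannot create the extended
  // layers, which is why this build links and copies cpu_extension.
  core.AddExtension(
      std::make_shared<InferenceEngine::Extensions::Cpu::CpuExtensions>(),
      "CPU");

  InferenceEngine::CNNNetReader reader;   // 2019 R3-era reader API
  reader.ReadNetwork("model.xml");        // hypothetical IR files
  reader.ReadWeights("model.bin");

  auto executable = core.LoadNetwork(reader.getNetwork(), "CPU");
  (void)executable;
  return 0;
}
```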

9 changes: 9 additions & 0 deletions cmake/onnxruntime_python.cmake
@@ -222,6 +222,15 @@ if (onnxruntime_USE_NGRAPH)
)
endif()

if (onnxruntime_USE_OPENVINO)
add_custom_command(
TARGET onnxruntime_pybind11_state POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${OPENVINO_CPU_EXTENSION_DIR}/${OPENVINO_CPU_EXTENSION_LIB}
$<TARGET_FILE_DIR:${test_data_target}>/onnxruntime/capi/
)
endif()
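This copy step exists because the extension library is resolved by file name at runtime; placing `libcpu_extension.so` (or `cpu_extension.dll`) beside the Python `capi` module keeps it where the dynamic loader can find it. A small sketch of the lookup the copy satisfies, with a hypothetical path:

```cpp
// Sketch: the loader only resolves libcpu_extension.so if it is on the
// search path or beside the loading module; the POST_BUILD copy above
// guarantees that for the Python package layout. Linux-only illustration.
#include <dlfcn.h>
#include <cstdio>

int main() {
  // Hypothetical wheel-relative location used purely for illustration.
  void* handle = dlopen("onnxruntime/capi/libcpu_extension.so", RTLD_LAZY);
  if (handle == nullptr) {
    std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
    return 1;
  }
  dlclose(handle);
  return 0;
}
```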

if (onnxruntime_USE_TVM)
add_custom_command(
TARGET onnxruntime_pybind11_state POST_BUILD
8 changes: 8 additions & 0 deletions cmake/onnxruntime_unittests.cmake
@@ -452,6 +452,14 @@ if(WIN32)
$<TARGET_FILE_DIR:${test_data_target}>
)
endif()
if (onnxruntime_USE_OPENVINO)
add_custom_command(
TARGET ${test_data_target} POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${OPENVINO_CPU_EXTENSION_DIR}/${OPENVINO_CPU_EXTENSION_LIB}
$<TARGET_FILE_DIR:${test_data_target}>
)
endif()
if (onnxruntime_USE_NGRAPH)
add_custom_command(
TARGET ${test_data_target} POST_BUILD
8 changes: 5 additions & 3 deletions dockerfiles/Dockerfile.openvino
@@ -39,8 +39,9 @@ RUN tar -xzf l_openvino_toolkit*.tgz && \
pip install networkx==2.3 test-generator==0.1.1 defusedxml>=0.5.0

ENV LD_LIBRARY_PATH=/opt/miniconda/lib:/usr/lib:/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH
ENV INTEL_OPENVINO_DIR=/opt/intel/openvino_2019.1.144
ENV INTEL_CVSDK_DIR=/opt/intel/openvino_2019.1.144
ENV INTEL_OPENVINO_DIR=/opt/intel/openvino_2019.3.376
ENV INTEL_CVSDK_DIR=/opt/intel/openvino_2019.3.376
ENV InferenceEngine_DIR=${INTEL_CVSDK_DIR}/deployment_tools/inference_engine/share
ENV IE_PLUGINS_PATH=${INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/intel64
ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/cldnn/lib:${INTEL_OPENVINO_DIR}/inference_engine/external/gna/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/omp/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
ENV OpenCV_DIR=${INTEL_OPENVINO_DIR}/opencv/share/OpenCV
@@ -49,7 +50,8 @@ ENV PATH=${INTEL_CVSDK_DIR}/deployment_tools/model_optimizer:$PATH
ENV PYTHONPATH=${INTEL_CVSDK_DIR}/deployment_tools/model_optimizer:$PYTHONPATH
ENV HDDL_INSTALL_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl
ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl/lib:$LD_LIBRARY_PATH

RUN locale-gen en_US.UTF-8 && update-locale LANG=en_US.UTF-8
ENV LANG en_US.UTF-8

RUN cd onnxruntime && ./build.sh --config RelWithDebInfo --update --build --parallel --use_openvino $DEVICE --build_wheel && \
pip install build/Linux/RelWithDebInfo/dist/*-linux_x86_64.whl && rm -rf /code/onnxruntime /code/cmake-3.14.3-Linux-x86_64
7 changes: 5 additions & 2 deletions docs/execution_providers/OpenVINO-ExecutionProvider.md
@@ -18,10 +18,12 @@ VPUs as well as Intel<sup>®</sup> Vision accelerator Design with Intel Movidiu
| BatchNormalization | ScaleShift (can be fused into Convolution or Fully Connected) | Yes | Yes | Yes | Yes
| Concat | Concat | Yes | Yes | Yes | Yes
| Conv | Convolution | Yes | Yes | Yes | Yes
| Div | Eltwise(operation = mul)->Power | Yes | Yes | Yes | Yes
| Dropout | Ignored | Yes | Yes | Yes | Yes
| Flatten | Reshape | Yes | Yes | Yes | No
| Gemm | FullyConnected | Yes | Yes | Yes | Yes
| GlobalAveragePool | Pooling | Yes | Yes | Yes | Yes
| GlobalMaxPool | Pooling | Yes | Yes | Yes | Yes
| Identity | Ignored | Yes | Yes | Yes | Yes
| ImageScaler | ScaleShift | Yes | Yes | Yes | Yes
| LRN | Norm | Yes | Yes | Yes | Yes
@@ -32,6 +34,7 @@ VPUs as well as Intel<sup>®</sup> Vision accelerator Design with Intel Movidiu
| Reshape | Reshape | Yes | Yes | Yes | No
| Softmax | SoftMax | Yes | Yes | Yes | No
| Sum | Eltwise(operation=sum) | Yes | Yes | Yes | Yes
| Sub | Power->Eltwise(operation = sum)| Yes | Yes | Yes | Yes
| Transpose | Permute | Yes | Yes | Yes | No
| UnSqueeze | Reshape | Yes | Yes | Yes | No
| LeakyRelu | ReLU | Yes | Yes | Yes | Yes
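The new `Div` and `Sub` rows describe decompositions rather than one-to-one layer mappings: the Inference Engine `Power` layer computes `(shift + scale*x)^power`, so `a / b` can be realized as `a` multiplied by `b` raised to the power `-1`, and `a - b` as `a` summed with `b` scaled by `-1`. A quick numeric check of those identities, in plain C++ and purely illustrative:

```cpp
// Verifies the arithmetic behind the Div and Sub decompositions above:
//   Div: a / b == a * pow(b, -1)     (Eltwise mul + Power layer)
//   Sub: a - b == a + (-1.0f * b)    (Power with scale = -1, Eltwise sum)
#include <cassert>
#include <cmath>

int main() {
  const float a = 6.0f, b = 4.0f;
  assert(std::fabs(a * std::pow(b, -1.0f) - a / b) < 1e-6f);
  assert(std::fabs(a + (-1.0f * b) - (a - b)) < 1e-6f);
  return 0;
}
```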
@@ -74,11 +77,11 @@ Below topologies are supported from ONNX open model zoo using OpenVINO Execution

### Image Recognition Networks


| **Topology** | **CPU** | **GPU** | **VPU** | **FPGA** |
| --- | --- | --- | --- | --- |
| MNIST | Yes | Yes | Yes** | Yes***
| MNIST | Yes | Yes | Yes | Yes***

**Inception_v1 and MNIST are supported in OpenVINO R1.1 and are not supported in OpenVINO R5.0.1.

### Object Detection Networks

25 changes: 21 additions & 4 deletions onnxruntime/core/providers/openvino/openvino_execution_provider.cc
@@ -239,10 +239,9 @@ void CheckGraphSupported(const onnxruntime::GraphViewer& graph_viewer, std::stri

//Zero dimension check
for (size_t i = 0; i < node_inputs.size(); i++) {

auto name = node_inputs[i]->Name();
auto it = initializers.find(name);
if(it == initializers.end() && node_inputs[i]->Shape() != nullptr){
if (it == initializers.end() && node_inputs[i]->Shape() != nullptr) {
if (node_inputs[i]->Shape()->dim_size() == 0) {
throw "Node_input is zero dimension";
}
@@ -297,6 +296,22 @@ void CheckGraphSupported(const onnxruntime::GraphViewer& graph_viewer, std::stri
}
}

if (node->OpType() == "Mul" || node->OpType() == "Add" || node->OpType() == "Div" || node->OpType() == "Sub") {
for (size_t i = 0; i < node->InputDefs().size(); i++) {
if (node->InputDefs()[i]->TypeAsProto()->tensor_type().elem_type() == ONNX_NAMESPACE::TensorProto_DataType::TensorProto_DataType_INT64) {
throw "int64 inputs not supported";
}
}
}

if (node->OpType() == "Div") {
for (size_t i = 0; i < node->InputDefs().size(); i++) {
if (node->InputDefs()[i]->TypeAsProto()->tensor_type().elem_type() == ONNX_NAMESPACE::TensorProto_DataType::TensorProto_DataType_INT32) {
throw "int32 inputs not supported for Div";
}
}
}

//MatMul is only supported if it is followed by Add
if (node->OpType() == "MatMul") {
for (size_t i = 0; i < node->InputDefs().size(); i++) {
@@ -327,13 +342,13 @@ void CheckGraphSupported(const onnxruntime::GraphViewer& graph_viewer, std::stri
}

//Dropout , Identity and Concat can't have graph inputs
if (node->OpType() == "Dropout" || node->OpType() == "Identity" || node->OpType() == "Concat") {
if (node->OpType() == "Dropout" || node->OpType() == "Identity" || node->OpType() == "Concat" || node->OpType() == "Gemm") {
auto graph_inputs = graph_viewer.GetInputs();
for (const auto& input : node->InputDefs()) {
auto it = find(graph_inputs.begin(), graph_inputs.end(), input);
if (it != graph_inputs.end()) {
{
throw "Dropout, Identity and Concat can't have graph inputs";
throw "Dropout, Identity, Concat, and Gemm can't have graph inputs";
}
}
}
@@ -549,6 +564,8 @@ std::vector<std::unique_ptr<ComputeCapability>> OpenVINOExecutionProvider::GetCa
sub_graph->SetMetaDef(meta_def);
result.push_back(onnxruntime::make_unique<ComputeCapability>(std::move(sub_graph)));

LOGS_DEFAULT(INFO) << openvino_ep::OpenVINOGraph::log_tag << "Returning result of GetCapability Function";

return result;
}
