I'm trying to run inference on an ONNX model with onnxruntime on Ubuntu 16.04, but the import statement gives me this error:
>>> import onnxruntime
/opt/conda/lib/python3.6/site-packages/onnxruntime/capi/_pybind_state.py:13: UserWarning: Cannot load onnxruntime.capi. Error: '/opt/conda/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_pybind11_state.cpython-36m-x86_64-linux-gnu.so: cannot enable executable stack as shared object requires: Permission denied'
warnings.warn("Cannot load onnxruntime.capi. Error: '{0}'".format(str(e)))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/conda/lib/python3.6/site-packages/onnxruntime/__init__.py", line 12, in <module>
from onnxruntime.capi._pybind_state import get_all_providers, get_available_providers, get_device, set_seed, RunOptions, SessionOptions, set_default_logger_severity, NodeArg, ModelMetadata, GraphOptimizationLevel, ExecutionMode, OrtDevice, SessionIOBinding
ImportError: cannot import name 'get_all_providers'
After searching online, I tried running execstack -c on the .so mentioned in the warning, but I get the following message and the problem persists:
section file offsets not monotonically increasing
I would really appreciate any suggestions on how to fix this.
P.S. I even tried installing onnxruntime-gpu (I have CUDA 10.0), but I get the same error.
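For reference, this is roughly the inference code I'm trying to get running once the import works; the model path and input shape below are just placeholders for this post, not my actual files:

import numpy as np
import onnxruntime as ort  # this import is what currently fails

# "model.onnx" is a placeholder path for my real model file
sess = ort.InferenceSession("model.onnx")

# Look up the name of the first input so I can feed a dummy tensor
input_name = sess.get_inputs()[0].name

# Dummy input; the shape (1, 3, 224, 224) is only an example
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run the session and print the output shapes
outputs = sess.run(None, {input_name: dummy})
print([o.shape for o in outputs])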