I have a trained ONNX model that I want to integrate into an Android app. I'm actually working on a uni project combining ML and Android development.
After a lot of research, and since I don't want to use a private Python REST API, I concluded there are two ways to proceed from here: I can either try to convert my ONNX model to a TF model and then build a TFLite model with the TFLite Converter API, or give onnxruntime a try.
I tried the first approach with TFLite, using the answer from this post, so the code is the following:
import onnx
from onnx_tf.backend import prepare
onnx_model = onnx.load("input_path") # load onnx model
tf_rep = prepare(onnx_model) # prepare tf representation
tf_rep.export_graph("output_path") # export the model
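(For completeness, the second half of that route, which I haven't reached yet, would be the TFLite Converter step. As far as I understand it would look roughly like this, assuming the export above wrote a SavedModel to "output_path"; the SELECT_TF_OPS part is only there in case the converted graph uses ops TFLite doesn't support natively.)
import tensorflow as tf

# Convert the SavedModel exported by onnx-tf into a .tflite flatbuffer
converter = tf.lite.TFLiteConverter.from_saved_model("output_path")
# Allow falling back to TF ops in case some converted ops have no TFLite builtin
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)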
But I'm stuck at the first conversion, from .onnx to .pb, because I believe onnx-tf does not support dynamic dimensions (which my model has). I keep getting errors like
"Input size (depth of inputs) must be accessible via shape inference,"
or
RuntimeError: Node name is not unique in your model. Please recreate your model with unique node name.
and similar.
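(If the dynamic dimensions really are the blocker, my understanding is that they can be pinned to fixed values in the ONNX graph before running onnx-tf, roughly like below. The 1x3x224x224 shape is only a placeholder for my model's actual input shape.)
import onnx

model = onnx.load("input_path")
# Placeholder shape: replace with the model's real input dimensions
fixed_dims = [1, 3, 224, 224]
for graph_input in model.graph.input:
    dims = graph_input.type.tensor_type.shape.dim
    for i, dim in enumerate(dims):
        # dim_param set (symbolic) or dim_value 0 (unknown) means the dim is dynamic
        if dim.dim_param or dim.dim_value == 0:
            dim.dim_value = fixed_dims[i]
onnx.checker.check_model(model)
onnx.save(model, "input_path_fixed.onnx")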
I also gave onnxruntime a try, but I can't seem to manage to "Create a minimal build for Android with NNAPI support". I get this error while building:
[1/67] Building CXX object CMakeFiles/libprotobuf.dir/C_/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc.obj
FAILED: CMakeFiles/libprotobuf.dir/C_/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc.obj
C:\PROGRA~2\CODEBL~1\MinGW\bin\C__~1.EXE -DGOOGLE_PROTOBUF_CMAKE_BUILD -DHAVE_PTHREAD -I. -IC:/Users/chris/onnxruntime/cmake/external/protobuf/src -std=c++11 -MD -MT CMakeFiles/libprotobuf.dir/C_/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc.obj -MF CMakeFiles\libprotobuf.dir\C_\Users\chris\onnxruntime\cmake\external\protobuf\src\google\protobuf\io\io_win32.cc.obj.d -o CMakeFiles/libprotobuf.dir/C_/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc.obj -c C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc
C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc: In function 'int google::protobuf::io::win32::stat(const char*, google::protobuf::io::win32::_stat*)':
C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc:315:40: error: cannot convert 'google::protobuf::io::win32::_stat*' to '_stat*' for argument '2' to 'int _wstat(const wchar_t*, _stat*)'
return ::_wstat(wpath.c_str(), buffer);
^
In file included from C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc:52:0:
C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.h:75:51: note: class type 'google::protobuf::io::win32::_stat' is incomplete
PROTOBUF_EXPORT int stat(const char* path, struct _stat* buffer);
^
C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc: In function 'FILE* google::protobuf::io::win32::fopen(const char*, const char*)':
C:/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/io/io_win32.cc:337:10: error: '::_wfopen' has not been declared
return ::_wfopen(wpath.c_str(), wmode.c_str());
^
[6/67] Building CXX object CMakeFiles/libprotobuf.dir/C_/Users/chris/onnxruntime/cmake/external/protobuf/src/google/protobuf/message_lite.cc.obj
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "C:\Users\chris\onnxruntime\\tools\ci_build\build.py", line 2023, in <module>
sys.exit(main())
File "C:\Users\chris\onnxruntime\\tools\ci_build\build.py", line 1918, in main
cmake_path, source_dir, build_dir, args)
File "C:\Users\chris\onnxruntime\\tools\ci_build\build.py", line 1673, in build_protoc_for_host
run_subprocess(cmd_args)
File "C:\Users\chris\onnxruntime\\tools\ci_build\build.py", line 544, in run_subprocess
return run(*args, cwd=cwd, capture_stdout=capture_stdout, shell=shell, env=my_env)
File "C:\Users\chris\onnxruntime\tools\python\util\run.py", line 44, in run
env=env, shell=shell)
File "C:\Users\chris\AppData\Local\Programs\Python\Python37\lib\subprocess.py", line 512, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['C:\\Program Files\\CMake\\bin\\cmake.EXE', '--build', 'C:\\Users\\chris\\onnxruntime\\\\build\\Windows\\host_protoc', '--config', 'Release', '--target', 'protoc']' returned non-zero exit status 1.
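(If it helps to narrow things down, this is roughly the desktop-side sanity check I run with the onnxruntime Python API before touching the Android build; the dummy-input shape and dtype are placeholders for my model's actual input.)
import numpy as np
import onnxruntime as ort

# Desktop sanity check: load the model and run one dummy inference
session = ort.InferenceSession("input_path")
input_meta = session.get_inputs()[0]
# Placeholder shape/dtype; must match the model's actual input
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print([o.shape for o in outputs])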
Am I on a completely wrong path here? This is my first time trying to combine ML with Android, so I have no experience with this. Any advice would be very welcome.