
I'm missing some small piece when using the OpenCV framework from the working demo in my own project.

Steps to reproduce:

  1. Download the sample application from http://aptogo.co.uk/2011/09/opencv-framework-for-ios/

  2. Create a new Titanium iOS module with titanium create --platform=iphone --type=module --dir= --name=opencv --id=opencv

  3. Open the Xcode project and drag in the OpenCV framework from the FaceTracker app, along with the other required frameworks.

  4. Add OTHER_LDFLAGS=$(inherited) -framework OpenCV to module.xcconfig (see the sample xcconfig after this list).

  5. Create new TiUIView and TiUIViewProxy subclasses named OpencvView and OpencvViewProxy (a minimal sketch follows the list).

  6. In the new OpencvView class, instantiate a UIViewController that uses OpenCV.
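For step 4, the relevant part of module.xcconfig looks roughly like this; the FRAMEWORK_SEARCH_PATHS line is an assumption about where OpenCV.framework was copied inside the module project:

    // module.xcconfig -- assumes OpenCV.framework sits next to the module's Xcode project
    OTHER_LDFLAGS=$(inherited) -framework OpenCV
    FRAMEWORK_SEARCH_PATHS=$(inherited) "$(SRCROOT)"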
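For steps 5 and 6, here is a minimal sketch of the view classes, assuming the DemoVideoCaptureViewController from the FaceTracker sample is the controller being instantiated (the frameSizeChanged:bounds: override is the layout hook Titanium custom views normally use; everything beyond the class names from the steps above is illustrative):

    // OpencvViewProxy.h -- plain proxy; the view does all the work
    #import "TiUIViewProxy.h"

    @interface OpencvViewProxy : TiUIViewProxy
    @end

    // OpencvView.h
    #import "TiUIView.h"

    @class DemoVideoCaptureViewController;

    @interface OpencvView : TiUIView {
        DemoVideoCaptureViewController *captureController; // OpenCV-backed controller from the sample
    }
    @end

    // OpencvView.m
    #import "OpencvView.h"
    #import "DemoVideoCaptureViewController.h"

    @implementation OpencvView

    // Called by Titanium whenever the view is laid out; create the
    // controller lazily and keep its view filling our bounds.
    - (void)frameSizeChanged:(CGRect)frame bounds:(CGRect)bounds
    {
        if (captureController == nil) {
            captureController = [[DemoVideoCaptureViewController alloc] init];
            [self addSubview:captureController.view];
        }
        [captureController.view setFrame:bounds];
    }

    @end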

The build produces the Titanium module, but when I try to run the module test harness I get the following errors for the OpenCV objects:

Undefined symbols for architecture i386:
  "_CMSampleBufferGetImageBuffer", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CMSampleBufferGetOutputPresentationTimeStamp", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CMTimeMake", referenced from:
      -[VideoCaptureViewController createCaptureSessionForCamera:qualityPreset:grayscale:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferGetBaseAddress", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferGetBaseAddressOfPlane", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferGetHeight", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferGetPixelFormatType", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferGetWidth", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferLockBaseAddress", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "_CVPixelBufferUnlockBaseAddress", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
  "cv::_InputArray::_InputArray(cv::Mat const&)", referenced from:
      -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::_OutputArray::_OutputArray(cv::Mat&)", referenced from:
      -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::CascadeClassifier::load(std::string const&)", referenced from:
      -[DemoVideoCaptureViewController viewDidLoad] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::CascadeClassifier::CascadeClassifier()", referenced from:
      -[DemoVideoCaptureViewController .cxx_construct] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::CascadeClassifier::~CascadeClassifier()", referenced from:
      -[DemoVideoCaptureViewController .cxx_destruct] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::Mat::deallocate()", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
      -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
  "cv::Mat::create(int, int const*, int)", referenced from:
      -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
      -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
  "cv::flip(cv::_InputArray const&, cv::_OutputArray const&, int)", referenced from:
      -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::resize(cv::_InputArray const&, cv::_OutputArray const&, cv::Size, double, double, int)", referenced from:
      -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
  "cv::fastFree(void*)", referenced from:
      -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
      -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
  "cv::transpose(cv::_InputArray const&, cv::_OutputArray const&)", referenced from:
      -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
  "_kCVPixelBufferPixelFormatTypeKey", referenced from:
      -[VideoCaptureViewController createCaptureSessionForCamera:qualityPreset:grayscale:] in libopencv.a(VideoCaptureViewController.o)
ld: symbol(s) not found for architecture i386
clang: error: linker command failed with exit code 1 (use -v to see invocation)


2 Answers


I had a similar error. Just changing the Build Settings didn't help.

In the end I solved it by adding a few frameworks (CoreMedia, CoreVideo, etc.), even though those frameworks aren't used directly in my own code.

So I guess OpenCV needs those frameworks, though I don't know why.
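In a Titanium module the linker flags live in module.xcconfig rather than the Xcode target settings, so the equivalent fix is presumably to extend OTHER_LDFLAGS there. The framework list below is a guess based on the undefined symbols above: the _CM* symbols come from CoreMedia, the _CV* and _kCVPixelBuffer* ones from CoreVideo, the capture code also needs AVFoundation, and CoreGraphics, QuartzCore and Accelerate are included only as likely extras for OpenCV iOS builds of that era:

    // module.xcconfig -- framework list is an assumption; keep only what the linker still reports missing
    OTHER_LDFLAGS=$(inherited) -framework OpenCV -framework AVFoundation -framework CoreMedia -framework CoreVideo -framework CoreGraphics -framework QuartzCore -framework Accelerate

The remaining cv:: symbols come from the OpenCV framework itself, so if those persist, check that -framework OpenCV is actually being found (FRAMEWORK_SEARCH_PATHS) and that the framework binary contains an i386 slice for the simulator.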

Hope this helps :)

Answered 2013-09-17T13:32:13.037

I had a similar problem with the default Apple LLVM compiler on Xcode 4.5.1. Try changing it to GCC (in your build options) and see if that works.

Answered 2012-12-14T18:34:34.037