- OpenCV: 4.1.0 (with the "contrib" modules)
- Swift: 5
- iOS: 12.2
I am trying to run OpenCV's cv::aruco::detectMarkers method on every frame from the iPhone camera. This works, but after about a minute it crashes with the error: Thread 8: EXC_BAD_ACCESS (code=1, address=0x10dea0000)

I have included what I think are the two most relevant parts of the app, the UIViewController extension and the Objective-C wrapper, and I have marked the two lines that trigger the exception with comments. As far as I can tell this is not a concurrency issue, since everything should be running synchronously on the main thread.
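The capture-session setup is not included below, but for reference the delegate is attached along these lines (a generic sketch with placeholder names such as attachVideoOutput, not the exact project code); the relevant point is that captureOutput(_:didOutput:from:) is called on whatever queue is passed to setSampleBufferDelegate(_:queue:), so it only runs on the main thread if DispatchQueue.main is used there.

import AVFoundation

extension ViewController {
    // Generic sketch, not the exact project code: the queue passed to
    // setSampleBufferDelegate is the queue the delegate callbacks run on.
    func attachVideoOutput(to session: AVCaptureSession) {
        let videoOutput = AVCaptureVideoDataOutput()
        // Drop frames that arrive while a previous frame is still being processed.
        videoOutput.alwaysDiscardsLateVideoFrames = true
        // Deliver BGRA pixel buffers, which CIImage handles directly.
        videoOutput.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        // captureOutput(_:didOutput:from:) is dispatched onto this queue.
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }
}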
Here is the resulting thread backtrace:
* thread #8, queue = 'com.apple.root.default-qos', stop reason = EXC_BAD_ACCESS (code=1, address=0x10dea0000)
* frame #0: 0x000000010505c700 Camera`cv::pointSetBoundingRect(cv::Mat const&) + 432
frame #1: 0x000000010505c8c0 Camera`cvBoundingRect + 236
frame #2: 0x0000000104fdf168 Camera`cvFindNextContour + 4348
frame #3: 0x0000000104fe00fc Camera`cvFindContours_Impl(void*, CvMemStorage*, CvSeq**, int, int, int, CvPoint, int) + 1008
frame #4: 0x0000000104fe118c Camera`cv::findContours(cv::_InputArray const&, cv::_OutputArray const&, cv::_OutputArray const&, int, int, cv::Point_<int>) + 972
frame #5: 0x0000000104fe1bb0 Camera`cv::findContours(cv::_InputArray const&, cv::_OutputArray const&, int, int, cv::Point_<int>) + 96
frame #6: 0x000000010507df68 Camera`cv::aruco::DetectInitialCandidatesParallel::operator()(cv::Range const&) const + 2056
frame #7: 0x0000000104f8e068 Camera`(anonymous namespace)::ParallelLoopBodyWrapper::operator()(cv::Range const&) const + 248
frame #8: 0x0000000104f8df5c Camera`(anonymous namespace)::block_function(void*, unsigned long) + 32
frame #9: 0x0000000105318824 libdispatch.dylib`_dispatch_client_callout2 + 20
This is how I set up the AVCaptureVideoDataOutputSampleBufferDelegate, which receives each frame as a CMSampleBuffer, converts it to a UIImage, and sends the UIImage to OpenCV for ArUco marker detection.
extension ViewController : AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection) {
        let image : UIImage = self.sample_buffer_to_uiimage(sampleBuffer: sampleBuffer)
        // call out to opencv wrapper, which eventually blows up
        let annotated_image : UIImage = OpenCVWrapper.drawMarkers(image)
        self.imageView.image = annotated_image
    }

    func sample_buffer_to_uiimage(sampleBuffer:CMSampleBuffer) -> UIImage
    {
        let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let cimage : CIImage = CIImage(cvPixelBuffer: imageBuffer)
        let context:CIContext = CIContext.init(options: nil)
        let cgImage:CGImage = context.createCGImage(cimage, from: cimage.extent)!
        let image:UIImage = UIImage.init(cgImage: cgImage)
        return image
    }
}
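For comparison, here is a sketch of the same CMSampleBuffer-to-UIImage conversion written to reuse a single CIContext rather than allocating one per frame, and to return nil instead of force-unwrapping; sharedCIContext and uiImage(from:) are names introduced for the sketch, not code from the project.

import AVFoundation
import CoreImage
import UIKit

// Sketch only: same CMSampleBuffer -> CIImage -> CGImage -> UIImage pipeline,
// but with one shared CIContext and optional returns instead of force-unwraps.
private let sharedCIContext = CIContext(options: nil)

func uiImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}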
This is how I set up the Objective-C OpenCV wrapper method:
+(UIImage *) drawMarkers:(UIImage *)image {
    cv::Mat colorImageRGBA;
    cv::Mat colorImage;
    cv::Mat grayImage;

    UIImageToMat(image, colorImageRGBA);
    cvtColor(colorImageRGBA, grayImage, cv::COLOR_BGR2GRAY);
    cvtColor(colorImageRGBA, colorImage, cv::COLOR_RGBA2RGB);

    cv::Ptr<cv::aruco::Dictionary> dictionary = cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);
    std::vector<int> markerIds;
    std::vector<std::vector<cv::Point2f>> markerCorners;

    // this is the line that blows up
    cv::aruco::detectMarkers(grayImage, dictionary, markerCorners, markerIds);

    if (markerIds.size() > 0) {
        cv::aruco::drawDetectedMarkers(colorImage, markerCorners, markerIds);
    }

    return MatToUIImage(colorImage);
}
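Since the wrapper is called directly from the capture callback, one design point worth stating explicitly: UIKit properties such as imageView.image must be set on the main thread. If the delegate were ever moved off the main queue, the call site would need to hop back, roughly like this sketch (process(_:) is an illustrative name, not code from the project):

import UIKit

extension ViewController {
    // Illustrative sketch: run the OpenCV wrapper on the current (non-main)
    // queue and push the annotated image back to the main queue for display.
    func process(_ frame: UIImage) {
        let annotated = OpenCVWrapper.drawMarkers(frame)
        DispatchQueue.main.async {
            self.imageView.image = annotated
        }
    }
}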