
I am trying to use OpenCV's findFundamentalMat() to get the fundamental matrix for two sets of matched points. The images are distorted, so I detect keypoints and match them. I thought that using undistortPoints would give a better fundamental matrix (I know the camera intrinsics), but after undistortPoints, findFundamentalMat produces strange results. First, every point in the resulting mask array is treated as an inlier. Second, the error is huge. I compute the error like this:

        vector<Point2f> points1Raw; //Raw points from Keypoints
        vector<Point2f> points1; //Undistorted points
        vector<Point2f> points2Raw;
        vector<Point2f> points2;
        for(size_t k = 0; k < matches.size(); k++) {
            points1Raw.push_back(keypoints1[matches[k].queryIdx].pt);
            points2Raw.push_back(keypoints2[matches[k].trainIdx].pt);
        }

        undistortPoints(points1Raw, points1, cameraMatrixm, distCoeffsm);
        undistortPoints(points2Raw, points2, cameraMatrixm, distCoeffsm);

        vector<uchar> states;

        Mat f = findFundamentalMat(points1, points2, FM_RANSAC, 3, 0.99, states);

        //For all k matches
        double err = 0;
        for(size_t k = 0; k < points1.size(); k++) {
            Mat p1(3, 1, CV_64F);
            p1.at<double>(0, 0) = points1[k].x;
            p1.at<double>(1, 0) = points1[k].y;
            p1.at<double>(2, 0) = 1;
            Mat p2(1, 3, CV_64F);
            p2.at<double>(0, 0) = points2[k].x;
            p2.at<double>(0, 1) = points2[k].y;
            p2.at<double>(0, 2) = 1;

            Mat res = abs(p2 * f * p1); // epipolar residual with the computed matrix f

            if((bool)states[k]) // if the match is considered an inlier (in my strange case, all of them)
                err = err + res.at<double>(0, 0); // accumulate errors
        }

The resulting total error is somewhere around 100 to 1000 or even more. Yet when I inspect the matches manually before computing the fundamental matrix, most of them look correct. What am I doing wrong? :/


2 Answers


How many points do you have? To be counted as an inlier, a point is allowed an error of up to (in your case) 3 pixels, ... so if you have a few hundred points, it is no surprise that the accumulated error is large.
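
For a sense of scale, here is a minimal sketch (not from the original answer) that reports the mean epipolar error per inlier instead of the raw sum, reusing the points1, points2, f and states variables from the question:

    double errSum = 0;
    int inlierCount = 0;
    for (size_t k = 0; k < points1.size(); k++) {
        if (!states[k]) continue; // skip outliers
        // homogeneous points: x1 as a column vector, x2 as a row vector
        cv::Mat x1 = (cv::Mat_<double>(3, 1) << points1[k].x, points1[k].y, 1);
        cv::Mat x2 = (cv::Mat_<double>(1, 3) << points2[k].x, points2[k].y, 1);
        // epipolar residual |x2^T * F * x1|
        errSum += std::abs(cv::Mat(x2 * f * x1).at<double>(0, 0));
        inlierCount++;
    }
    // a couple of pixels per inlier is plausible with a 3-pixel RANSAC threshold
    double meanErr = inlierCount ? errSum / inlierCount : 0;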

Answered 2013-03-07T22:41:20.457

After undistortPoints, the coordinates are no longer in pixel units, so a threshold of 3 doesn't make much sense. You could try using:

0.006 * maxVal

as in: http://www.learningace.com/doc/568776/daa602b585fb296681f344b08bc808f0/snavely_ijcv07, section 4.1

double minVal, maxVal;
cv::minMaxIdx(points1, &minVal, &maxVal);
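
If it helps, here is a sketch (my own addition, not part of the original answer) of plugging that scaled threshold straight into the RANSAC call, reusing maxVal from the snippet above and the points1, points2 and states variables from the question:

    // RANSAC threshold expressed in normalized (undistorted) coordinates
    Mat f = findFundamentalMat(points1, points2, FM_RANSAC,
                               0.006 * maxVal, 0.99, states);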
Answered 2014-07-07T12:15:46.030