I am trying to use OpenCV's findFundamentalMat() to compute the fundamental matrix from two sets of matched points. The input images are distorted, so I detect keypoints and match them first (a rough sketch of that step is at the end of this post). I thought that running the matched points through undistortPoints() would give a better fundamental matrix (I know the camera's intrinsic parameters), but after undistortPoints() the results of findFundamentalMat() are strange. First, every point is marked as an inlier in the output mask. Second, the error is huge. I compute the error like this:
vector<Point2f> points1Raw; // raw pixel coordinates taken from the keypoints
vector<Point2f> points1;    // undistorted points
vector<Point2f> points2Raw;
vector<Point2f> points2;
for (size_t k = 0; k < matches.size(); k++) {
    points1Raw.push_back(keypoints1[matches[k].queryIdx].pt);
    points2Raw.push_back(keypoints2[matches[k].trainIdx].pt);
}
undistortPoints(points1Raw, points1, cameraMatrixm, distCoeffsm);
undistortPoints(points2Raw, points2, cameraMatrixm, distCoeffsm);

vector<uchar> states; // inlier mask filled by RANSAC
Mat f = findFundamentalMat(points1, points2, FM_RANSAC, 3, 0.99, states);

double err = 0;
for (size_t k = 0; k < matches.size(); k++) {
    // homogeneous coordinates of the k-th correspondence
    Mat p1(3, 1, CV_64F);
    p1.at<double>(0, 0) = points1[k].x;
    p1.at<double>(1, 0) = points1[k].y;
    p1.at<double>(2, 0) = 1;
    Mat p2(1, 3, CV_64F);
    p2.at<double>(0, 0) = points2[k].x;
    p2.at<double>(0, 1) = points2[k].y;
    p2.at<double>(0, 2) = 1;
    Mat res = abs(p2 * f * p1);      // epipolar residual |p2^T * F * p1|, f is the computed matrix
    if (states[k])                   // match considered an inlier (in my strange case, all of them)
        err += res.at<double>(0, 0); // accumulate the error
}
The accumulated total error ends up somewhere around 100 to 1000, or even more, although the epipolar constraint says this residual should be close to zero for a correct correspondence. Yet when I manually inspect the matches before computing the fundamental matrix, most of them look correct. What am I doing wrong? :/
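For completeness, the keypoints1, keypoints2 and matches used above come from a standard detect-and-match pipeline, roughly like the sketch below. The ORB detector, the brute-force Hamming matcher and the file names are just placeholders for this post, not necessarily what I actually use; cameraMatrixm and distCoeffsm come from a separate calibration step.

#include <opencv2/opencv.hpp>
using namespace cv;
using namespace std;

int main() {
    // Load the two (distorted) input images; file names are placeholders.
    Mat img1 = imread("left.png", IMREAD_GRAYSCALE);
    Mat img2 = imread("right.png", IMREAD_GRAYSCALE);

    // Detect keypoints and compute descriptors. ORB is only an example here;
    // any detector produces the keypoints1/keypoints2 used in the code above.
    Ptr<ORB> orb = ORB::create(2000);
    vector<KeyPoint> keypoints1, keypoints2;
    Mat descriptors1, descriptors2;
    orb->detectAndCompute(img1, noArray(), keypoints1, descriptors1);
    orb->detectAndCompute(img2, noArray(), keypoints2, descriptors2);

    // Brute-force Hamming matching with cross-check to drop one-way matches.
    BFMatcher matcher(NORM_HAMMING, true);
    vector<DMatch> matches;
    matcher.match(descriptors1, descriptors2, matches);

    // ... the undistortPoints / findFundamentalMat / error-accumulation code
    // from above goes here, using keypoints1, keypoints2 and matches.
    return 0;
}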