I have a stereo camera made of two webcams that I use in MATLAB. I calibrated the cameras and obtained stereoParams.
Now I would like the user to be able to select a point in the picture and get the real-world position of that point. I know that for this I need the baseline, the focal length, and the pixel disparity. I have the pixel disparity, but how do I obtain the baseline and the focal length? Can the baseline be computed from stereoParams?
I am not familiar with MATLAB's stereo camera calibration functions, but in general, once you have calibrated each camera and found the fundamental matrix, you should be able to do the following:
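To address the baseline and focal length part of the question directly, here is a minimal MATLAB sketch. It assumes stereoParams is a stereoParameters object exposing the TranslationOfCamera2 and CameraParameters1.FocalLength properties (property names can vary between releases), and uses the standard depth-from-disparity relation Z = f * B / d, which only holds in rectified, row-aligned image coordinates:

```matlab
% Baseline: distance between the two camera centres, in the world units
% used during calibration (e.g. millimetres of the checkerboard squares).
baseline = norm(stereoParams.TranslationOfCamera2);

% Focal length in pixels (x component of camera 1).
focalLengthPx = stereoParams.CameraParameters1.FocalLength(1);

% Depth of a point from its pixel disparity d.
d = 25;                                  % example disparity value, in pixels
Z = focalLengthPx * baseline / d;        % depth, in calibration world units
```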
The "pixel" disparity is defined in rectified image coordinates. However, as your real cameras will not normally be exactly parallel and row-aligned, there is a non-identity transformation that rectifies your input camera images. Therefore you need to "undo" the rectification in order to find the pixel in the other image corresponding to a given one. The procedure is as follows:
Note that all these operations need to be performed only once per pixel and can be cached. In other words, you can pre-compute a "rectified" 2-channel disparity map that, for each pixel, yields the offset from its coordinates in one image to the corresponding pixel in the other image. The map itself can be stored as an image whose channel type depends on the disparity range; a short integer is usually enough, as it can represent offsets of ±32K pixels.
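As an illustration of this caching idea, here is a hedged MATLAB sketch. The function mapRectifiedToOther below is a hypothetical placeholder for whatever un-rectification math applies to your setup; the point being demonstrated is only the pre-computed int16 two-channel offset map:

```matlab
% Pre-compute a 2-channel offset map: for each pixel (r, c) of image 1,
% store the (column, row) offset to the corresponding pixel in image 2.
[h, w] = size(disparityMap);
offsetMap = zeros(h, w, 2, 'int16');     % short integers cover +/- 32K pixels

for r = 1:h
    for c = 1:w
        d = disparityMap(r, c);
        if ~isnan(d)
            % mapRectifiedToOther is a hypothetical helper that undoes the
            % rectification and returns the matching pixel in the other image.
            [c2, r2] = mapRectifiedToOther(c, r, d);
            offsetMap(r, c, 1) = int16(round(c2 - c));
            offsetMap(r, c, 2) = int16(round(r2 - r));
        end
    end
end
```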
You can use the reconstructScene function, which gives you 3-D world coordinates for every pixel with a valid disparity. In this example, you look up the 3-D coordinates of the centroid of a detected person.
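For completeness, here is a sketch of that pipeline, assuming a calibrated stereoParams and an RGB image pair I1/I2; the exact signatures (for example, whether reconstructScene takes stereoParams or a reprojection matrix returned by rectifyStereoImages) depend on your MATLAB release:

```matlab
% Rectify the image pair using the calibration result.
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);

% Compute a disparity map on the rectified grayscale images;
% unreliable pixels are set to NaN.
disparityMap = disparitySGM(rgb2gray(J1), rgb2gray(J2));

% Back-project every pixel with a valid disparity to 3-D world coordinates
% (same units as the calibration pattern, typically millimetres).
points3D = reconstructScene(disparityMap, stereoParams);

% 3-D position of a user-selected pixel, e.g. picked with ginput on J1.
row = 240; col = 320;                    % example pixel coordinates
xyz = squeeze(points3D(row, col, :));    % [X Y Z] of the selected point
```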