I am working with the KITTI dataset. I take two images and compute the disparity to get a 3D point cloud. The problem I am facing is that I cannot get a good disparity map: most of the disparity values are less than 0.1, and all of them fall between 0 and 1 (do I need to scale them?). The parameters of my stereo matcher are listed below:
cv::StereoBM sbm;
sbm.state->SADWindowSize = 9;
sbm.state->numberOfDisparities = 112;
sbm.state->preFilterSize = 5;
sbm.state->preFilterCap = 1;
sbm.state->minDisparity = 0;
sbm.state->textureThreshold = 5;
sbm.state->uniquenessRatio = 5;
sbm.state->speckleWindowSize = 0;
sbm.state->speckleRange = 20;
sbm.state->disp12MaxDiff = 64;
cv::Mat disp, disp8;
sbm(leftimage, rightimage, disp);
cv::normalize(disp, disp8, 0.1, 255, CV_MINMAX, CV_8U);
The disparity map you have "looks" good for Block Matching.
Block Matching is the most basic method for obtaining disparity maps. It is a local method that computes each disparity estimate via a brute-force search over candidate shifts (plus some pre- and post-filtering in OpenCV's implementation). Hence its output is limited in accuracy and is typically noisy.
As others have mentioned, you can adjust the window size to improve the results slightly, but this will not make the disparity significantly better.
Look at the stereo evaluation on the KITTI benchmark and select a more accurate algorithm if you need one. OpenCV has an implementation of semi-global matching (cv::StereoSGBM), which produces smoother disparities. The quality of disparity map you need depends on your application: in some cases block matching is sufficient; in others it is not.
Remember the definition of disparity: the difference between the x-coordinate of a pixel in the left image and the x-coordinate of its corresponding pixel in the right image. That is, the unit of disparity is pixels.
Larger disparities mean closer objects. When you scale the disparity map for display, larger disparities appear brighter. For example, the sign on the road is closer to the camera, so it appears brighter than pixels far away on the road.
Your disparity values are not supposed to be between 0 and 1. You are scaling the image to uint8 for display, which is fine for visualization, but the result is not suitable for using the disparity as an actual measurement.
In OpenCV, the default behavior is to produce the disparity map as a signed 16-bit integer, with the subpixel disparity multiplied by 16 (fixed-point encoding). To obtain the true disparity values, divide OpenCV's output by 16 and convert to float.
You can do something like this:
cv::Mat_<float> true_dmap = disp * (1.0 / 16.0f);
or
disp.convertTo(true_dmap, CV_32F, 1.0/16.0, 0.0);
Or, you can call reprojectImageTo3D to get a point cloud given the estimated disparity map and stereo calibration.
Note: if you attempt to display true_dmap via imshow, you will not see anything meaningful; scale it for display first.
Good luck,