9 Replies Latest reply on Sep 7, 2017 2:14 PM by kazoo_kmt

    How to get the same RGB images from two different RealSense cameras attached to the same location?


      I'd like to get the same RGB images when I swap one RealSense for another. In other words, the first RealSense is attached at a particular position, such as the tip of the end effector or the top of the robot station. Then it is removed and the second RealSense is attached at the same location. For now, I assume the extrinsic parameters of both cameras are the same. I'm using two RealSense cameras (SR300).


      If I don't do any image processing, the images from the first and the second cameras are slightly different. Even after undistorting both images using the intrinsic calibration, they still aren't the same. What is the right way to transform the picture from the second RealSense so that it looks the same as the image from the first RealSense? I don't care too much if the pictures don't match in the peripheral area, but I do want the two images to match near the center.
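For reference, the lens warping that the undistortion step removes can be sketched with the Brown-Conrady (plumb-bob) radial model; the k1, k2 coefficients below are made-up illustrative values, not real SR300 calibration data:

```python
import numpy as np

def distort(xn, yn, k1=-0.1, k2=0.02):
    """Apply radial (Brown-Conrady) distortion to a normalized image
    point (xn, yn). Undistortion inverts this mapping, e.g. iteratively
    or with a library routine such as OpenCV's cv2.undistort."""
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

# At the image center (r = 0) the model is the identity; the effect
# grows toward the periphery, which is why the edges match worst.
print(distort(0.0, 0.0))  # -> (0.0, 0.0)
```

Since each camera has its own distortion coefficients, both images have to be undistorted with their own calibration before any purely intrinsic correction can make them comparable.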


      I'm thinking of using the intrinsic parameters of both cameras: taking each pixel of the second sensor's image in homogeneous coordinates, multiplying by the inverse of the second sensor's intrinsic matrix, and then multiplying by the first sensor's intrinsic matrix. Does this work correctly?
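That idea amounts to warping image 2 by the homography H = K1 · K2⁻¹, which is valid once both images are undistorted and the extrinsics really are identical. A minimal sketch (the intrinsic values here are placeholders, not real SR300 calibration):

```python
import numpy as np

# Hypothetical 3x3 intrinsic matrices for the two color cameras
# (fx, fy on the diagonal; cx, cy in the last column).
K1 = np.array([[615.0,   0.0, 310.0],
               [  0.0, 615.0, 245.0],
               [  0.0,   0.0,   1.0]])
K2 = np.array([[612.0,   0.0, 318.0],
               [  0.0, 613.0, 238.0],
               [  0.0,   0.0,   1.0]])

# Homography that re-renders camera 2's undistorted image as if it
# had been taken with camera 1's intrinsics: H = K1 * K2^-1.
H = K1 @ np.linalg.inv(K2)

def map_pixel(H, u, v):
    """Map a pixel (u, v) from image 2 into image 1's pixel coordinates."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Sanity check: camera 2's principal point should land on camera 1's.
print(map_pixel(H, 318.0, 238.0))  # -> approximately (310.0, 245.0)
```

To warp a whole image rather than single pixels, the same H can be passed to something like OpenCV's cv2.warpPerspective(img2, H, (width, height)). Note this only removes the intrinsic difference; any residual misalignment between the two images would come from the mounting not being exactly repeatable (i.e. the extrinsics not actually being identical).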