2 Replies Latest reply on Apr 26, 2018 3:40 AM by Wiredchop

    Intrinsic camera parameters mismatch


      Hello all,


      I'm programming in C#, capturing the depth, colour and point-cloud data from a RealSense D435 camera. I'm projecting the depth data into 3D space using a standard function that requires the camera's intrinsic parameters.
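For anyone unfamiliar with the projection step, here is a minimal sketch of the standard pinhole deprojection being described. The function and parameter names (`fx`, `fy`, `ppx`, `ppy`) are illustrative, not the SDK's API, and the numeric values are made up for the example:

```python
def deproject(u, v, depth, fx, fy, ppx, ppy):
    """Map a pixel (u, v) with a depth value (in metres) to a 3D point
    using the pinhole model: no distortion terms applied."""
    x = (u - ppx) / fx * depth
    y = (v - ppy) / fy * depth
    return (x, y, depth)

# Example with assumed intrinsics: the principal-point pixel maps
# straight down the optical axis.
point = deproject(640.0, 400.0, 1.0, fx=640.0, fy=640.0, ppx=640.0, ppy=400.0)
# point == (0.0, 0.0, 1.0)
```

Note that this only holds if the distortion coefficients really are zero; otherwise the pixel must be undistorted first.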


      I was getting the intrinsics of the depth camera and noticed that the distortion coefficients were zero (as others have mentioned). Given the wide angle of the lens I thought this was unlikely, but perhaps the factory calibration simply didn't include distortion...
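For context, the distortion coefficients in question follow a Brown-Conrady-style model (two tangential and three radial terms). Below is a sketch of that model, just to show that all-zero coefficients reduce it to the plain pinhole mapping; the function name and coordinate convention are my own, not the SDK's:

```python
def distort_brown_conrady(x, y, k1, k2, p1, p2, k3):
    """Apply Brown-Conrady distortion to normalised image coordinates
    (x, y). k1, k2, k3 are radial terms; p1, p2 are tangential terms."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return (xd, yd)

# With every coefficient zero the model is the identity, i.e. it behaves
# exactly like an undistorted pinhole camera:
print(distort_brown_conrady(0.3, -0.2, 0, 0, 0, 0, 0))
# (0.3, -0.2)
```

So zero coefficients aren't necessarily a bug in themselves: a rectified stream can legitimately report them.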

      To improve on this, I looked into custom calibration so I could get a better lens model. While working through the instructions I used the command-line tool, which can read and write the parameters.


      I dumped the intrinsic parameters to a file with the command:


      Intel.Realsense.CustomRW.exe -r > params.txt


      I got the parameters:



      I was surprised to see that the left and right infrared cameras have distortion coefficients! It then occurred to me that perhaps I should be using the intrinsics from the infrared stream rather than the depth stream. I did that and got:




      This is definitely the IR camera, as there is no depth stream at full resolution (1280 x 800 matches the resolution given by the calibration software).

      Things to note:


      • Still no distortion coefficients!
      • Focal lengths and principal points don't match!


      A few questions:

      • Is this a software bug?
      • Are the parameters different because the image has been manipulated in hardware?
      • Given all this, what parameters should I use to project the point cloud?
      • Does a different implementation of the SDK return distortion coefficients?


      If anyone knows the answer to this, or has solved a similar problem, I'd be really grateful. I'm not sure which parameters to trust!