
experiment setup for good depth map

FDing3
Beginner

Hi Guys,

I want to run an experiment on the counter-movement jump using the D435 and the Nuitrack skeleton tracking SDK. The subject will jump 2 m away from the camera (see figure_1). My aim is to plot various joint angles and displacements from the xyz data of each joint during the jumping phase as accurately as possible, which obviously requires an accurate depth map.

I have decided to use the down-sample and spatial filters to increase the depth-map precision. However, after reading all of Intel's white papers, I still have some questions:

1/ What is the ultimate aim of the spatial filter? My understanding is: reduce RMS error while preserving edges.

2/ As the paper says, the point cloud is the best way to view spatial noise. I can use the RealSense LabVIEW code to view the point cloud of the subject in front of me (see figure_2).

But how would you suggest finding a good set of parameters for the spatial/down-sample filters for my experiment environment using the point cloud? In other words, what is a good standard for the filtered depth map: lower RMS error and clear edges without over-smoothing, or something else?

3/ I read in the paper that RGB color will help the depth calculation. What does this mean? My understanding is: if I want to improve the depth precision, I should wear red, blue, or green, and it is better to wear long trousers and a long-sleeved shirt. Am I right?

4/ The paper says the RMS error will be smaller if we turn off the projector. How should I understand this point?

And if I follow this logic, can I improve the depth quality by using an external projector that projects semi-random dots onto the subject's body? Do the dots have to be infrared? Where can I buy a special IR projector that projects IR dots? I cannot find this kind of product on the market.

5/ I remember that reflective markers reduce the quality of the Kinect depth map; there is a hole at each marker. Does the RealSense have a similar problem around reflective markers?

And if we have to attach some reflective markers during the experiments, is there any method to mitigate this problem?

Thanks!

idata
Employee

Hi jakeding,

Thank you for your interest in the Intel RealSense D435 camera.

Please let me look into it and I will get back to you later.

Regards,

Alexandra
FDing3
Beginner

Thanks! Hope to hear from you soon.

idata
Employee

Hello jakeding,

We will answer your questions individually. It seems that you have read the Depth Post-Processing white paper at https://www.intel.com/content/www/us/en/support/articles/000028866/emerging-technologies/intel-realsense-technology.html. We also recommend reading the BKMs for Tuning white paper at https://www.intel.com/content/www/us/en/support/articles/000027833/emerging-technologies/intel-realsense-technology.html, which may answer more of your questions and give you further guidance.

1. Yes, the spatial filter applies edge-preserving smoothing of depth data, while minimizing the RMS error.

2. Please review the BKMs for Tuning white paper for more guidance on using these filters; there is also a short code sketch at the end of this reply showing how the filters can be applied and tuned.

3. RGB color is not currently incorporated into the post-processing, so wearing red, green, or blue would not currently affect depth.

4. You can use visible or infrared projectors, so yes, you can use a regular front projector that you can buy on Amazon, for example:

https://www.amazon.com/AAXA-M6-Projector-Built-Battery/dp/B06ZZ3MPR3/ref=sr_1_1?s=electronics&ie=UTF8&qid=1536685349&sr=1-1&keywords=aaxa+m6

Or try contacting AMS about their different IR projectors.

5. To the extent that the reflective spot gives a hot spot that saturates the detector, then yes, these cameras will also give zero depth at that point. But in general you should be able to manage the exposure or laser power and avoid this.
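
For reference, here is a minimal Python sketch of the down-sample and spatial filter chain, using the pyrealsense2 wrapper rather than your LabVIEW setup. The stream resolution and the filter option values are only illustrative starting points that you would tune for your own scene, not recommended settings:

```python
# Minimal sketch (pyrealsense2): decimation (down-sample) + spatial filter chain.
# All parameter values below are illustrative starting points, not recommendations.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
pipeline.start(config)

# Decimation filter: reduces resolution, which also reduces RMS noise.
decimation = rs.decimation_filter()
decimation.set_option(rs.option.filter_magnitude, 2)

# Spatial filter: edge-preserving smoothing of the depth data.
spatial = rs.spatial_filter()
spatial.set_option(rs.option.filter_magnitude, 2)
spatial.set_option(rs.option.filter_smooth_alpha, 0.5)
spatial.set_option(rs.option.filter_smooth_delta, 20)

try:
    for _ in range(300):
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # Apply decimation first, then spatial smoothing.
        filtered = decimation.process(depth)
        filtered = spatial.process(filtered)
        # 'filtered' can now be exported as a point cloud and inspected visually.
finally:
    pipeline.stop()
```

You can vary the filter_magnitude, filter_smooth_alpha, and filter_smooth_delta values while watching the point cloud to judge the trade-off between residual noise and over-smoothing.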

 

 

Let me know if you have further questions.

Regards,

Alexandra
FDing3
Beginner

Hi Alexandra,

Your reply is very helpful! After reading all the Intel papers you recommended, I would still like to discuss the following questions a little further:

1. Since RGB color will not help the depth measurement, do we need to care about clothing color at all in an attempt to increase accuracy?

2. I am confused about the definition of "active stereo". Could you look at figure 1 in the attachment and tell me which active stereo technology is used in the D4XX?

In other words, what is the purpose of the laser pattern in the D4XX? Does it act as a structured-light pattern, or does it just add texture to help the two cameras distinguish different points in space?

3. You said in your last post: "To the extent that the reflective spot gives a hot spot that saturates the detector, then yes, these cameras will also give zero depth at that point. But in general you should be able to manage the exposure or laser power and avoid this."

I do not fully understand this explanation; could you explain it in more detail? Thanks.

4. Can I put a square box 2 meters away from the camera and then adjust the spatial filter until the point cloud is neither over-smoothed nor too noisy?

I appreciate it!

Best,

Jake

idata
Employee

Hi jakeding,

Please find below the answers to each question.

1. Texture is more important than color. For example, it is better to wear differently colored clothes than a completely blue shirt.

 

 

2. In the D400 Series datasheet linked below, Section 2.3 on page 12 answers your question. The laser pattern from the IR projector adds texture to help the two cameras distinguish different points in space.

https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-D400-Series-Datasheet.pdf

 

 

3. A reflective spot may cause over-exposure of the sensors. It is the same as taking a picture of a bright light with a regular camera. When there is bright, uniform light on a reflective surface, such as glare from the sun or a reflective spot, depth cannot be detected because there is no texture. Remember the importance of texture. You can play with the exposure settings (or the laser power) to reduce this effect; see the sketch at the end of this reply.

 

 

4. Yes, there is no reason you would not be able to do this. The sketch below shows one rough way to quantify the depth noise on such a target while you adjust the filters.
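
Here is a rough Python sketch (again using the pyrealsense2 wrapper) of how you might fix the exposure and laser power and then measure the depth noise over a small region of interest on the box. The exposure, laser power, and ROI values are placeholders that you would adjust for your own setup, and the simple standard deviation over the ROI is only a rough proxy for the plane-fit RMS error described in the tuning paper, assuming the box face is roughly parallel to the camera:

```python
# Rough sketch (pyrealsense2): manual exposure / laser power plus a simple
# noise check on a region of interest (ROI) covering a flat target such as a box.
# Exposure, laser power, and ROI values are placeholders for your own setup.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
profile = pipeline.start(config)

depth_sensor = profile.get_device().first_depth_sensor()
depth_scale = depth_sensor.get_depth_scale()               # meters per depth unit
depth_sensor.set_option(rs.option.enable_auto_exposure, 0)
depth_sensor.set_option(rs.option.exposure, 8500)          # microseconds (placeholder)
depth_sensor.set_option(rs.option.laser_power, 150)        # placeholder value
# depth_sensor.set_option(rs.option.emitter_enabled, 0)    # uncomment to turn the dot projector off

roi = (slice(200, 280), slice(380, 460))                    # rows, cols covering the box face (placeholder)

means, noise = [], []
try:
    for _ in range(100):
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # The spatial filter from the earlier sketch could be applied here to compare
        # filtered and unfiltered noise (note that the decimation filter changes the
        # resolution, so the ROI indices would then need to be scaled).
        patch = np.asanyarray(depth.get_data())[roi].astype(np.float32) * depth_scale
        patch = patch[patch > 0]                             # drop invalid (zero-depth) pixels
        if patch.size:
            means.append(patch.mean())
            noise.append(patch.std())                        # spatial spread within the ROI
finally:
    pipeline.stop()

if means:
    print("mean ROI depth: %.4f m, mean ROI std: %.4f m" % (np.mean(means), np.mean(noise)))
```

Running this once with your chosen filter settings and once without them gives you a simple before/after comparison at the 2 m working distance of your experiment.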

 

 

Regards,

Alexandra
FDing3
Beginner

Thank you very much! You have answered all my questions perfectly.
