
Control depth color stream distance #2071

Closed
ranjitkathiriya opened this issue Sep 9, 2021 · 7 comments

@ranjitkathiriya

ranjitkathiriya commented Sep 9, 2021

Currently, I am able to get red and yellow colors on my nearest object. Is it possible in ROS to extend this coloring to a greater distance, around 2 meters? I am not able to capture the last location correctly, so I would like something like this. Is it possible?

Screenshot from 2021-09-09 10-30-56

@MartyG-RealSense
Collaborator

Hi @ranjitkathiriya Because the near-range detail at the sides of the image is blue, that makes me think that the center detail may be returning incorrect depth values, and so the colorization over distance does not need to be adjusted. Would it be possible to provide an RGB image of the scene that the camera is observing, please, so that we can see whether the observed object has features that may be causing bad depth readings?

@ranjitkathiriya
Author

ranjitkathiriya commented Sep 9, 2021

Screenshot from 2021-09-09 11-13-55

Actually, these are my points: from the colorized depth stream I am finding the center of the cow teat. Because of the green color I can find the far teat, but if the color had been red or yellow up to a certain distance, the result would have been better.
Thanks.

Currently, this combination with the colorized depth stream gives 92% accuracy when attaching the cup with the robotic arm, but to increase it further I want to control the depth coloring with respect to a certain distance. Is that possible?

From the camera, the rear cow teat and everything behind it (the rest of the cow's body) have the same green color, and I want to make a small adjustment there to increase my accuracy.

@MartyG-RealSense
Collaborator

Given how well lit the scene is, you may be able to achieve improved accuracy if you turn off the IR emitter and the dot pattern that is being projected onto the udders and the underbelly section nearest to the camera. When a scene is well lit, the camera can use the ambient light that is present to analyze surfaces for depth detail instead of analyzing the projected dots cast onto the surface of the object.

<rosparam>
/camera/stereo_module/emitter_enabled: false
</rosparam>

In the librealsense SDK the color grading over distance can be altered by changing the depth unit scale value of the camera. Changing the scale from its default of 0.001 to a value of 0.0001 is a useful method for improving the depth image when performing close-range sensing. This is illustrated in IntelRealSense/librealsense#8228, but Doronhi, the RealSense ROS wrapper developer, explains in #277 (comment) that the depth unit scale cannot be changed in the ROS wrapper.
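For reference, if you ever access the camera directly through the SDK's Python wrapper (pyrealsense2) instead of the ROS wrapper, a minimal sketch of changing the depth unit scale might look like the lines below. The 0.0001 value is only the close-range example from the linked case, not a recommendation for your scene.

import pyrealsense2 as rs

# Find the first connected camera and its depth sensor
ctx = rs.context()
device = ctx.query_devices()[0]
depth_sensor = device.first_depth_sensor()

# Depth unit scale: 0.001 is the default (1 unit = 1 mm); 0.0001 is the close-range example value
if depth_sensor.supports(rs.option.depth_units):
    depth_sensor.set_option(rs.option.depth_units, 0.0001)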

@ranjitkathiriya
Author

Given how well lit the scene is, you may be able to achieve improved accuracy if you turn off the IR emitter and the dot pattern that is being projected onto the udders

Based on your point, can you please explain whether I will still get a point cloud if I turn this off? For identifying 3D coordinates I do require a point cloud.

@MartyG-RealSense
Collaborator

Yes, you can obtain a point cloud if the IR emitter is turned off. The emitter is a component inside the camera, separate from the depth sensors, that projects a semi-random dot pattern onto surfaces in the scene. It is useful for analyzing surfaces with no texture or low texture detail, as the camera can use the dots as a texture source to analyze for depth. In a dimly lit scene, turning off the pattern may significantly reduce the detail in the depth image. If the surfaces are well lit though, like in your image, you may not need the dot pattern.
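As a rough illustration of that point (a pyrealsense2 sketch rather than ROS code, with an assumed 640x480 depth stream), the point cloud is still generated after the emitter is switched off:

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
profile = pipeline.start(config)

# Turn the IR emitter off (0 = off); depth then relies on ambient lighting and surface texture
depth_sensor = profile.get_device().first_depth_sensor()
if depth_sensor.supports(rs.option.emitter_enabled):
    depth_sensor.set_option(rs.option.emitter_enabled, 0)

pc = rs.pointcloud()
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    points = pc.calculate(depth)  # a point cloud is still produced with the emitter off
    print("Generated", points.size(), "points")
finally:
    pipeline.stop()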

Point 5 of the section of Intel's camera tuning guide linked to below estimates that RMS error (error over distance) can be reduced by around 30% if the dot pattern is turned off.

https://dev.intelrealsense.com/docs/tuning-depth-cameras-for-best-performance#section-verify-performance-regularly-on-a-flat-wall-or-target

@ranjitkathiriya
Author

Thanks for providing the solution.

<rosparam>
/camera/stereo_module/emitter_enabled: false
</rosparam>

How can I turn this off? Can you please explain how to do that?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 9, 2021

Do you mean how you can implement it? If so, you would add the code to your launch file by opening the launch file with a text editor, such as Ubuntu's Gedit, and inserting the lines there.

If you wanted to test turning the IR emitter off with dynamic_reconfigure without having to edit the launch file, then I believe the command to input into the ROS terminal after the launch has completed would be:

rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0
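
If you would rather set it from a small Python node than from the terminal, a minimal dynamic_reconfigure client sketch (assuming the default /camera namespace) would be:

#!/usr/bin/env python
import rospy
import dynamic_reconfigure.client

rospy.init_node('emitter_off')

# Connect to the stereo module's dynamic_reconfigure server and disable the emitter (0 = off)
client = dynamic_reconfigure.client.Client('/camera/stereo_module', timeout=10)
client.update_configuration({'emitter_enabled': 0})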
