The provided depth images "aligned_depth_to_color_xxxxxx.png" are of float32 type. Is it possible to obtain absolute depth values in the camera coordinate frame?
The values in the depth image range from 0.0 to 1.0. Are they normalized? How do I unnormalize them to obtain the absolute depth value for each pixel?
Thanks in advance!
You can look at the examples in the Visualizing Sequences section, where we convert the depth images into point clouds. See here and here for getting the depth values in meters in the camera frame.
As mentioned above, depth values are in meters, not normalized.
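For reference, here is a minimal sketch of that conversion: load the 16-bit depth PNG, convert millimeters to meters, and back-project every pixel into the camera frame with a standard pinhole model. The intrinsics (fx, fy, cx, cy) and the file name below are placeholders, not the dataset's actual values; take the real intrinsics from the dataset's calibration files or use the visualization code linked above.

```python
import cv2
import numpy as np

# Hypothetical pinhole intrinsics -- replace with the dataset's calibration.
fx, fy, cx, cy = 615.0, 615.0, 320.0, 240.0

# Depth PNGs are 16-bit; values are millimeters.
depth = cv2.imread("aligned_depth_to_color_000000.png", cv2.IMREAD_ANYDEPTH)
depth_m = depth.astype(np.float32) / 1000.0  # meters

# Back-project every pixel (u, v) with depth Z into the camera frame:
# X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy
h, w = depth_m.shape
u, v = np.meshgrid(np.arange(w), np.arange(h))
z = depth_m
x = (u - cx) * z / fx
y = (v - cy) * z / fy
points = np.stack([x, y, z], axis=-1).reshape(-1, 3)  # camera-frame points in meters
```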
Thanks, @ychao-nvidia, for the response. I looked at the visualization code:
depth = cv2.imread(depth_file, cv2.IMREAD_ANYDEPTH) --> gives depth values in the 0-65535 range (uint16), and
depth = depth.astype(np.float32) / 1000. --> gives depth values in the 0-65.535 range (float32). Since you are dividing by 1000, the raw depth must be in mm (millimetres).
After performing the above operation, if I compute the average depth value over all pixels, I get around 2.97 meters and a max of 65.535 meters. This seems unrealistic for the sequence.
Your interpretation is correct. Depth from RealSense can be noisy and unreliable beyond a certain distance, so I wouldn't trust the depth readings there. 65535 (mm) is just the upper bound of depth.
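One way to get more meaningful statistics is to mask out the unreliable readings before averaging. The sketch below drops zero values (no depth return) and anything beyond an assumed cutoff; the 5 m cutoff and the file name are illustrative choices, not constants from the dataset.

```python
import cv2
import numpy as np

# Assumed cutoff: RealSense depth tends to degrade beyond a few meters.
max_range = 5.0  # meters

depth = cv2.imread("aligned_depth_to_color_000000.png", cv2.IMREAD_ANYDEPTH)
depth_m = depth.astype(np.float32) / 1000.0  # millimeters -> meters

# Keep only pixels with a valid return and within the trusted range.
valid = (depth_m > 0) & (depth_m < max_range)
print("mean depth over valid pixels:", depth_m[valid].mean(), "m")
print("fraction of valid pixels:", valid.mean())
```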