
No pointcloud is published and RGB image is stuck. #1967

Closed
vincent51689453 opened this issue Jul 3, 2021 · 42 comments

@vincent51689453

Currently, I am using an NVIDIA Jetson Xavier NX with JetPack 4.4. I can use realsense-viewer to capture color and depth images from a D435i. However, when I try to publish the RealSense pointcloud and image in ROS Melodic, no point cloud is displayed and the color image freezes after receiving a few frames.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jul 3, 2021

Hi @vincent51689453 Do you receive a point cloud if the method outlined in the pointcloud example of the ROS wrapper documentation is used, please - launching with roslaunch realsense2_camera rs_camera.launch filters:=pointcloud and setting Fixed Frame in RViz to camera_link?

https://github.com/IntelRealSense/realsense-ros#point-cloud

@vincent51689453
Author

vincent51689453 commented Jul 3, 2021

Yes, that is how I publish the point cloud. However, nothing is displayed in RViz. Meanwhile, the RGB image stopped displaying after a while. Interestingly, the depth image was normal!

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jul 3, 2021

Could you try the following steps please:

  1. In the PointCloud2 options of RViz, change 'Color Transformer' from "Intensity" to "RGB8", if that option is available in the Color Transformer settings.

#975 (comment)

  2. Set the Style to Points.

#1327 (comment)
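For reference, RViz's RGB8 Color Transformer reads per-point color from the packed rgb field of the PointCloud2 message. A minimal plain-Python sketch of that packing convention (PCL-style; the function names are illustrative, not from the ROS wrapper):

```python
import struct

def pack_rgb(r, g, b):
    """Pack 8-bit r, g, b into the single float32 'rgb' field
    used by sensor_msgs/PointCloud2 (PCL convention)."""
    rgb_uint32 = (r << 16) | (g << 8) | b
    # Reinterpret the uint32 bit pattern as a float32
    return struct.unpack('f', struct.pack('I', rgb_uint32))[0]

def unpack_rgb(rgb_float):
    """Recover (r, g, b) from the packed float32 field."""
    rgb_uint32 = struct.unpack('I', struct.pack('f', rgb_float))[0]
    return (rgb_uint32 >> 16) & 0xFF, (rgb_uint32 >> 8) & 0xFF, rgb_uint32 & 0xFF

if __name__ == "__main__":
    packed = pack_rgb(200, 120, 40)
    print(unpack_rgb(packed))  # (200, 120, 40)
```

If the wrapper never publishes color frames, this field is never filled, which is consistent with the RGB8 option not appearing in RViz.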

@vincent51689453
Author

When I try to change the Color Transformer, there are no options available to choose.

@MartyG-RealSense
Collaborator

There are a couple of other ROS cases open at the time of writing that have the problem of no selectable Color Transformer options.

#1962 (comment)
#1938 (comment)

A common factor between these two cases is that they are on devices with Arm processors (Raspberry Pi 3B and Jetson Xavier NX). You too are using a Jetson Xavier NX board.

@doronhi, the RealSense ROS wrapper developer, advised that a lack of received color images is responsible.

#1938 (comment)

Hi @doronhi Do you have any advice about these cases of Color Transformer options not being available, with a common factor being boards with an Arm processor, please?

@chrisdalke

@MartyG-RealSense @vincent51689453

I've encountered this same issue with a D415 on a Raspberry Pi 4; I'm able to view the depth image, but the RGB image and color point clouds do not get published. Inspecting the /camera/color/image_raw and /camera/depth/color/points topics shows nothing being published on either topic.
This issue showed up recently, so I reverted to a manually built earlier version of librealsense and realsense-ros, which fixed the issue. Maybe there's a regression or incompatibility in the recent libraries on Arm?

I reverted to v2.41.0 of librealsense, and version 2.2.21 of realsense-ros and rebuilt from git on the Raspberry Pi. This combination does not exhibit this issue.

@MartyG-RealSense
Collaborator

Thanks very much @chrisdalke for the feedback to @vincent51689453 about your own installation experience!

@MartyG-RealSense
Collaborator

Hi @vincent51689453 Do you require further assistance with this case, please? Thanks!

@vincent51689453
Author

Thank you for answering my question.

@MartyG-RealSense
Collaborator

Thanks very much @vincent51689453 for the update!

@JeremieBourque1

I just wanted to share, for anyone else who has this problem, that the version that worked for me was v2.43.0 of librealsense built with CUDA and version 2.2.23 of the ROS wrapper.

It may work with more recent versions before 2.48.0 but I haven't tried them.

I am using a Jetson Xavier NX with JetPack 4.6.

@MartyG-RealSense
Collaborator

This issue has been reopened in order for Intel to evaluate reported FPS lag and missing color issues when the pointcloud filter is enabled on Nvidia Jetson boards. This case should be kept open whilst that review process is active.

@PaddyCube

I just wanted to share, for anyone else who has this problem, that the version that worked for me was v2.43.0 of librealsense built with CUDA and version 2.2.23 of the ROS wrapper.

It may work with more recent versions before 2.48.0 but I haven't tried them.

I am using a Jetson Xavier NX with jetpack 4.6

How did you install this version? Did you build it from source or did you use a deb package? I'm struggling with the same issue on a Jetson Nano with JetPack 4.6 and librealsense 2.50.0, as well as the latest realsense-ros release.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Feb 1, 2022

Hi @PaddyCube I am not @JeremieBourque1 but you could build version 2.43.0 from source code by obtaining the source as a zip file from the 'Assets' file list at the bottom of the page linked to below.

https://github.com/IntelRealSense/librealsense/releases/tag/v2.43.0

IntelRealSense/librealsense#6964 (comment) provides instructions for an RSUSB backend build on Jetson from source code with CMake. This build method is not dependent on Linux versions or kernel versions and does not require patching.
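A minimal sketch of such an RSUSB build from a release tarball, assuming build tools (cmake, libusb-1.0-0-dev, etc.) are already installed; the CMake flags are real librealsense options, but verify versions and paths against the linked instructions:

```shell
# Sketch only: build librealsense v2.43.0 with the RSUSB backend on Jetson.
wget https://github.com/IntelRealSense/librealsense/archive/refs/tags/v2.43.0.tar.gz
tar xzf v2.43.0.tar.gz
cd librealsense-2.43.0
mkdir build && cd build
# RSUSB backend avoids kernel patching; CUDA offloads pointcloud processing.
cmake .. -DFORCE_RSUSB_BACKEND=ON -DBUILD_WITH_CUDA=ON -DCMAKE_BUILD_TYPE=Release
make -j"$(nproc)"
sudo make install
```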

@Samyak-ja-in

I am using an Intel RealSense D435i and a Raspberry Pi 4. I was not getting a pointcloud, and it was solved by setting these argument values in the launch file:

      <arg name="depth_width"       value="640"/>
      <arg name="depth_height"      value="480"/>
      <arg name="depth_fps"         value="6"/>
      <arg name="color_width"       value="640"/>
      <arg name="color_height"      value="480"/>
      <arg name="color_fps"         value="6"/>
      <arg name="enable_depth"      value="true"/>
      <arg name="enable_color"      value="true"/>
      <arg name="enable_infra1"     value="false"/>
      <arg name="enable_infra2"     value="false"/>
      <arg name="enable_fisheye"    value="false"/>
      <arg name="enable_gyro"       value="false"/>
      <arg name="enable_accel"      value="false"/>
      <arg name="enable_pointcloud" value="true"/>
      <arg name="enable_sync"       value="true"/>
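Lowering the resolution and frame rate as above sharply reduces the uncompressed USB bandwidth the Raspberry Pi 4 has to sustain, which is a plausible reason this configuration helps. A back-of-envelope sketch (ignores USB protocol overhead; both Z16 depth and YUY2 color are 2 bytes per pixel):

```python
def stream_bandwidth_mb(width, height, fps, bytes_per_pixel=2):
    """Rough uncompressed bandwidth for one stream, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

# A typical 1280x720 @ 30 FPS depth + color configuration
high = stream_bandwidth_mb(1280, 720, 30) * 2
# The reduced 640x480 @ 6 FPS configuration from the launch file above
low = stream_bandwidth_mb(640, 480, 6) * 2
print(round(high, 1), round(low, 1))  # 110.6 7.4
```

Roughly 110 MB/s versus 7 MB/s, so the reduced configuration leaves far more headroom on a Pi's shared USB bus.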

@MartyG-RealSense
Collaborator

Thanks so much @Samyak-ja-in for sharing with the RealSense ROS community the method that worked for you on Raspberry Pi 4!

@oysteinskotheim

I have been having the same problem. I tried to compile the RealSense SDK version 2.43.0 and ROS drivers 2.2.23.

Now I get the following error if I try to enable the RGB image (same error both in RealSense viewer and using the ROS launch files):

26/10 14:54:33,521 ERROR [547776048928] (synthetic-stream.cpp:48) Exception was thrown during user processing callback: d rs2_deproject_pixel_to_point(float*, const rs2_intrinsics*, const float*, float)
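For context, the callback that throws here is librealsense's pixel-to-point deprojection. The underlying pinhole math (distortion ignored) is a plain-Python sketch with made-up intrinsics, not the SDK implementation:

```python
def deproject_pixel_to_point(pixel, intrin, depth):
    """Pinhole deprojection, the math behind librealsense's
    rs2_deproject_pixel_to_point. intrin holds fx, fy (focal lengths
    in pixels) and ppx, ppy (principal point); depth is in meters."""
    u, v = pixel
    x = (u - intrin["ppx"]) / intrin["fx"]
    y = (v - intrin["ppy"]) / intrin["fy"]
    return [depth * x, depth * y, depth]

if __name__ == "__main__":
    intrin = {"fx": 615.0, "fy": 615.0, "ppx": 320.0, "ppy": 240.0}  # illustrative values
    # The principal-point pixel at 1 m depth maps straight ahead.
    print(deproject_pixel_to_point((320.0, 240.0), intrin, 1.0))  # [0.0, 0.0, 1.0]
```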

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Oct 26, 2022

Hi @oysteinskotheim A RealSense user who had a Jetson TX2 and that exact error message resolved the problem in their particular case by re-flashing their JetPack to 4.4 and rebuilding librealsense, as described at IntelRealSense/librealsense#8546 (comment)

@MartyG-RealSense
Collaborator

Hi @oysteinskotheim Was the advice in the comment above helpful to you, please? Thanks!

@Hypothesis-Z

Hypothesis-Z commented Feb 21, 2023

Was there a commit that fixed this issue on Jetson? A single camera causes 25% CPU usage in my situation. It seems these problems originate from the same bug.

@MartyG-RealSense
Collaborator

Hi @Hypothesis-Z The internal Intel bug report for the Jetson-specific pointcloud issue remains open, but there is no progress to report. Also, the issue will not receive a fix in the ROS1 wrapper (Kinetic, Melodic, Noetic), as development on that wrapper has ceased and it is now called ros1-legacy.

25% CPU usage does not sound excessive, though, for correctly working pointcloud generation on Jetson when CUDA support is enabled.

@RezaOptimotive

What are the latest versions of the wrapper and librealsense that work on Jetson devices?

Also, would following this issue be the way to get notified once the bug fix is released?

@MartyG-RealSense
Collaborator

Hi @RezaOptimotive This issue would be updated once a fix was released, yes.

The latest versions of librealsense and the ROS wrapper at the time of writing this that can be matched together are:

ROS1
librealsense 2.50.0 or 2.51.1 and wrapper 2.3.2

ROS2
librealsense 2.51.1 and wrapper 4.51.1

@RezaOptimotive

Hi @MartyG-RealSense,

Thanks for getting back! I assume librealsense 2.51.1 and wrapper 4.51.1 for ROS2 don't actually publish point clouds correctly.

Based on your comment here, you mentioned that librealsense 2.48.0 and ROS2 wrapper 3.2.2 work for Jetsons. Are those the latest versions that work? Also, would you know if those versions work on the Jetson Nano as well?

@MartyG-RealSense
Collaborator

Using the old ROS2 wrapper 3.2.2 with librealsense 2.48.0 was the only known solution for ROS2 in the past, yes. And yes, any ROS wrapper version is compatible with Jetson.

I do not have a specific reference confirming that this combination works on Nano but cannot foresee a reason why it would not if it worked on another Jetson board model.

@AnOrdinaryUsser

AnOrdinaryUsser commented Mar 14, 2023

I just wanted to share, for anyone else who has this problem, that the version that worked for me was v2.43.0 of librealsense built with CUDA and version 2.2.23 of the ROS wrapper.

It may work with more recent versions before 2.48.0 but I haven't tried them.

I am using a Jetson Xavier NX with jetpack 4.6

Hi, I am new to ROS, and I don't know how to get realsense-ros version 2.2.21. I know I can get the source code in the GitHub releases section, but I don't know how to install it.
I usually install packages using the apt install command. Can you tell me how to do it?

Because right now I have this installed:
librealsense2/bionic,now 2.41.0-5ubuntu51.gbp4f37f2 arm64 [installed, upgradable to: 2.53.1-0~realsense0.703]
librealsense2-udev-rules/bionic,now 2.53.1-0~realsense0.703 arm64 [installed, automatic]
ros-melodic-librealsense2/bionic,now 2.50.0-1bionic.20211115.142954 arm64 [installed]
ros-melodic-realsense2/bionic,now 2.50.0-1bionic.20211115.142954 arm64 [installed]
ros-melodic-realsense2-camera/bionic,now 2.3.2-1bionic.20221025.211354 arm64 [installed]

And according to issue #1967 (comment), I have the correct librealsense version but not the correct realsense-ros version.

I am using a JETSON Nano

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 14, 2023

Hi @AnOrdinaryUsser I would not recommend installing SDK 2.41.0 and wrapper 2.2.21 due to the significant age of these versions. As you are using ROS1 Melodic, an easier workaround than installing an old librealsense version may be to install the latest available SDK and ROS1 wrapper match-up and instead of using the launch file rs_camera.launch, use the rs_rgbd.launch file instead. On ROS1 this has been shown to successfully publish a pointcloud without the problems associated with pointcloud generation in an rs_camera launch on Jetson.

To perform an rs_rgbd launch in ROS1 Melodic, support for an RGBD launch must first be installed using the instruction below in the Ubuntu terminal.

sudo apt-get install ros-melodic-rgbd-launch

For users of Kinetic and Noetic, the installation command for those ROS versions can be found at #2092 (comment)

Once support for an RGBD launch is installed, use the roslaunch command below in the ROS terminal.

roslaunch realsense2_camera rs_rgbd.launch

rs_rgbd.launch should by default generate a point cloud that is published via depth_image_proc - you do not need to enable the pointcloud filter like you do in an rs_camera launch.

@AnOrdinaryUsser


Thanks for everything. Now I have a point cloud working in RViz.

@MartyG-RealSense
Collaborator

That's great to hear, @AnOrdinaryUsser :)

@MartyG-RealSense
Collaborator

This issue is now being closed for the following reasons.

  1. The RealSense ROS1 wrapper no longer receives updates and has been renamed ros1-legacy.

  2. A correctly working pointcloud can be generated in ROS1 using the rs_rgbd.launch launchfile instead of rs_camera.launch.
