
Is the alignment or filtering algorithm done in SW (not the RealSense SoC) in librealsense? #11677

Closed
muonkmu opened this issue Apr 14, 2023 · 2 comments
muonkmu commented Apr 14, 2023

Hi, I'm a beginner in systems engineering and I'm not familiar with camera systems. Our team is designing a robot and is using the system below:

  • HOST: x64 - Intel i7 without GPU
  • OS: Ubuntu 20.04
  • Sensor: D435 and others

We are applying the following options to librealsense to get an image (a simplified sketch of the pipeline follows the list):

  • RS2_FORMAT_BGR8/Z16
  • Decimation Filter
  • Holes Filling filter
  • align_to(RS2_STREAM_COLOR)
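
For reference, here is roughly how we apply these options; a simplified sketch rather than our real code (the resolutions, frame rates and the ordering of alignment vs. filters are placeholders):

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    rs2::pipeline pipe;
    rs2::config cfg;
    // Placeholder resolutions / frame rates - our real settings differ.
    cfg.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_BGR8, 30);
    cfg.enable_stream(RS2_STREAM_DEPTH, 640, 480, RS2_FORMAT_Z16, 30);
    pipe.start(cfg);

    rs2::decimation_filter decimation;
    rs2::hole_filling_filter hole_filling;
    rs2::align align_to_color(RS2_STREAM_COLOR);

    while (true)
    {
        rs2::frameset frames = pipe.wait_for_frames();

        // Align depth to the color stream, then post-process the aligned depth.
        // (We are still experimenting with the ordering of alignment vs. filters.)
        frames = align_to_color.process(frames);
        rs2::depth_frame depth = frames.get_depth_frame();
        rs2::frame filtered = decimation.process(depth);
        filtered = hole_filling.process(filtered);

        // ... hand the BGR8 color frame and the filtered depth to our other algorithms
    }
    return 0;
}
```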

Because many algorithms run on the host PC, we are facing a shortage of computing resources. I would like to know the following:

  • The RealSense RGB camera only has Bayer/YUYV output. Does librealsense perform the RGB conversion in software?
  • Are the filter application and alignment calculation also performed in software by librealsense, rather than in hardware inside the RealSense camera?
  • Is there a way to offload the above processing to hardware inside the RealSense camera to accelerate it?

We are also considering an FPGA-based accelerator to process the YUYV images and Z16 data received directly from V4L2. If the filter application or alignment calculation is processed in software, where can I find the code for this?

MartyG-RealSense (Collaborator) commented Apr 14, 2023

Hi @muonkmu. Alignment and filters are processed on the computer's CPU and not in the camera hardware. RealSense 400 Series cameras contain Vision Processor D4 hardware to process data, which enables the camera to be used with low-end computers / computing devices without a strong GPU, but alignment and filters will place a burden on the CPU.
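
If you want to quantify that CPU burden on your particular host, one simple check is to time a processing block's process() call in your own application. Below is a minimal sketch that times alignment, assuming the pipeline's default depth + color configuration:

```cpp
#include <librealsense2/rs.hpp>
#include <chrono>
#include <iostream>

int main()
{
    rs2::pipeline pipe;
    pipe.start();  // default configuration (depth + color on most models)

    rs2::align align_to_color(RS2_STREAM_COLOR);

    for (int i = 0; i < 100; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();

        auto t0 = std::chrono::steady_clock::now();
        rs2::frameset aligned = align_to_color.process(frames);  // runs on the host CPU
        auto t1 = std::chrono::steady_clock::now();

        std::cout << "align: "
                  << std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count()
                  << " us\n";
    }
    return 0;
}
```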

The RealSense SDK also supports 'headless' text-based camera applications and tools that do not have a requirement for graphics support.


The raw RGB of RealSense cameras is Bayered YUY, but the visual stream output by the RealSense SDK can be in a range of supported formats such as RGB8, BGR8 and YUYV.


Conversion of raw frames is handled by the camera's Vision Processor D4 hardware.
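
If you would like to see exactly which stream formats your own camera offers through the SDK, you can enumerate its stream profiles from code. A minimal sketch:

```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::context ctx;
    for (auto&& dev : ctx.query_devices())
    {
        std::cout << dev.get_info(RS2_CAMERA_INFO_NAME) << "\n";
        for (auto&& sensor : dev.query_sensors())
        {
            std::cout << "  " << sensor.get_info(RS2_CAMERA_INFO_NAME) << "\n";
            for (auto&& profile : sensor.get_stream_profiles())
            {
                // Only video streams have a resolution; motion streams are skipped.
                if (auto video = profile.as<rs2::video_stream_profile>())
                {
                    std::cout << "    " << profile.stream_name() << " "
                              << video.width() << "x" << video.height()
                              << " @ " << profile.fps() << " fps, format "
                              << rs2_format_to_string(profile.format()) << "\n";
                }
            }
        }
    }
    return 0;
}
```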


The RealSense SDK has in-built support for GLSL Processing Blocks, which are 'vendor neutral' (they should work with any GPU brand) and accelerate processing by offloading work from the CPU onto the GPU. GLSL is best used with the C++ programming language. It may also not provide a noticeable improvement when used on low-end computers / computing devices.

#3654 discusses the advantages and disadvantages of using GLSL and how to apply it.

The RealSense SDK also has a C++ example program for GLSL.

https://github.com/IntelRealSense/librealsense/tree/master/examples/gl
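
As a very rough illustration of what using the GLSL blocks looks like in code, here is a sketch based on that example. It assumes the SDK was built with -DBUILD_GLSL_EXTENSIONS=true, that GLFW is available to create the required OpenGL context, and that rs2::gl::init_processing accepts a GLFWwindow* as it does in the example; the extension also provides GPU versions of other blocks (for example a YUY decoder, and align in newer SDK versions), so please check rs_processing_gl.hpp in your own build for the exact set and signatures.

```cpp
#include <librealsense2/rs.hpp>
#include <librealsense2-gl/rs_processing_gl.hpp>  // rs2::gl::* processing blocks
#include <GLFW/glfw3.h>

int main()
{
    // The GLSL blocks need a live OpenGL context; a hidden GLFW window is one way to get one.
    glfwInit();
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* gl_ctx = glfwCreateWindow(640, 480, "gl-context", nullptr, nullptr);
    glfwMakeContextCurrent(gl_ctx);
    rs2::gl::init_processing(gl_ctx, true);  // true = use the GLSL (GPU) code paths

    rs2::pipeline pipe;
    pipe.start();

    // GPU-accelerated counterparts of the regular CPU processing blocks.
    rs2::gl::colorizer colorizer;  // depth -> color-mapped image, computed on the GPU
    rs2::gl::pointcloud pc;        // depth -> point cloud, computed on the GPU

    for (int i = 0; i < 100; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::depth_frame depth = frames.get_depth_frame();
        rs2::frame colorized = colorizer.process(depth);
        rs2::points points = pc.calculate(depth);
        // ... use 'colorized' / 'points' exactly as you would the CPU versions
    }

    glfwDestroyWindow(gl_ctx);
    glfwTerminate();
    return 0;
}
```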


The SDK also has support for CUDA graphics acceleration for computers / computing devices with Nvidia GPUs, such as the Nvidia Jetson range of Arm architecture computing boards. CUDA provides automatic acceleration for three specific types of operation: YUY to RGB color conversion, alignment and pointclouds.


Standard Linux tools can receive RealSense camera data if the SDK is built with the V4L2 backend (its default backend), though the image will not be as nicely colored as the stream in a RealSense application.
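
If you do go down the route of reading YUYV and Z16 directly from V4L2 for the FPGA idea, you can check which pixel formats each /dev/video* node advertises with a few standard V4L2 ioctls. A minimal sketch; the device path is a placeholder, and RealSense cameras expose several video nodes:

```cpp
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>
#include <cstring>

int main(int argc, char** argv)
{
    // Placeholder device path - pass the node you are interested in as an argument.
    const char* dev = (argc > 1) ? argv[1] : "/dev/video0";
    int fd = open(dev, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    v4l2_fmtdesc fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

    // VIDIOC_ENUM_FMT walks the list of advertised formats. On RealSense nodes
    // you should typically see YUYV on the color node and Z16 on the depth node.
    while (ioctl(fd, VIDIOC_ENUM_FMT, &fmt) == 0)
    {
        std::printf("%u: %c%c%c%c (%s)\n", fmt.index,
                    (int)(fmt.pixelformat & 0xFF),
                    (int)((fmt.pixelformat >> 8) & 0xFF),
                    (int)((fmt.pixelformat >> 16) & 0xFF),
                    (int)((fmt.pixelformat >> 24) & 0xFF),
                    (const char*)fmt.description);
        fmt.index++;
    }

    close(fd);
    return 0;
}
```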


The SDK's code for filters and alignment can be found in the src/proc folder of its source code.

https://github.com/IntelRealSense/librealsense/tree/master/src/proc

muonkmu (Author) commented Apr 17, 2023

Thank you for your reply. It was helpful to me.

muonkmu closed this as completed Apr 17, 2023