DeepStream: A Demo for LiDAR-Camera Fusion

DeepStream is an embedded mobile platform equipped with a Livox Horizon LiDAR and a HIK camera. The platform is built on an Nvidia Jetson Xavier NX and will be used to deploy multi-sensor-based algorithms in the future.

Full Demonstration - Video

Content

Environment

  1. OS: Ubuntu 20.04, zsh (oh-my-zsh)

  2. Install PyQt5, ROS Noetic, and Anaconda3 (Python 3.6.13)

  3. Install the Livox ROS driver: please refer to the official instructions

    If the installation is successful, the directory should look like this:

    |--- path_to_livox_driver/
         |--- devel/
         |--- src/
         |--- build/

    Set xfer_format to 2 in livox_ros_driver/launch/livox_lidar.launch (details can be found in the official instructions for the Livox driver)
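    For reference, the relevant part of the launch file looks roughly like this (an illustrative excerpt, not the full file; check your driver version for the exact argument list and value meanings):

    ```xml
    <launch>
      <!-- xfer_format selects the point cloud transfer format published by the
           driver; change the default from 0 to 2 as required by this project -->
      <arg name="xfer_format" default="2"/>
      <!-- ... other args and the livox_ros_driver node are omitted ... -->
    </launch>
    ```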

  4. Install HIK camera driver:

    Visit the official website, then download and extract the machine vision industrial camera client "MVS V2.1.0 (Linux)" (listed as 机器视觉工业相机客户端MVS V2.1.0 (Linux)).

    Select the appropriate package for your platform, e.g. MVS-2.1.0_x86_64_20201228.deb

    sudo dpkg -i MVS-2.1.0_x86_64_20201228.deb

    Note: after installation, you should find the SDK files under /opt/MVS/Samples/64/Python/. Verify that this path exists; otherwise, update the imported package path in path_to_deepstream/deepstream/hik_cam/hik_cam_linux.py.

Calibration Process

We conduct the LiDAR-camera calibration according to the official instructions.

1. Requirements

(figure: calibration requirements)

2. LiDAR Camera Assembly

The base connecting the LiDAR to the camera is an acrylic plate; you can refer to the following design for customization.

(figures: acrylic base plate design drawings)

To minimize parameter adjustment during calibration, it is suggested to follow the assembly shown below:

(figure: recommended assembly)

3. Calibrate Camera Intrinsic and Distortion Matrix

Calibration process (Zhang Zhengyou's calibration method):

  1. Initial corner detection
  2. Further extract sub-pixel corners
  3. Draw corner observations
  4. Camera calibration
  5. Re-project 3D points in space to evaluate calibration effect
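Step 5 above (re-projecting 3D points to evaluate the calibration) can be sketched in plain NumPy. This is a minimal illustration of the pinhole model with Brown distortion coefficients, not the project's actual code; in practice the corner detection and the optimization are done with OpenCV (e.g. cv2.findChessboardCorners, cv2.cornerSubPix, cv2.calibrateCamera):

```python
import numpy as np

def project_points(obj_pts, R, t, K, dist):
    """Pinhole projection with Brown distortion coefficients (k1, k2, p1, p2, k3)."""
    k1, k2, p1, p2, k3 = dist
    cam = obj_pts @ R.T + t                      # board frame -> camera frame
    x, y = cam[:, 0] / cam[:, 2], cam[:, 1] / cam[:, 2]
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    u = K[0, 0] * xd + K[0, 2]                   # normalized -> pixel coordinates
    v = K[1, 1] * yd + K[1, 2]
    return np.stack([u, v], axis=1)

def reprojection_rms(observed, obj_pts, R, t, K, dist):
    """RMS distance (pixels) between detected corners and re-projected board points."""
    err = observed - project_points(obj_pts, R, t, K, dist)
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```

A perfect calibration gives an RMS of zero on noise-free synthetic corners; the 0.014 value reported below comes from real detections, so it is small but nonzero.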

Data collection:

We use an 85-inch TV in a dark room to display the calibration pictures, and collected 35 frames from various viewpoints:

Calibration result -- reprojection error:

0.014394424473247082 (error within the allowable range)

Distortion correction results:

Note:

  1. Zhang's calibration method is very sensitive to the environment. Therefore, we use a TV screen in a dark room as the calibration board.
  2. Insufficiently diverse shooting angles or too few pictures will both degrade accuracy.
  3. Exposure intensity needs to be carefully adjusted during shooting, otherwise the corner points cannot be detected.

4. LiDAR Camera Calibration

Calibration method:

Camera LiDAR calibration algorithm based on plane constraint.
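A minimal, hypothetical sketch of the plane-fitting step that the plane constraint relies on — estimating the calibration board's plane from its LiDAR points — using an SVD least-squares fit (the full extrinsic solver is more involved):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) point cloud.

    Returns (unit normal n, offset d) such that n . p + d = 0 for points on the plane.
    """
    centroid = points.mean(axis=0)
    # The right-singular vector with the smallest singular value of the
    # centered points is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    return n, -float(n @ centroid)
```

Once the board plane is known in both the LiDAR frame (from this fit) and the camera frame (from the checkerboard pose), each captured group contributes a plane correspondence that constrains the extrinsic R and t.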

Data collection:

We use a 1 m x 1.5 m TV as the calibration board. The distance between the sensors and the TV is about 3 meters. We collected 15 groups of data.

Note:

  1. It is very important to accumulate multiple overlapping frames of point cloud data, i.e. record the .bag file for about 10 s; a point cloud that is too sparse will greatly damage the calibration accuracy.
  2. Manually marked image corner points are not accurate enough, which reduces the accuracy of the annotated corners and thus of the extrinsic calibration. Sub-pixel optimization can be used to refine the manually selected pixels and solve this problem.

Calibration effect:

5. Parameters Setting

Save all parameters under the params directory:

├─params
│  ├─intrinsic.txt
│  ├─extrinsic.txt
│  └─distort.txt
├─README.md

Example:

  • distort.txt

    -0.09732168794279916 0.1023514279653091 0.0003060722976587038 0.0004451042048329594 -0.01596420627741748 
    
  • intrinsic.txt

    1723.253969255113 0 1521.018607274423 0 0 1723.714922079786 1029.277181487616 0 0 0 1 0
    
  • extrinsic.txt

    0.0151031  -0.999863  -0.0068105  -0.0239579 -0.01768  0.00654316  -0.999822  0.0519217 0.99973  0.0152208  -0.0175788  -0.0108779 0  0  0
    

    Note: These parameters cannot be transferred directly to your setup, due to differences in equipment and assembly, so you need to calibrate yourself.
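A hedged sketch of how these files might be consumed (the file names come from the tree above; the value counts suggest the intrinsic file stores a row-major 3x4 projection matrix and the extrinsic file starts with a row-major 3x4 [R|t] mapping LiDAR coordinates into the camera frame — both are assumptions, not documented facts):

```python
import numpy as np

def load_params(intrinsic_txt, extrinsic_txt, distort_txt):
    """Parse the whitespace-separated parameter strings shown above."""
    vals = lambda s: np.array(s.split(), dtype=float)
    P = vals(intrinsic_txt)[:12].reshape(3, 4)    # assumed 3x4 projection matrix
    K = P[:, :3]                                  # 3x3 camera matrix
    Rt = vals(extrinsic_txt)[:12].reshape(3, 4)   # assumed [R | t]; trailing values ignored
    dist = vals(distort_txt)                      # k1 k2 p1 p2 k3
    return K, Rt, dist

def lidar_to_pixel(p_lidar, K, Rt):
    """Project one LiDAR-frame point to pixel coordinates (distortion ignored here)."""
    p_cam = Rt[:, :3] @ p_lidar + Rt[:, 3]        # LiDAR frame -> camera frame
    u, v = (K @ p_cam)[:2] / p_cam[2]             # perspective division
    return float(u), float(v)
```

In practice you would read the strings with e.g. `open("params/intrinsic.txt").read()` and check `p_cam[2] > 0` before projecting, so that points behind the camera are rejected.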

Running Process

  1. Open a Linux terminal and start ROS

    roscore
  2. Connect your laptop or development board to the camera via USB.

  3. Connect your laptop or development board to the LiDAR via the Ethernet port, and change the host IP to 192.168.1.50; more details can be found in the official instructions.

  4. Run LiDAR driver

    cd path_to_livox_driver/
    source devel/setup.zsh # if you use bash, replace this with devel/setup.bash
    roslaunch livox_ros_driver livox_lidar_msg.launch
  5. Change camera-related path in deepstream/hik_cam.py.

    # append dll path
    ## aarch64 
    #sys.path.append("/opt/MVS/Samples/aarch64/Python/MvImport") 
    ## 64-bit Linux
    sys.path.append("/opt/MVS/Samples/64/Python/MvImport") 
  6. Run the main program. If you use the development board, please select the dual-core running mode, or it will be very slow.

    python3 application.py
    

Co-developer