Gyro values keeps moving? #10371

Closed
MyOtherNamesWereTaken opened this issue Apr 6, 2022 · 40 comments
Comments

@MyOtherNamesWereTaken

MyOtherNamesWereTaken commented Apr 6, 2022

Camera Model | D435i
Firmware Version | 05.13.00.50
Operating System & Version | Linux (Ubuntu 14/16/17)
Kernel Version (Linux Only) | 5.11.0-38-generic
Platform | PC
Language | Python
Segment | Robot/Smartphone/VR/AR/others

For some reason my gyro value on Y keeps going up while X and Z go down. The camera lies on a flat, steady surface with the Z axis pointing upwards.
# Assumes: the pipeline has already been started with gyro and accel streams enabled,
# gyro_data()/accel_data() convert the rs motion data into [x, y, z] values, and
# first = True, last_ts_gyro = 0 and total_x = total_y = total_z = 0 are set beforehand.
try:
    last_frame = None
    while True:
        f = pipeline.wait_for_frames()
        for frame in f:
            # gather IMU data
            if frame.is_motion_frame():
                if frame.get_profile().stream_type() == rs.stream.gyro:
                    gyro = np.array([gyro_data(frame.as_motion_frame().get_motion_data())]).T
                if frame.get_profile().stream_type() == rs.stream.accel:
                    accelwithg = np.array([accel_data(frame.as_motion_frame().get_motion_data())]).T

        ts = f.get_timestamp()

        # the first frameset only initialises the timestamp
        if first:
            first = False
            last_ts_gyro = ts
            continue

        if ts == last_ts_gyro:
            continue

        # calculation for the second frameset onwards:
        # integrate the gyro angular velocity (rad/s) over dt (timestamps are in ms)
        dt = (ts - last_ts_gyro) / 1000
        last_ts_gyro = ts

        gyro_angle_x = gyro[0] * dt
        gyro_angle_y = gyro[1] * dt
        gyro_angle_z = gyro[2] * dt
        # running totals (radians)
        total_x += gyro_angle_x
        total_y += gyro_angle_y
        total_z += gyro_angle_z

        # rad => degree
        dtotal_x = np.rad2deg(total_x) % 360
        dtotal_y = np.rad2deg(total_y) % 360
        dtotal_z = np.rad2deg(total_z) % 360
finally:
    # stop streaming when the loop exits or an error occurs
    pipeline.stop()
@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 6, 2022

Hi @MyOtherNamesWereTaken Have you performed a calibration of the IMU using Intel's Python calibration tool at the link below and then saved that calibration to storage inside the camera, please?

https://github.com/IntelRealSense/librealsense/tree/master/tools/rs-imu-calibration

@MyOtherNamesWereTaken
Author

MyOtherNamesWereTaken commented Apr 6, 2022

waiting for realsense device...
Device PID: 0B3A
Device name: Intel RealSense D435I
Serial number: *
Product Line: D400
Firmware version: 05.13.00.50
Start interactive mode:

Error: (25, 'Inappropriate ioctl for device')

Process finished with exit code 0

This is what I'm getting when I try to run it.

@MartyG-RealSense
Collaborator

That is the problem that you reported a week ago at #10356 when asking about accel. Did you try an RSUSB build of librealsense as suggested in that case, please?

@MyOtherNamesWereTaken
Author

MyOtherNamesWereTaken commented Apr 6, 2022

How would I go about doing that? I am a little clueless with this. Also, how do I check if I'm using that method?

@MartyG-RealSense
Collaborator

When you built librealsense the last time, did you build it from source code using CMake and build the Python wrapper at the same time by including the CMake build term -DBUILD_PYTHON_BINDINGS:bool=true?

If so then you can build librealsense in RSUSB mode by adding the term -DFORCE_RSUSB_BACKEND=true to the CMake build instruction. For example:

cmake ../ -DFORCE_RSUSB_BACKEND=true -DBUILD_PYTHON_BINDINGS:bool=true -DCMAKE_BUILD_TYPE=release -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=true

When building with RSUSB, you do not need to apply a patch to the kernel before you build librealsense.

In regard to checking if you were previously using RSUSB: if you built librealsense from source code on a PC and did not use -DFORCE_RSUSB_BACKEND=true then the build was not using RSUSB.

@MyOtherNamesWereTaken
Author

I just used import pyrealsense2 as rs?

@MartyG-RealSense
Collaborator

On Linux the Python pyrealsense2 wrapper has to be installed by some method (pip install pyrealsense2, building it from source code on its own, or building librealsense and the wrapper together from source code at the same time). A pyrealsense2 script should not work if the wrapper has not been installed.

For example, if librealsense was installed from packages instead of being built from source code then it should not have pyrealsense2 included.

Can you describe the method that you use to install librealsense, please?
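
As a quick way to confirm whether the wrapper is currently usable, a minimal check along these lines (just a suggested sketch, not part of the calibration tool) should import pyrealsense2 and list any connected RealSense devices; if the import itself fails, the wrapper is not installed:

# Minimal sketch: confirm the pyrealsense2 wrapper is installed and can see a camera.
import pyrealsense2 as rs

ctx = rs.context()
devices = ctx.query_devices()
if devices.size() == 0:
    print("pyrealsense2 imported correctly, but no RealSense device was detected")
for dev in devices:
    print(dev.get_info(rs.camera_info.name),
          dev.get_info(rs.camera_info.serial_number),
          dev.get_info(rs.camera_info.firmware_version))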

@MyOtherNamesWereTaken
Author

It was built by someone before me on Linux with pip install pyrealsense2, is what I've been told, if that helps?

@MartyG-RealSense
Collaborator

That is helpful, yes.

The easiest way to perform a librealsense build that has pyrealsense2 support may therefore be:

  1. Build the SDK without pyrealsense2 included using the simple libuvc backend method in the link below.

https://github.com/IntelRealSense/librealsense/blob/master/doc/libuvc_installation.md

  2. Once the SDK has been successfully installed, perform the pip install pyrealsense2 procedure to add the pyrealsense2 wrapper.

@MyOtherNamesWereTaken
Author

So I did steps 1-5 from the link you sent me and then did the pip install, but it still gives me the same error.

@MartyG-RealSense
Collaborator

Okay, thanks for testing. So the librealsense SDK is installing correctly but the problem is occurring when installing the pyrealsense2 wrapper afterwards?

@MyOtherNamesWereTaken
Author

As in, after trying to run the calibration tool I'm still getting the same error as I did before. Did I do something wrong in the process? Sorry for being this clueless.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 7, 2022

Don't worry, it's no trouble at all.

If you installed librealsense using the libuvc_installation.md instructions then your librealsense build should now be based on RSUSB. However, you are still experiencing the Inappropriate ioctl for device error. This would suggest that the problem is not related to the kernel, as RSUSB should be bypassing the kernel.

I have not used this calibration tool myself. My interpretation of the instructions for running the calibration script is as follows though:

  1. Download the source code of the librealsense SDK. This can be done from the 'Assets' file list of the SDK releases page by clicking on the Source code (zip) link.

https://github.com/IntelRealSense/librealsense/releases/tag/v2.50.0

(screenshot: the 'Source code (zip)' link in the Assets list of the releases page)

  2. Once the source code zip file is downloaded, extract the librealsense folder named librealsense-2.50.0 from it and navigate to the following folder location:

librealsense-2.50.0 > tools > rs-imu-calibration

In this folder should be the script file rs-imu-calibration.py

  3. Whilst in this folder, input the Python command below to launch the script.

python rs-imu-calibration.py


Is the above procedure similar to what you have been doing, please?

@MyOtherNamesWereTaken
Author

Thank you so much! It actually opens and does stuff now. Now I just go by the readme file included in the rs-imu-calibration folder, right?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 7, 2022

Great news that you made significant progress!

Yes, you can use that readme. In my opinion though, the PDF version of the guide provides a clearer, more detailed explanation of each step of the calibration process; the relevant instructions start on page 16 of the PDF.

https://dev.intelrealsense.com/docs/imu-calibration-tool-for-intel-realsense-depth-camera

@MyOtherNamesWereTaken
Author

Thank you so much. I managed to calibrate it thanks to you. Am I supposed to save the calibration files in a certain spot, or just in the folder the .py is in?

@MartyG-RealSense
Collaborator

Once all stages of calibration have been completed and the tool asks if you want to write the calibration, say Yes to store it inside the camera hardware (not on the computer). Otherwise the calibration will be lost when you close the tool and you will have to repeat the calibration.

@MyOtherNamesWereTaken
Author

So I did save it in the camera folder; however, my acceleration is still measured at 9.41, and when comparing the results to those without calibration it seems like nothing has changed. Am I supposed to move the build folder into my project folder where my Python code is?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 11, 2022

I believe that you should use the instruction RS2_OPTION_ENABLE_MOTION_CORRECTION to enable motion data correction now that you have stored an IMU calibration inside the camera hardware.

The RealSense SDK's rs-imu-calibration.py script provides an example of setting the status of this option in Python, though you should set the option to '1' (enabled) instead of '0' (disabled).

https://github.com/IntelRealSense/librealsense/blob/master/tools/rs-imu-calibration/rs-imu-calibration.py#L264-L266
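
A rough Python sketch of how that could look with your existing pipeline setup (the stream configuration below just mirrors your own script; setting the option after pipeline.start() is an assumption on my part rather than the only way to do it):

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)
config.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 63)
profile = pipeline.start(config)

# Enable motion correction on every sensor that reports support for the option
# (on a D435i this is expected to be the motion module).
for sensor in profile.get_device().query_sensors():
    if sensor.supports(rs.option.enable_motion_correction):
        sensor.set_option(rs.option.enable_motion_correction, 1)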

@MyOtherNamesWereTaken
Author

MyOtherNamesWereTaken commented Apr 11, 2022

That fixed the issue, thanks! Do you happen to know how accurate the gyro & accelerometer are on this device?

Never mind, it seems that when putting it on different axes it measures different values for the acceleration, varying between 9.4 and 10.1 m/s².

Just to clarify: the calibration data was supposed to be saved to disk, i.e. entering nothing when the terminal asked for a footer?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 11, 2022

When rs-imu-calibration.py asks if you want to store the IMU calibration at the end of the calibration process and you select Yes, it is saved to an internal permanent storage space inside the camera and not to a file on the computer. If the calibration is not saved to the camera then it will be lost when the calibration program is closed or you perform another calibration.

@MyOtherNamesWereTaken
Author

MyOtherNamesWereTaken commented Apr 11, 2022

Do you happen to know how accurate the gyro & accelerometer are on this device?

Never mind, it seems that when putting it on different axes it measures different values for the acceleration, varying between 9.4 and 10.1 m/s²

I just recalibrated it; however, I'm still facing these issues. I also enabled motion correction, which did help.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 11, 2022

IMU data can be noisy and sensitive. It is possible to write code to weight the data towards the gyro instead of the accelerometer to make it less sensitive. There is only a C++ example of this though, in the SDK's rs-motion IMU program.

If your problem is with the gyro sensitivity though then you can also do the opposite and weight the data towards the accelerometer.

https://github.com/IntelRealSense/librealsense/blob/master/examples/motion/rs-motion.cpp#L117-L118
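
For reference, a rough Python adaptation of that weighting idea (a complementary filter). The alpha value matches the one used in rs-motion, but the angle bookkeeping and axis handling below are illustrative assumptions rather than a direct port:

import numpy as np

ALPHA = 0.98  # weight given to the gyro-integrated angles, as in rs-motion

def complementary_update(theta, gyro_rates, accel_angles, dt):
    # theta: current [x, y, z] angle estimate in radians
    # gyro_rates: gyro angular velocity [x, y, z] in rad/s
    # accel_angles: angles implied by the accelerometer's gravity vector (radians);
    #               the axis parallel to gravity cannot be estimated from accel alone,
    #               so pass the gyro-integrated value for that axis
    # dt: time since the previous gyro sample, in seconds
    gyro_step = np.asarray(theta, dtype=float) + np.asarray(gyro_rates, dtype=float) * dt
    return ALPHA * gyro_step + (1.0 - ALPHA) * np.asarray(accel_angles, dtype=float)

A higher ALPHA trusts the gyro more (smoother, but more drift); a lower ALPHA trusts the accelerometer more (less drift, but noisier and more sensitive to vibration).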

@MyOtherNamesWereTaken
Author

Also, when using it like this

context = rs.context()
pipeline = rs.pipeline(context)
config = rs.config()
rs.option.enable_motion_correction, 1
config.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)
config.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 63)
profile = pipeline.start(config)

My IDE says that the motion correction statement seems to have no effect?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 11, 2022

The script does not appear to be telling the camera to apply the option to its IMU component, which would normally be done with something like imu_sensor.set_option, because the IMU is a separate component from the depth sensor inside the camera. If the sensor you want to configure is not the depth sensor (the default), the script needs to specify which of the three separate stream sensors in the D435i (depth, RGB, IMU) it is accessing.

self.imu_sensor.set_option(rs.option.enable_motion_correction, 1)

@MyOtherNamesWereTaken
Author

It says that self is an unresolved reference; however, when I import it, it says that it can't find the imu_sensor reference.

@MartyG-RealSense
Collaborator

Thanks very much for your patience while I conducted further research. It seems that RS2_OPTION_ENABLE_MOTION_CORRECTION is a software setting that tells the script whether or not to correct the motion data and not an option that needs to access the IMU sensor. So your original implementation rs.option.enable_motion_correction, 1 was likely correct. I do apologize.

@MyOtherNamesWereTaken
Author

What would be the best way to go about removing gravity from the accelerometer data?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 11, 2022

IntelRealSense/realsense-ros#2268 (comment) describes how VINS-Fusion, which some RealSense ROS D435i owners have used, has a parameter called g_norm to adjust the gravity magnitude value.

I understand that the robot_localization ROS module used in Intel's D435i SLAM guide for ROS can also provide gravity removal. An example reference for this is at cra-ros-pkg/robot_localization#501

This is a Python case rather than a RealSense ROS one though.

You may be able to dampen the effect of the accelerometer's y-gravity by using a Python adaptation of the rs-motion technique mentioned earlier, but weighting the data strongly towards the gyro can result in drift in the data.
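
Outside of librealsense, one general-purpose approach (not RealSense-specific, and only a rough sketch) is to estimate the gravity component with a low-pass filter over the raw accelerometer samples and subtract it to approximate the linear acceleration:

import numpy as np

class GravityRemover:
    # Very rough gravity estimate via an exponential low-pass filter. This only
    # behaves reasonably while the camera is static or moving gently; sustained
    # accelerations will leak into the gravity estimate.
    def __init__(self, alpha=0.98):
        self.alpha = alpha      # closer to 1.0 = slower-moving gravity estimate
        self.gravity = None     # running estimate of the gravity vector (m/s^2)

    def update(self, accel_xyz):
        a = np.asarray(accel_xyz, dtype=float)
        if self.gravity is None:
            self.gravity = a.copy()
        else:
            self.gravity = self.alpha * self.gravity + (1.0 - self.alpha) * a
        return a - self.gravity  # approximate gravity-free acceleration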

@MyOtherNamesWereTaken
Author

So there's no Python implementation to get a proper rotation matrix, or any sort of method to filter out the gravity?

@MartyG-RealSense
Collaborator

There is not a gravity removal mechanism for the IMU in librealsense.

A RealSense team member advises in #6385 (comment) that whilst you may be able to obtain the IMU rotation matrix using the gyro and accel data, the data will be noisy and so the RealSense T265 Tracking Camera model (which is now retired) is better suited to this task.

In #4391 (comment) the same RealSense team member advises to a user who created a Python adaptation of the C++ rs-motion program that "in certain situations you can extract orientation roll and pitch from the acceleration values and use them to compensate for the gyro drifts. But this is only possible if (a) you have the IMU unit perfectly calibrated, and (b) the sum of forces applied to the camera equals G (e.g the device's is either completely static or moves with a constant linear velocity)".
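
As a purely illustrative sketch of that last point (the exact axis mapping depends on the D435i motion frame and how the camera is mounted, so treat the roll/pitch assignment below as an assumption to verify against your own data):

import numpy as np

def roll_pitch_from_accel(ax, ay, az):
    # Valid only when the sum of forces on the IMU is just gravity, i.e. the
    # camera is static or moving at constant linear velocity (see the advice above).
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay * ay + az * az))
    return roll, pitch  # radians; yaw cannot be recovered from gravity alone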

@MyOtherNamesWereTaken
Author

MyOtherNamesWereTaken commented Apr 11, 2022

So, in other words, there's no way to accurately (i.e. to within ±5 cm) track position with just the gyro and accelerometer?

@MartyG-RealSense
Collaborator

Even with a specialized T265 Tracking Camera (see the link below) it is unlikely that you would achieve that very fine degree of position tracking accuracy, so for a D435i I would say no.

https://support.intelrealsense.com/hc/en-us/community/posts/360036372154-T265-Tracking-Accuracy

@MyOtherNamesWereTaken
Author

Thanks, I'll have to find a different method to make this work.

Is there any sort of implementation in the realsense library for visual odometry? I couldn't find anything related to Python

@MartyG-RealSense
Collaborator

There is support for visual odometry in the RealSense SDK but it is for the T265.

https://www.intelrealsense.com/visual-inertial-tracking-case-study/

A T265 can be paired with a 400 Series camera to provide Visual SLAM navigation by the T265 whilst the 400 Series camera provides depth sensing, as the T265 does not have built-in depth sensing.

https://www.intelrealsense.com/depth-and-tracking-combined-get-started/

@MyOtherNamesWereTaken
Author

So the only way to achieve visual odometry with my D435i is by pairing it with a T265?

@MartyG-RealSense
Collaborator

The commercial software package SLAMcore is compatible with D435i and D455.

https://www.slamcore.com/spatial-intelligence-sdk
https://www.youtube.com/watch?v=TqgwCGrqGAM

A RealSense user performed visual odometry SLAM with a D435i and RTABMAP in the YouTube video below. Others have used RTABMAP with D435i for visual odometry too.

https://www.youtube.com/watch?v=0M_Hc0pdgcY

My understanding is that T265 is the preferred method though due to the noisiness of the D435i IMU data.

@MartyG-RealSense
Collaborator

Hi @MyOtherNamesWereTaken Do you require further assistance with this case, please? Thanks!

@JACKLiuDay

Hi @MyOtherNamesWereTaken Do you require further assistance with this case, please? Thanks!

Hi, I need your help. When I run rs-imu-calibration, it asks me: "Would you like to write the results to the camera? (Y/N)"
I chose Y, but it shows: "Writing calibration to device."
Then: "Error: failed to set power state."

@MartyG-RealSense
Collaborator

Hi @JACKLiuDay As you received a solution at #10587 (comment) do you still need help with the above question please?
