Gyro values keep moving? #10371
Hi @MyOtherNamesWereTaken Have you performed a calibration of the IMU using Intel's Python calibration tool at the link below and then saved that calibration to storage inside the camera, please? https://github.com/IntelRealSense/librealsense/tree/master/tools/rs-imu-calibration |
This is what I'm getting when I try to run it: waiting for realsense device... Error: (25, 'Inappropriate ioctl for device') Process finished with exit code 0 |
That is the problem that you reported a week ago at #10356 when asking about accel. Did you try an RSUSB build of librealsense as suggested in that case, please? |
How would I go about doing that? I am a little clueless with this. Also, how do I check whether I'm using that method? |
When you built librealsense the last time, did you build it from source code using CMake and build the Python wrapper at the same time by including the CMake build term -DBUILD_PYTHON_BINDINGS:bool=true? If so, then you can build librealsense in RSUSB mode by adding the term -DFORCE_RSUSB_BACKEND=true to the CMake build instruction. For example: cmake ../ -DFORCE_RSUSB_BACKEND=true -DBUILD_PYTHON_BINDINGS:bool=true -DCMAKE_BUILD_TYPE=release -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=true. When building with RSUSB, you do not need to apply a patch to the kernel before you build librealsense. In regard to checking whether you were previously using RSUSB: if you built librealsense from source code on a PC and did not use -DFORCE_RSUSB_BACKEND=true, then the build was not using RSUSB. |
I just used import pyrealsense2 as rs? |
On Linux the Python pyrealsense2 wrapper has to be built by some method (pip install pyrealsense2, building it from source code on its own, or building librealsense and the wrapper together from source code at the same time). A pyrealsense2 script should not work if the wrapper has not been built. For example, if librealsense was built from packages instead of source code then it should not have pyrealsense2 included. Can you describe the method that you use to install librealsense, please? |
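If it helps, here is a minimal sketch of how to check from Python whether the pyrealsense2 wrapper is importable at all before debugging further. The `module_available` helper name is illustrative; the version lookup is guarded because older wrapper builds may not expose a `__version__` attribute:

```python
# Minimal check: is the pyrealsense2 wrapper importable in this environment?
import importlib.util

def module_available(name):
    """Return True if `name` can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

if module_available("pyrealsense2"):
    import pyrealsense2 as rs
    print("pyrealsense2 version:", getattr(rs, "__version__", "unknown"))
else:
    print("pyrealsense2 is not installed in this environment")
```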
I've been told it was built by someone prior to me on Linux with pip install pyrealsense2, if that helps? |
That is helpful, yes. The easiest way to perform a librealsense build that has pyrealsense2 support may therefore be:
https://github.com/IntelRealSense/librealsense/blob/master/doc/libuvc_installation.md
|
So I did steps 1-5 from the link you sent me and then did the pip install, but it still gives me the same error. |
Okay, thanks for testing. So the librealsense SDK is installing correctly but the problem is occurring when installing the pyrealsense2 wrapper afterwards? |
As in, after trying to run the calibration tool I'm still getting the same error as I did before. Did I do something wrong in the process? Sorry for being this clueless. |
Don't worry, it's no trouble at all. If you installed librealsense using the libuvc_installation.md instructions then your librealsense build should now be based on RSUSB. However, you are still experiencing the Inappropriate ioctl for device error. This would suggest that the problem is not related to the kernel, as RSUSB should be bypassing the kernel. I have not used this calibration tool myself. My interpretation of the instructions for running the calibration script is as follows though:
https://github.com/IntelRealSense/librealsense/releases/tag/v2.50.0
librealsense-2.50.0 > tools > rs-imu-calibration In this folder should be the script file rs-imu-calibration.py
python rs-imu-calibration.py Is the above procedure similar to what you have been doing, please? |
Thank you so much! It actually opens and does stuff now. Now I just go by the readme file included in the rs-imu-calibration folder, right? |
Great news that you made significant progress! Yes, you can use that readme. In my opinion though, the PDF version of the guide with more detailed instructions provides a clearer explanation of each step of the calibration process. You can find it at page 16 onwards of the PDF. https://dev.intelrealsense.com/docs/imu-calibration-tool-for-intel-realsense-depth-camera |
Thank you so much. I managed to calibrate it thanks to you. Am I supposed to save the calibration files in a certain spot? Or just in the folder the .py is in? |
Once all stages of calibration have been completed and the tool asks if you want to write the calibration, say Yes to store it inside the camera hardware (not on the computer). Otherwise the calibration will be lost when you close the tool and you will have to repeat the calibration. |
So I did save it in the camera folder, however my acceleration is still measured at 9.41 and, when comparing the results to those without calibration, it seems like nothing has changed. Am I supposed to move the build folder into my projects folder where my Python code is? |
I believe that you should use the instruction RS2_OPTION_ENABLE_MOTION_CORRECTION to enable motion data correction now that you have stored an IMU calibration inside the camera hardware. The RealSense SDK's rs_imu_calibration.py script provides an example of setting the status of this option in Python, though you should set the option to '1' (enabled) instead of '0' (disabled). |
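As a hedged sketch of the advice above (not the SDK's official recipe): the option could be enabled on whichever sensors support it. `enable_motion_correction` is an illustrative helper name; the snippet assumes pyrealsense2 is installed and a RealSense device is connected, and defers the import so the function can be defined without the SDK present:

```python
def enable_motion_correction(device):
    """Illustrative sketch: set RS2_OPTION_ENABLE_MOTION_CORRECTION to 1
    (enabled) on every sensor of `device` that supports it. Assumes
    pyrealsense2 is installed and `device` is a connected rs.device;
    the import is deferred so this can be read without the SDK."""
    import pyrealsense2 as rs
    for sensor in device.query_sensors():
        if sensor.supports(rs.option.enable_motion_correction):
            sensor.set_option(rs.option.enable_motion_correction, 1)
```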
That fixed the issue, thanks! Do you happen to know how accurate the gyro & accelerometer are on this device? Never mind, it seems that when putting it on different axes it measures different values for the acceleration, varying between 9.4 and 10.1 m/s². Just to clarify: the calibration data was supposed to be saved to disk, i.e. entering nothing when the terminal asked for a footer? |
When rs_imu_calibration.py asks if you want to store the IMU calibration at the end of the calibration process and you select Yes, it is saved to an internal permanent storage space inside the camera and not to a file on the computer. If the calibration is not saved to the camera then it will be lost when the calibration program is closed or you perform another calibration. |
I just recalibrated it, however I'm still facing these issues. I also enabled motion correction, which did help. |
IMU data can be noisy and sensitive. It is possible to write code to weight the data towards the gyro instead of the accelerometer to make it less sensitive. There is only a C++ example of this though, in the SDK's rs-motion IMU program. If your problem is with the gyro sensitivity though then you can also do the opposite and weight the data towards the accelerometer. |
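The weighting idea from rs-motion is essentially a complementary filter. A minimal pure-Python sketch is below; the function name and the alpha value are illustrative, not taken from the SDK:

```python
def complementary_filter(theta_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend gyro integration with the accelerometer tilt estimate.
    alpha near 1.0 weights the (smooth but drifting) gyro;
    alpha near 0.0 weights the (noisy but drift-free) accelerometer."""
    return alpha * (theta_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# With a stationary sensor (zero rotation rate), the estimate converges
# towards the accelerometer-derived angle:
theta = 0.0
for _ in range(500):
    theta = complementary_filter(theta, gyro_rate=0.0, accel_angle=0.1, dt=0.01)
```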
Also, when using it like this: context = rs.context(), my IDE says that the motion correction statement seems to have no effect? |
The script does not seem to be telling the camera to access the IMU component via imu_sensor.set_option when setting the motion correction, as the IMU component is separate from the depth sensor inside the camera. If the depth sensor (the default sensor) is not the sensor being accessed, the script therefore needs to specify which of the three supported separate stream sensor components in the D435i (depth, RGB, IMU) to access.
|
It says that self is an unresolved reference; however, when I import it, it says that it can't find the imu_sensor reference. |
Thanks very much for your patience while I conducted further research. It seems that RS2_OPTION_ENABLE_MOTION_CORRECTION is a software setting that tells the script whether or not to correct the motion data and not an option that needs to access the IMU sensor. So your original implementation rs.option.enable_motion_correction, 1 was likely correct. I do apologize. |
What would be the best way to go on about removing gravity from the accelerometer? |
IntelRealSense/realsense-ros#2268 (comment) describes how VINS-Fusion, which some RealSense ROS D435i owners have used, has a parameter called g_norm to adjust the gravity magnitude value. I understand that the robot_localization ROS module used in Intel's D435i SLAM guide for ROS can also provide gravity removal. An example reference for this is at cra-ros-pkg/robot_localization#501 This is a Python case rather than a RealSense ROS one though. You may be able to dampen the effect of the accelerometer's y-gravity by using a Python adaptation of the rs-motion technique mentioned earlier, but weighting the data strongly towards the gyro can result in drift in the data. |
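A very rough sketch of gravity removal in plain Python/NumPy is below. This is not a librealsense feature: it assumes the camera is (near-)static so the measured acceleration itself points along gravity, and it breaks down as soon as the camera accelerates. The helper name is illustrative:

```python
import numpy as np

G = 9.80665  # standard gravity, m/s^2

def remove_gravity(accel, gravity_dir=None):
    """Subtract an estimated gravity vector from a raw accel sample.
    If no gravity direction is supplied, assume the sensor is (near-)static
    so the measured acceleration itself points along gravity. Only a rough
    sketch: it fails whenever the camera itself is accelerating."""
    accel = np.asarray(accel, dtype=float)
    if gravity_dir is None:
        gravity_dir = accel / np.linalg.norm(accel)
    return accel - G * np.asarray(gravity_dir)

# A static sensor reading pure gravity leaves (almost) zero linear acceleration:
residual = remove_gravity([0.0, -9.81, 0.0])
```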
So there's no Python implementation to get a proper rotation matrix? Or any sort of method to filter out the gravity? |
There is not a gravity removal mechanism for the IMU in librealsense. A RealSense team member advises in #6385 (comment) that whilst you may be able to obtain the IMU rotation matrix using the gyro and accel data, the data will be noisy and so the RealSense T265 Tracking Camera model (which is now retired) is better suited to this task. In #4391 (comment) the same RealSense team member advises to a user who created a Python adaptation of the C++ rs-motion program that "in certain situations you can extract orientation roll and pitch from the acceleration values and use them to compensate for the gyro drifts. But this is only possible if (a) you have the IMU unit perfectly calibrated, and (b) the sum of forces applied to the camera equals G (e.g the device's is either completely static or moves with a constant linear velocity)". |
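The "roll and pitch from the acceleration values" idea quoted above can be sketched as follows. It is only valid under the stated conditions (net force on the device ≈ gravity), and the axis convention here is illustrative and may differ from the D435i's actual IMU frame:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a gravity-dominated accel
    sample. Valid only when the device is static or moving at constant
    linear velocity. Axis convention (gravity along +z when level) is
    illustrative, not the D435i's documented IMU frame."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Device level, gravity along +z: both angles come out (near) zero.
r, p = roll_pitch_from_accel(0.0, 0.0, 9.81)
```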
So, in other words, there's no way to accurately (i.e. within +/- 5 cm) track position just with the gyro and accelerometer? |
Even with a specialized T265 Tracking Camera (see the link below) it is unlikely that you would achieve that very fine degree of position tracking accuracy, so for a D435i I would say no. https://support.intelrealsense.com/hc/en-us/community/posts/360036372154-T265-Tracking-Accuracy |
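To illustrate why IMU-only position tracking drifts so badly: position is the double integral of acceleration, so even pure sensor noise accumulates into a growing position error. The toy simulation below uses an illustrative noise level, not a measured D435i specification:

```python
import random

def dead_reckoning_error(noise_std=0.05, dt=0.005, seconds=10.0, seed=0):
    """Toy illustration: double-integrate pure accelerometer noise
    (zero true motion) and return the absolute position error in metres.
    noise_std is an illustrative accel noise level in m/s^2, not a
    measured D435i figure."""
    rng = random.Random(seed)
    v = x = 0.0
    for _ in range(int(seconds / dt)):
        a = rng.gauss(0.0, noise_std)  # noise-only "measurement"
        v += a * dt                    # integrate to velocity
        x += v * dt                    # integrate to position
    return abs(x)

err = dead_reckoning_error()  # typically reaches several centimetres in 10 s
```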
Thanks, I'll have to find a different method to make this work. Is there any sort of implementation in the RealSense library for visual odometry? I couldn't find anything related to Python. |
There is support for visual odometry in the RealSense SDK but it is for the T265. https://www.intelrealsense.com/visual-inertial-tracking-case-study/ A T265 can be paired with a 400 Series camera to provide Visual SLAM navigation by the T265 whilst the 400 Series camera provides depth sensing, as the T265 does not have built-in depth sensing. https://www.intelrealsense.com/depth-and-tracking-combined-get-started/ |
So the only way to achieve visual odometry with my D435i is by pairing it with a T265? |
The commercial software package SLAMcore is compatible with D435i and D455. https://www.slamcore.com/spatial-intelligence-sdk A RealSense user performed visual odometry SLAM with a D435i and RTABMAP in the YouTube video below. Others have used RTABMAP with D435i for visual odometry too. https://www.youtube.com/watch?v=0M_Hc0pdgcY My understanding is that T265 is the preferred method though due to the noisiness of the D435i IMU data. |
Hi @MyOtherNamesWereTaken Do you require further assistance with this case, please? Thanks! |
Hi, I need your help. When I run rs-imu-calibration, it asks me: Would you like to write the results to the camera? (Y/N)
Hi @JACKLiuDay As you received a solution at #10587 (comment) do you still need help with the above question please? |
Camera Model | D435i
Firmware Version | 05.13.00.50
Operating System & Version | Linux (Ubuntu 14/16/17)
Kernel Version (Linux Only) | 5.11.0-38-generic
Platform | PC
Language | Python
Segment | Robot/Smartphone/VR/AR/others
For some reason my gyro value on Y keeps going up while X & Z go down. The camera lies on a flat and steady surface with the Z axis pointing upwards.
import numpy as np
import pyrealsense2 as rs

def gyro_data(m): return np.asarray([m.x, m.y, m.z])   # rad/s
def accel_data(m): return np.asarray([m.x, m.y, m.z])  # m/s^2 (includes gravity)

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.gyro)
config.enable_stream(rs.stream.accel)
pipeline.start(config)
try:
    last_frame = None
    while True:
        f = pipeline.wait_for_frames()
        for frame in f:
            # gather IMU data
            if frame.is_motion_frame():
                if frame.get_profile().stream_type() == rs.stream.gyro:
                    gyro = np.array([gyro_data(frame.as_motion_frame().get_motion_data())]).T
                if frame.get_profile().stream_type() == rs.stream.accel:
                    accelwithg = np.array([accel_data(frame.as_motion_frame().get_motion_data())]).T
finally:
    pipeline.stop()