How to get rotation matrix or angles #7907
Hi @ori30ffs
https://www.intelrealsense.com/how-to-getting-imu-data-from-d435i-and-t265/
You could alternatively make use of the SDK's rs-convert tool in Windows to extract frames from a bag in a range of formats. If you are not able to compile it from its C++ source, the Windows version of the RealSense SDK provides a pre-built executable version of the tool called rs-convert.exe in the SDK Tools folder. Further information about launching and controlling rs-convert can be found in the link below. https://support.intelrealsense.com/hc/en-us/community/posts/360051132573/comments/360014080493
Hello @MartyG-RealSense
I expected similar values in each frame, but the X and Z angles fluctuate strongly. Can you help me understand the physics of it?
I looked carefully at your problem with X and Z and the script that you used. Could you try using a lower value for alpha in this line, please: alpha = 0.98. A high alpha weights the calculations towards the gyro but can cause drift. Reducing alpha weights the calculations towards the accelerometer but makes the output more sensitive to noise. As the camera is static, though, sensitivity may not matter so much in this case.
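As an illustration of the alpha trade-off described above, here is a minimal, self-contained sketch of a complementary filter. The sensor readings are hypothetical (a static camera tilted about 10 degrees), not taken from the script in question:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle with the accelerometer tilt estimate.

    A high alpha trusts the gyro (smooth, but drifts over time);
    a low alpha trusts the accelerometer (drift-free, but noisy).
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hypothetical readings: camera static, tilted ~10 degrees, 200 samples at 100 Hz.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle,
                                 gyro_rate=0.0,                 # rad/s, no rotation
                                 accel_angle=math.radians(10.0),
                                 dt=0.01)
print(round(math.degrees(angle), 2))  # converges towards 10 degrees
```

Note that with alpha = 0.98 the estimate still needs many samples to converge on the accelerometer value, which is why a lower alpha reacts faster for a static camera.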
I tried it. It has no effect.
I recall a case about the RealSense T265 Tracking Camera (which uses the same IMU component) where its values were drifting at start-up even when stationary, and it was advised to give the IMU enough initial motion to help it to understand its local position. That may not be an option that you can easily test in your case though if the D435i is rigidly mounted instead of being on a tripod. It may be worth calibrating your IMU with the Python-based IMU calibration tool, which Dorodnic the RealSense SDK Manager has confirmed is compatible with D455.
Okay, I'll calibrate it. Please note, we have this problem with two cameras.
The RGB and left and right sensors are calibrated in the factory. Originally, the 400 Series IMU was not calibrated in the factory and had to be user-calibrated after purchase with the Python tool. I have heard differing information about whether the IMU is now factory-calibrated too or not. Nevertheless, doing your own IMU calibration is a good course of action if you are not satisfied with the values being provided by it.
Hi @ori30ffs Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received.
Hello everybody. I want to change the reference frame: from the camera's frame to a fixed frame, such as the room, for example. PL
Hi @PierreLouisT Please do provide further information about what you are trying to achieve. The IMU and its acceleration and gravity values have no effect on how a depth pointcloud behaves. If the camera is rotated then whatever direction the front of the camera is facing in will be the Z axis that represents depth / distance. Are you hoping to have another axis such as Y represent distance (the direction that the camera is facing) instead? Thanks very much for your patience!
Hi Marty! Absolutely, that is what I need to do. I tried to apply a rotation to project the point cloud into another reference frame, without success. The main goal is to reconstruct a 3D image from multiple cameras at one location, each oriented differently from the others. From the multiple cameras I wish to get one point cloud in one reference frame.
@PierreLouisT If you are able to use ROS then Intel have a guide for stitching together pointclouds from multiple RealSense cameras facing in different directions, both with cameras attached to a single computer or two computers. https://www.intelrealsense.com/how-to-multiple-camera-setup-with-ros/ Alternatively, a RealSense ROS user at IntelRealSense/realsense-ros#1176 (comment) also suggested an edit to the RealSense ROS wrapper's code to change how the IMU axes are represented. As mentioned above though, the IMU would not have an influence on point clouds. If your project requires the use of Python to align multiple clouds then the RealSense SDK's rs2_transform_point_to_point instruction is worth investigating. More information about this can be found at #7853 (comment)
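As a rough illustration of what aligning clouds into one frame involves (independent of rs2_transform_point_to_point, whose rotation and translation come from an SDK extrinsics struct), here is a minimal sketch that maps points from one camera's frame into a common reference frame using a rigid transform you would obtain from your own calibration. The camera poses below are hypothetical:

```python
import math

def transform_points(points, R, t):
    """Apply a rigid transform p' = R @ p + t to each 3D point.

    R is a 3x3 rotation matrix (nested lists) and t a translation vector,
    both describing the source camera's pose in the target reference frame.
    """
    out = []
    for x, y, z in points:
        out.append(tuple(R[i][0] * x + R[i][1] * y + R[i][2] * z + t[i]
                         for i in range(3)))
    return out

# Hypothetical setup: camera B faces 90 degrees left of camera A and sits
# 1 m to its right; map one of B's points (2 m in front of B) into A's frame.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R_b_to_a = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]   # yaw of +90 degrees
t_b_to_a = [1.0, 0.0, 0.0]
merged = transform_points([(0.0, 0.0, 2.0)], R_b_to_a, t_b_to_a)
print(merged)  # approximately (3.0, 0.0, 0.0) in A's frame
```

Once every camera's points are expressed in the same frame, concatenating the transformed lists yields the single combined cloud.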
Hi! Is there any way for the L515 to report the roll, yaw and pitch angles of the camera relative to its initial position using C++?
Hi @FeiSong123 The RealSense SDK's C++ IMU example program rs-motion, which works with L515, may be a useful reference for you. https://github.com/IntelRealSense/librealsense/tree/master/examples/motion Lines of the rs-motion.cpp code for retrieving roll, pitch and yaw values are highlighted here:
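For reference, the accelerometer part of the approach rs-motion takes can be sketched in a few lines: when the camera is static, the measured acceleration is essentially gravity, so pitch and roll follow from its direction, while yaw is unobservable from gravity alone (which is why rs-motion integrates the gyro for that axis). The axis convention and readings below are assumptions for illustration; check your camera model's IMU axis layout:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a gravity measurement.

    Assumes the accelerometer reading is dominated by gravity (camera
    static or moving slowly). Yaw cannot be recovered from gravity alone.
    """
    pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Hypothetical reading: gravity entirely along -Y, as for one possible
# level mounting of the camera (the actual convention may differ).
pitch, roll = tilt_from_accel(0.0, -9.81, 0.0)
print(math.degrees(pitch), math.degrees(roll))
```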
Issue Description
Hello,
I was trying to get rotation information from the gyro stream. I need it to rotate my point cloud.
I referred to this page: rs-motion. But unfortunately I don't understand how the gyro values change.
In my experiments I rotated the camera around one axis, but all 3 values changed.
Now I'm using a static camera, but I still need to find its rotation.
Q1: What are the gyro stream values? Are they raw data?
Q2: How can I convert these values to angles or a rotation matrix for a static camera?
Q3: Are the frames from the camera and the IMU synchronized?
Q4: Is it possible to read all frames from a .bag (not realtime reading)?
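On Q1 and Q2: the gyro stream reports angular velocity (radians per second, per axis), so a single sample is a rate, not an angle; integrating each sample over its time step accumulates an angle, and a rotation matrix can then be built from the roll, pitch and yaw angles. A minimal sketch, with hypothetical sample data and an assumed ZYX Euler convention (match the convention to your camera's IMU axes):

```python
import math

def rotation_from_euler(roll, pitch, yaw):
    """Build a 3x3 rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll).

    Angles in radians. The ZYX convention here is an assumption for
    illustration, not necessarily what a given SDK sample uses.
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# Hypothetical gyro data: 100 samples at 200 Hz of a steady 0.5 rad/s
# yaw rate. Integrating rate * dt accumulates the angle.
yaw = 0.0
for _ in range(100):
    yaw += 0.5 * (1.0 / 200.0)   # angle += angular_velocity * dt
R = rotation_from_euler(0.0, 0.0, yaw)
print(round(yaw, 3))             # accumulated yaw in radians
```

Pure gyro integration drifts over time, which is why the complementary-filter approach discussed earlier in this thread blends it with the accelerometer's gravity direction.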