Create a more convenient method of providing base station geometry. #2
Comments
The real Vive controllers and headset do this automatically: they have a lot of sensors and know the exact geometry of all of them, so deriving the positions of the base stations should be quite easy for them. Room calibration is primarily used to figure out the ground plane and play area, I suppose. The official devices can determine the base stations' relative positions without calibration or pre-set values. I think we could do the same if the quad has multiple sensors with pre-calibrated positions.
"it must be quite easy to derive the positions of the base stations." Sure, if you use the solve PnP algorithm. otherwise it's a pain to do it from mathematical first-principals. Once the first stage "lock on" of the base station (light houses) has been acquired, the algorithm changes to just locating the device relative to the lighthouse and the previously defined "ground" plane reference during room setup. |
I'd be interested in easing the process by which we determine the positions and direction matrices. I got the OpenVR project working and am able to inspect m_rmat4DevicePose as well. The data from there (a float[16]) didn't match the format of our lightsources[2] (see the screenshot below). I was thinking I could build a little Windows app that outputs the correct lighthouse calibration information (direction and position). I agree the EEPROM is a nice way to go; I'll look into something there as well. Thanks for all your work and documentation.
[image: watched-lighthouse-calibration] (https://cloud.githubusercontent.com/assets/314758/21022799/c5138470-bd32-11e6-87e3-a4fede3574bf.png)
Yeah, as you can see, the matrix there is 4x4 with the last column being (0, 0, 0, 1). For a lightsource I'm using the top-left 3x3 sub-matrix (rotation) and the 1x3 position, which is the last row of the 4x4 matrix above.
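In code, that unpacking could look like the sketch below. It assumes the pose arrives as a flat float[16] with the translation in elements 12..14, matching the layout described above; the LightSource struct here is illustrative, not the project's actual type.

```cpp
// Sketch: extracting the rotation and position described above from a
// 4x4 pose matrix stored as a flat float[16]. Indexing assumes the
// translation sits in the last row (elements 12..14), as described in
// the comment above.
struct LightSource {
    float rotation[3][3];  // top-left 3x3 sub-matrix
    float position[3];     // last row of the 4x4 matrix
};

LightSource from_pose_matrix(const float m[16]) {
    LightSource ls;
    for (int row = 0; row < 3; row++)
        for (int col = 0; col < 3; col++)
            ls.rotation[row][col] = m[row * 4 + col];
    for (int i = 0; i < 3; i++)
        ls.position[i] = m[12 + i];
    return ls;
}
```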
Super, this worked great. Thanks! All the HTC devices are in there. It's fascinating getting the positioning matrices for all the devices. I wonder what it takes to build your own device into the system?
I updated the project to allow runtime configuration (with the data stored in EEPROM) and created a small app to easily get the base station geometry in the needed format (https://github.com/ashtuchkin/vive-diy-position-sensor-geometry-getter). Will update the docs soon.
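For reference, persisting a configuration struct in EEPROM on a Teensy can be done with the Arduino EEPROM library's put/get; the struct layout below is hypothetical, not the project's actual format.

```cpp
// Sketch: storing base station geometry in EEPROM on a Teensy using the
// Arduino EEPROM library. The struct layout is illustrative only.
#include <Arduino.h>
#include <EEPROM.h>

struct BaseStationGeometry {
    float rotation[9];  // 3x3 rotation matrix, row-major
    float position[3];  // position in meters
};

const int kGeometryAddr = 0;  // EEPROM offset for the geometry block

void save_geometry(const BaseStationGeometry &g) {
    EEPROM.put(kGeometryAddr, g);  // writes the raw bytes of the struct
}

bool load_geometry(BaseStationGeometry &g) {
    EEPROM.get(kGeometryAddr, g);  // reads the raw bytes back
    // Erased EEPROM reads as 0xFF bytes, which decode to NaN floats,
    // so this is a crude "has it ever been written" check.
    return !isnan(g.position[0]);
}
```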
Base station positions and direction matrices are hardcoded in geometry.cpp (lightsources). I'm currently using an OpenVR hello world sample on my main machine to get them (by inspecting m_rmat4DevicePose in debug mode). This is obviously not sustainable.
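A hardcoded table of this kind might look roughly like the following; the field names and values are illustrative, not copied from the actual geometry.cpp.

```cpp
// Illustrative sketch of hardcoded base station geometry, in the spirit
// of the lightsources array in geometry.cpp. Names and numbers are made
// up; see the repository for the real definition.
struct LightSource {
    float origin[3];     // base station position, meters
    float direction[9];  // 3x3 rotation matrix, row-major
};

static const LightSource lightsources[2] = {
    {{-1.5f, 2.2f,  1.0f},
     { 1, 0, 0,
       0, 1, 0,
       0, 0, 1}},
    {{ 1.5f, 2.2f, -1.0f},
     {-1, 0, 0,
       0, 1, 0,
       0, 0, -1}},
};
```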
Subtasks: