Godot version: 3.0-stable, custom build from 9bd4026
OS/device including version: Gentoo Linux 4.9.72, KDE Plasma 5.10.5
GPU model and drivers: NVidia GTX 1080 w/ proprietary drivers 387.22
Background:
For AR/VR projects, UI has to be in world space. The way to achieve this as demonstrated by the example projects is to render a viewport onto a plane and to translate raycast-based world space input into viewport coordinates.
As described in the InputEvent documentation:

4. If no one wanted the event so far, and a Camera is assigned to the Viewport, a ray into the physics world (in the ray direction from the click) will be cast. If this ray hits an object, it will call the CollisionObject._input_event() function of the relevant physics object (bodies receive this callback by default, but areas do not; this can be configured through Area properties).
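The forwarding step itself is not part of the engine; the example projects implement it in script. A minimal sketch of the idea, assuming a centered quad of known world-space size and a `viewport_path` export (both names are illustrative, not from the original project):

```gdscript
# Hypothetical sketch (Godot 3.0 API): forward a world-space ray hit on the
# UI quad into the Viewport that renders the UI.
extends Area

export(NodePath) var viewport_path
var quad_size = Vector2(2.0, 2.0)  # assumed world-space size of the UI plane

func _input_event(camera, event, click_position, click_normal, shape_idx):
    var viewport = get_node(viewport_path)
    # Convert the world-space hit point into the plane's local space.
    var local_point = to_local(click_position)
    # Map local XY (centered quad) to normalized [0, 1] UV coordinates.
    var uv = Vector2(
        local_point.x / quad_size.x + 0.5,
        0.5 - local_point.y / quad_size.y)
    # Scale UV to viewport pixel coordinates and retarget the event.
    if event is InputEventMouse:
        event.position = uv * viewport.size
        viewport.input(event)
```

Because `_input_event()` only fires when the engine delivers a new input event, nothing in this path runs when the camera or the plane moves under a stationary cursor, which is what the issue below is about.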
Issue description:
The CollisionShape's _area_input signal for mouse and touch input is emitted only when the mouse/touch point moves, but not when movement of the camera or viewport plane causes the mouse/touch point to end up at another location on the viewport.
Use cases in which this may happen could include:
UI is faded in before the camera stops moving from a cutscene
The UI plane moves in response to mouse hover events over the UI (think rollouts on screen borders such as the tablet in FNAF)
The UI is used in VR, where the camera is never static and the user is likely to do head movements even while interacting with the UI
I believe the position of a CollisionShape hit by the mouse/touch point should be re-checked once per frame, and a new _area_input signal emitted if the raycast makes contact at a different position (or even hits a different collider altogether).

Minimal reproduction project: Moving_3D_UI.zip (version 3)

The Area node in the reproduction project has a property 'CameraOrPlaneCanMove'. Turn it off to see the issue. Turn it on to generate synthetic mouse events (which leads to another issue: button press events are not forwarded correctly).
I've updated the reproduction project. The example can now generate synthetic InputEventMouseMotions and send them to the Viewport from _process() to keep the mouse position up to date while the UI plane moves around.
This emulates the suggested behavior, but at the same time breaks the button press event for unknown reasons.
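The workaround described above can be sketched as follows. This is an illustration under assumed names (`camera`, `ui_viewport`, and a `world_to_viewport()` helper doing the same UV mapping as the forwarding code), not the exact code from the reproduction project:

```gdscript
# Hypothetical sketch (Godot 3.0 API): re-cast the mouse ray every frame and
# feed a synthetic InputEventMouseMotion to the UI Viewport, so hover state
# stays correct while the camera or the plane moves under a stationary cursor.
func _process(delta):
    var mouse_pos = get_viewport().get_mouse_position()
    var from = camera.project_ray_origin(mouse_pos)
    var dir = camera.project_ray_normal(mouse_pos)
    var hit = get_world().direct_space_state.intersect_ray(from, from + dir * 100.0)
    if hit and hit.collider == self:
        var motion = InputEventMouseMotion.new()
        # Assumed helper: maps a world-space hit point to viewport pixels.
        motion.position = world_to_viewport(hit.position)
        ui_viewport.input(motion)
```

Injecting motion events this way updates hover highlighting, but as noted it interferes with button press events, presumably because the synthetic motions do not carry the pressed-button state of the original click.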
This should work now without any hacks. Just forward the event as soon as the mouse moves and it will work out of the box. No need to re-send events every frame or anything similar.
Everything works except mouse capture. If you click a button and hold the mouse down, the button is shown in its "down" state only while the cursor is over the screen position the button occupied at the time of pressing, even if the button has since moved to a completely different screen position.
Apart from that, an amazing improvement and an early Christmas present. Thank you!