Oculus Lipsync GDExtension #9718
Replies: 4 comments
-
It'd be really cool to have this functionality! And now that we support face tracking, I could imagine this being implemented such that it provides data that is compatible with face tracking. However, I don't know how hard it would be to convert the visemes from this API to the blend shapes used by the Unified Expressions standard (which is what …
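On that conversion question: OVRLipSync outputs weights for 15 fixed visemes, so one plausible route is a static lookup table from each viseme to one or more XRFaceTracker blend shapes. A minimal GDScript sketch, assuming Godot 4.3's XRFaceTracker; the specific constants and weights below are illustrative guesses, not a verified mapping, so check them against the BlendShapeEntry class reference:

```gdscript
extends Node

# Hypothetical mapping: the XRFaceTracker constants and weights chosen here are
# illustrative assumptions, not a verified conversion table.
const VISEME_TO_BLEND_SHAPES := {
	"sil": [[XRFaceTracker.FT_MOUTH_CLOSED, 1.0]],
	"aa":  [[XRFaceTracker.FT_JAW_OPEN, 0.7]],
	"oh":  [[XRFaceTracker.FT_JAW_OPEN, 0.4], [XRFaceTracker.FT_LIP_FUNNEL, 0.6]],
	"ou":  [[XRFaceTracker.FT_LIP_PUCKER, 0.8]],
	# ...the remaining 11 OVRLipSync visemes (PP, FF, TH, DD, kk, CH, SS, nn, RR, E, ih)
}

var face_tracker := XRFaceTracker.new()

func apply_visemes(viseme_weights: Dictionary) -> void:
	# viseme_weights is e.g. {"sil": 0.6, "aa": 0.3, ...} for one audio frame.
	for viseme in viseme_weights:
		for entry in VISEME_TO_BLEND_SHAPES.get(viseme, []):
			face_tracker.set_blend_shape(entry[0], entry[1] * viseme_weights[viseme])
```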
-
There's a clue about what's used in Neos/Resonite in Neos-Metaverse/NeosPublic#2761, regarding the problem of the lack of a Linux version, and there's also a request on the Meta forums for it to be open sourced: https://communityforums.atmeta.com/t5/Unity-VR-Development/OVR-LipSync-Open-Source/m-p/1156537#M23711
-
@dsnopek, I'm guessing you mean that the lipsync library should expose its data as face-tracked data? That does make a lot of sense and would mean an easy, platform-agnostic way to switch between full face tracking and face tracking inferred through techniques like lipsyncing. We do need to think of a way to merge face-tracked data from different sources: we might get lip motion from a lipsync extension while getting eye motion from eye tracking (there are plenty of headsets that do eye tracking only). @goatchurchprime, it would be great if a solution like this could be open sourced, as that would make it available to many more platforms, but I don't see Meta doing that any time soon. Turning this into a GDExtension sounds like a good solution, especially with David's suggestion. I do love how the help page states …
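On merging sources, one simple approach is a small node that owns a combined XRFaceTracker and copies mouth-related entries from a lipsync-driven tracker and everything else from an eye/face tracker. A rough sketch, again assuming Godot 4.3's XRFaceTracker API; which entries count as "mouth" is an assumption here and a real merge would enumerate them explicitly:

```gdscript
extends Node

@export var eye_source: XRFaceTracker   # e.g. driven by eye-tracking hardware
@export var lip_source: XRFaceTracker   # e.g. driven by a lipsync extension

var combined := XRFaceTracker.new()

# Illustrative subset; a real merge would list every mouth/jaw entry.
const MOUTH_SHAPES := [XRFaceTracker.FT_JAW_OPEN, XRFaceTracker.FT_MOUTH_CLOSED]

func _process(_delta: float) -> void:
	for shape in XRFaceTracker.FT_MAX:
		var source := lip_source if shape in MOUTH_SHAPES else eye_source
		if source:
			combined.set_blend_shape(shape, source.get_blend_shape(shape))
```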
-
I got it working on Windows in my GDExtension: https://github.com/goatchurchprime/two-voip-godot-4/tree/ovrlipsync?tab=readme-ov-file#with-ovrlipsync
Short demo video here: https://github.com/godotengine/godot-proposals/assets/677254/7a66456b-5e52-47c2-bed7-a83cc51a2024
I failed to get the Android version working (problems with deploying two .so files, one linking to the other, from a single GDExtension, as well as the libOVRLipSyncShim.a archive appearing to be corrupt). Attempts to decompile the DLL using retdec for the purpose of recompiling it under Linux ran out of RAM, and in any case the retdec developers say this sort of thing is impossible (why do they call it retargeting, then?).
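For the two-.so deployment problem, it might be worth trying the [dependencies] section that newer .gdextension files support (Godot 4.2+), which copies extra shared libraries alongside the main one on export; whether that also fixes the runtime linking between the two libraries on Android is untested here. The paths below are hypothetical:

```
[libraries]
android.arm64 = "res://addons/twovoip/libtwovoip.android.arm64.so"

[dependencies]
android.arm64 = { "res://addons/twovoip/libOVRLipSync.so": "" }
```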
-
This is a unique and very widely deployed library by Oculus, with an API dating back to 2015, that transforms an audio stream into a series of visemes which can be used to animate the mouth of an avatar in real time. API here: OVRLipSync.h.txt
See https://developer.oculus.com/downloads/package/oculus-lipsync-native/ for downloads
It comes precompiled for Unity and UE, but we don't have a working precompiled version for Godot on all platforms.
Because it is so widely used and effectively standardized, almost all avatar creation systems come with shape-keys to match its output visemes.
Given the age of this library and Meta's support for Godot on the Quest VR platform, maybe it's time this code was released as open source, or else built into an easy-to-use GDExtension asset that plugs into an audio bus.
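As a sketch of what "plugs into an audio bus" could look like from the user's side: Godot's built-in AudioEffectCapture can tap a bus (for example a microphone bus), and the captured frames can be handed to the extension each tick. Only AudioEffectCapture and the AudioServer calls below are real Godot API; the OVRLipSyncContext class and its process_frame() method are hypothetical stand-ins for whatever the GDExtension would expose:

```gdscript
extends Node

var capture: AudioEffectCapture
var lipsync := OVRLipSyncContext.new()  # hypothetical class from the GDExtension

func _ready() -> void:
	# Assumes a "Mic" bus exists with an AudioEffectCapture as its first effect.
	var bus := AudioServer.get_bus_index("Mic")
	capture = AudioServer.get_bus_effect(bus, 0) as AudioEffectCapture

func _process(_delta: float) -> void:
	var frames := capture.get_frames_available()
	if frames > 0:
		var stereo := capture.get_buffer(frames)      # PackedVector2Array of stereo samples
		var visemes := lipsync.process_frame(stereo)  # hypothetical: returns per-viseme weights
		# Drive an XRFaceTracker or mesh blend shapes from `visemes` here.
```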