For this Made with MRTK project, I decided to revisit my love of space exploration and bring the educational experience into VR. I created my very own interactive Apollo 11 exhibit with 3D models courtesy of the Smithsonian 3D Digitization project and NASA.
- Unity 2020.3.37f1
- Meta Quest or Meta Quest 2
- Quest Link cable (to view on device via Unity in Play mode)
A demo of the project is available on YouTube. Check out the video here: https://youtu.be/H5a1kmKHmoo
Disclaimer: I have two disclaimers, folks. First, you must create an Azure Speech Service to use the transcription feature that accompanies the Neil Armstrong and Command Module exhibits. New to Azure? No worries! You can sign up for a free account by visiting: https://azure.microsoft.com/free/
Second, this experience has only been optimized for the Quest Link workflow (aka connecting the Meta Quest to your computer and pressing Play in Unity). Should you decide to deploy the experience to your device, you'll be responsible for modifying the project to optimize it for the device.
With the business out of the way, let's view the sample!
- Ensure that you have Android Build Support and its corresponding modules installed for Unity 2020.3.37f1.
- Clone or download this repository and add the project to the Unity Hub.
- Open the project in Unity.
- The MRTK Project Configurator window will appear. Click Next. On the following screen, click Apply, and on the screen after that, click Next.
- On the next screen of the MRTK Project Configurator, click Import TMP Essentials. Then, on the next screen, click Done.
- In the Project panel, navigate to Assets/Scenes and open the Exhibit scene.
- Enable Developer Mode for your Meta Quest or Meta Quest 2. To do so, follow steps 1-7 within Connect Headset over USB. DO NOT BUILD AND RUN.

  If your device is already in Developer Mode, connect it to your computer via the Quest Link cable, put on the headset, and click Allow when prompted to allow access to data.

  After your device is connected to the computer and in Developer Mode, open the Quest Link app on the device. You'll likely be prompted to open Quest Link after allowing access to data in the prior step.
- Within Unity, in the Hierarchy, select the Exhibit-Descriptions GameObject.
- In the Inspector, add your Speech Service Subscription Key and Speech Service Region for the Azure Speech resource mentioned in the disclaimer. You can find both values in the Keys and Endpoint section of your Speech resource in the Azure Portal. If you don't intend to use the transcription feature of the experience (pressing the red button to generate an audio clip playback of the text on the wall), you can skip this step. For a sense of how these two values get used, see the first sketch after these steps.
- Press Play. The Meta Quest/Meta Quest 2 will start the experience within Quest Link. If the Quest Link app is not already open, the device will display a pop-up link to the app for you to open.

  You can teleport around the scene and grab the gloves that rest against the wall. The red button on the wall can be pressed as well; the second sketch below shows how such a button press can be wired up. You can use either the Quest controllers or your hands (if hand tracking is enabled for your device).
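Curious how the Subscription Key and Region actually get consumed? Below is a minimal sketch of text-to-speech with the Azure Speech SDK. This is not the project's actual script: the class name (ExhibitSpeech), field names, and method are illustrative stand-ins, though SpeechConfig.FromSubscription, SpeechSynthesizer, and SpeakTextAsync are real Speech SDK APIs.

```csharp
using Microsoft.CognitiveServices.Speech;
using UnityEngine;

// Illustrative MonoBehaviour, not the project's actual script.
public class ExhibitSpeech : MonoBehaviour
{
    // Set these in the Inspector, as described in the step above.
    [SerializeField] private string speechServiceSubscriptionKey;
    [SerializeField] private string speechServiceRegion;

    // Called (for example) by the red button to read exhibit text aloud.
    public async void SpeakText(string exhibitText)
    {
        // SpeechConfig.FromSubscription is where the key and region are consumed.
        var config = SpeechConfig.FromSubscription(speechServiceSubscriptionKey, speechServiceRegion);

        using (var synthesizer = new SpeechSynthesizer(config))
        {
            var result = await synthesizer.SpeakTextAsync(exhibitText);
            if (result.Reason != ResultReason.SynthesizingAudioCompleted)
            {
                Debug.LogWarning($"Speech synthesis did not complete: {result.Reason}");
            }
        }
    }
}
```

By default, SpeechSynthesizer plays the synthesized audio through the default speaker, which is why no AudioSource is wired up in this sketch.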
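As for the red button, MRTK 2.x buttons expose press events you can subscribe to from script. Here's a hedged sketch using MRTK's Interactable component and its OnClick event (both real MRTK APIs); the RedButtonHandler class and its hookup to the ExhibitSpeech sketch above are my own illustration, not necessarily how this project is wired.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative handler, not necessarily how this project is wired.
public class RedButtonHandler : MonoBehaviour
{
    // Hypothetical references for this sketch.
    [SerializeField] private Interactable buttonInteractable;
    [SerializeField] private ExhibitSpeech exhibitSpeech;
    [TextArea] [SerializeField] private string wallText;

    private void OnEnable()
    {
        // Interactable.OnClick is MRTK's UnityEvent fired on press/select.
        buttonInteractable.OnClick.AddListener(OnButtonPressed);
    }

    private void OnDisable()
    {
        buttonInteractable.OnClick.RemoveListener(OnButtonPressed);
    }

    private void OnButtonPressed()
    {
        // Kick off transcription playback for the exhibit text.
        exhibitSpeech.SpeakText(wallText);
    }
}
```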
Have a question or issue trying the sample? Submit an issue to the repo!