Device Setup
The device setup below can be applied uniformly to every scene that requires user motion tracking for interaction.
Meta Quest Pro/3
In the Hierarchy, delete the existing Main Camera and EventSystem objects.
Add the XR Origin Hands (XR Rig) prefab to the scene.
Navigate to:
Assets/Samples/XR Interaction Toolkit/[version]/Hands Interaction Demo/Prefabs
Note:
Make sure the Hands Interaction Demo sample has been imported via Package Manager > XR Interaction Toolkit.
To estimate the device's height using sensor data, set the Tracking Origin Mode to "Floor" in the XR Origin component of the XR Origin Hands (XR Rig).
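If you prefer to enforce this setting from code (for example in a scene bootstrap script), the same option can be set on the XROrigin component at runtime. A minimal sketch, assuming the scene contains a single XROrigin (the XR Origin Hands (XR Rig) prefab added above); the class name EnsureFloorTrackingOrigin is hypothetical:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Minimal sketch: force the rig to use the Floor tracking origin at startup.
// Assumes a single XROrigin (the XR Origin Hands (XR Rig) prefab) exists in the scene.
public class EnsureFloorTrackingOrigin : MonoBehaviour
{
    void Start()
    {
        var origin = FindObjectOfType<XROrigin>();
        if (origin == null)
        {
            Debug.LogWarning("No XROrigin found in the scene.");
            return;
        }

        // Equivalent to setting Tracking Origin Mode to "Floor" in the Inspector.
        origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Floor;
    }
}
```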
Create an XR Interaction Manager object as a child of XR Origin Hands (XR Rig):
Go to
GameObject > XR > Interaction Manager
Create an Event System object:
Go to
GameObject > XR > UI Event System
Set Layer to
Default
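The Interaction Manager and UI Event System objects above can also be created from a script rather than the GameObject > XR menu. A minimal sketch using the XR Interaction Toolkit's XRInteractionManager and XRUIInputModule components; the helper name XRSceneBootstrap is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Minimal sketch: scripted equivalent of the GameObject > XR menu steps above.
public static class XRSceneBootstrap
{
    public static void CreateManagers(Transform rig)
    {
        // XR Interaction Manager as a child of XR Origin Hands (XR Rig).
        var manager = new GameObject("XR Interaction Manager", typeof(XRInteractionManager));
        manager.transform.SetParent(rig, false);

        // UI Event System with the XR UI input module, on the Default layer.
        var eventSystem = new GameObject("EventSystem", typeof(EventSystem), typeof(XRUIInputModule));
        eventSystem.layer = LayerMask.NameToLayer("Default");
    }
}
```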
Register input actions as needed; a scripted sketch follows below.
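Which actions to register depends on your interactions; a common pattern is to enable action maps from the XRI Default Input Actions asset at startup. A minimal sketch, assuming that asset (from the Starter Assets sample) is assigned in the Inspector; the map names below match the default asset but may differ in your version, and the class name RegisterXRIInputActions is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: enable the head/hand action maps at startup.
// Assign the XRI Default Input Actions asset (Starter Assets sample) in the Inspector.
public class RegisterXRIInputActions : MonoBehaviour
{
    [SerializeField] InputActionAsset xriInputActions;

    void OnEnable()
    {
        // Map names taken from the default XRI asset; adjust to your project.
        foreach (var mapName in new[] { "XRI Head", "XRI LeftHand", "XRI RightHand" })
        {
            var map = xriInputActions.FindActionMap(mapName, throwIfNotFound: false);
            map?.Enable();
        }
    }

    void OnDisable()
    {
        if (xriInputActions != null)
            xriInputActions.Disable();
    }
}
```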

XREAL
In the Hierarchy, delete the default Main Camera and EventSystem objects.
Add the XR Interaction Hands Setup prefab to the scene:
From
Packages/com.xreal.xr/Runtime/Prefabs/
In the top menu, go to
XREAL > Setup Hand Tracking
Configure hand tracking input actions:
Select the input action asset located at:
Assets/Samples/XR Interaction Toolkit/[version]/Starter Assets/XRI Default Input Actions.inputactions
In the Inspector, click the Assign Project-wide Input Actions button
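To verify at runtime that the project-wide actions were assigned, the Input System exposes them through InputSystem.actions (Input System 1.8 or newer). A minimal sketch; the class name CheckProjectWideActions is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: confirm that project-wide input actions are assigned,
// i.e. the "Assign Project-wide Input Actions" step was completed.
public class CheckProjectWideActions : MonoBehaviour
{
    void Start()
    {
        var actions = InputSystem.actions;  // Requires Input System 1.8+.
        if (actions == null)
            Debug.LogWarning("No project-wide input actions assigned; hand tracking input will not work.");
        else
            Debug.Log($"Project-wide input actions: {actions.name}");
    }
}
```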
Set the input source to use hand tracking:
Navigate to:
Edit > Project Settings > XR Plug-in Management > XREAL
Set Input Source to Hands
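After switching the Input Source to Hands, you can check in Play mode whether a hand-tracking subsystem is actually running; the same check works on Quest as well. A minimal sketch, assuming the XR Hands package (com.unity.xr.hands) is installed in the project; the class name HandTrackingCheck is hypothetical:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: log whether an XRHandSubsystem is available and running.
// Assumes the XR Hands package (com.unity.xr.hands) is installed.
public class HandTrackingCheck : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        if (subsystems.Count == 0)
            Debug.LogWarning("No XRHandSubsystem found; check the Input Source setting.");
        else
            Debug.Log($"Hand tracking subsystem running: {subsystems[0].running}");
    }
}
```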

Apple Vision Pro
Add the XRI_SimpleRig prefab to the scene:
From:
Assets/Samples/XR Interaction Toolkit/[version]/visionOS/Prefabs/XRI_SimpleRig.prefab

Note: Ensure that the visionOS sample has been imported from the XR Interaction Toolkit via the Package Manager.
Create a Volume Camera in the scene:
Create an empty GameObject in the scene and add the VolumeCamera component (Packages/com.unity.polyspatial/Runtime/Public/VolumeCamera.cs) to it. Set the VolumeWindowConfiguration property on the VolumeCamera component as needed; see the sketch after the note below.
Note:
Canvas-based UI is best used with Bounded_VolumeCameraConfiguration. To use this configuration, you must install the Unity PolySpatial Samples package included with PolySpatial. The demo project (XRCollabDemo) comes with the required configuration pre-installed.
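The Volume Camera step above can also be done from a script. A minimal sketch, assuming the Unity.PolySpatial namespace from the PolySpatial package; the window configuration asset (e.g. Bounded_VolumeCameraConfiguration) is still assigned on the component in the Inspector, since the exact property name can vary between PolySpatial versions, and the class name VolumeCameraSetup is hypothetical:

```csharp
using Unity.PolySpatial;
using UnityEngine;

// Minimal sketch: create an empty GameObject and attach the VolumeCamera component,
// mirroring the manual step above. Assign the volume window configuration asset
// (e.g. Bounded_VolumeCameraConfiguration) on the component in the Inspector.
public class VolumeCameraSetup : MonoBehaviour
{
    void Awake()
    {
        var go = new GameObject("Volume Camera");
        go.AddComponent<VolumeCamera>();
    }
}
```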
Switch the build target to visionOS:
Go to:
File > Build Profiles
Select visionOS, then click Switch Platform.
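The platform switch can also be scripted, for example for CI or batch setup. A minimal sketch using the standard editor API, assuming a Unity version that exposes the visionOS build target and has visionOS build support installed; the menu path and class name are hypothetical:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Minimal sketch: editor-script equivalent of switching the active build target to visionOS.
// Requires a Unity version with visionOS build support installed.
public static class SwitchToVisionOS
{
    [MenuItem("Tools/Switch Build Target to visionOS")]
    static void Switch()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.VisionOS, BuildTarget.VisionOS);
    }
}
#endif
```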
Select the XR plug-in provider:
Go to:
Edit > Project Settings > XR Plug-in Management
Select the visionOS (or visionPro) tab.
Check the Apple visionOS checkbox.

Select the RealityKit with PolySpatial app mode:
Go to:
Edit > Project Settings > XR Plug-in Management > Apple visionOS
In the App Mode dropdown, select RealityKit with PolySpatial.