Device Setup
Device settings can be uniformly applied across all scenes that require user motion tracking for interaction.
Meta Quest Pro/3
In the Hierarchy (e.g., in the MyClient and MyRoom scenes), delete the existing Main Camera and EventSystem objects.
Add the XR Origin Hands (XR Rig) prefab to the scene, and set its position to the origin (0, 0, 0) before proceeding.
Navigate to:
Assets/Samples/XR Interaction Toolkit/[version]/Hands Interaction Demo/Prefabs
Note:
Make sure the Hands Interaction Demo sample has been imported via Package Manager > XR Interaction Toolkit.
To estimate the device's height using sensor data, set the Tracking Origin Mode to "Floor" in the XR Origin component of the XR Origin Hands (XR Rig).
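If you prefer to set this from a script rather than in the Inspector, the tracking origin can be requested at startup. A minimal sketch, assuming the script is attached to the XR Origin Hands (XR Rig) GameObject; the `XROrigin` type and its `RequestedTrackingOriginMode` property come from Unity's XR Core Utilities package:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Requests the Floor tracking origin at startup so the device's height
// above the floor is estimated from sensor data.
public class FloorOriginSetup : MonoBehaviour
{
    void Awake()
    {
        var origin = GetComponent<XROrigin>();
        if (origin != null)
            origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Floor;
    }
}
```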
Create an XR Interaction Manager object as a child of XR Origin Hands (XR Rig):
Go to
GameObject > XR > Interaction Manager
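If your project scripts its scene setup instead of using the menu, the same object can be created at load time. A minimal sketch mirroring the manual step above; the parenting and object name are assumptions, while `XRInteractionManager` is the XR Interaction Toolkit class the menu item adds:

```csharp
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine;

// Creates an XR Interaction Manager as a child of the rig this script
// is attached to, mirroring GameObject > XR > Interaction Manager.
public class InteractionManagerSetup : MonoBehaviour
{
    void Awake()
    {
        var managerObject = new GameObject("XR Interaction Manager");
        managerObject.transform.SetParent(transform, false);
        managerObject.AddComponent<XRInteractionManager>();
    }
}
```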
Create an Event System object:
Go to
GameObject > XR > UI Event System
Set Layer to Default.
Register input actions as needed, as shown below.

XREAL
In the Hierarchy, delete the default Main Camera and EventSystem objects.
Add the XR Interaction Hands Setup prefab to the scene, and set its position to the origin (0, 0, 0) before proceeding.
From
Packages/com.xreal.xr/Runtime/Prefabs/
The XREAL SDK does not support measuring device height. If your scenes require it, you will need to implement this functionality yourself.
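One common workaround is manual calibration: have the user supply their eye height (for example via UI) and offset the rig so the floor sits at y = 0. The sketch below is illustrative only, not an XREAL SDK API; `XROrigin`, its `Camera` property, and `CameraFloorOffsetObject` come from Unity's XR Core Utilities package, and the calibration flow is an assumption:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Illustrative manual height calibration: offsets the XR Origin's camera
// floor offset so the tracked camera sits at the user's eye height above y = 0.
public class ManualHeightCalibration : MonoBehaviour
{
    [SerializeField] XROrigin xrOrigin;          // the XR Interaction Hands Setup rig
    [SerializeField] float userEyeHeight = 1.6f; // meters, entered by the user

    public void Calibrate()
    {
        float currentCameraY = xrOrigin.Camera.transform.position.y;
        xrOrigin.CameraFloorOffsetObject.transform.position +=
            Vector3.up * (userEyeHeight - currentCameraY);
    }
}
```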
Configure hand tracking input actions:
Select the input action asset located at:
Assets/Samples/XR Interaction Toolkit/[version]/Starter Assets/XRI Default Input Actions.inputactions
In the top menu, go to XREAL > Setup Hand Tracking.
In the Inspector, click the Assign Project-wide Input Actions button.
Set the input source to use hand tracking:
Navigate to:
Edit > Project Settings > XR Plug-in Management > XREAL
Set Input Source to Hands.

Apple Vision Pro
Add the XRI_SimpleRig prefab to the scene:
From:
Assets/Samples/XR Interaction Toolkit/[version]/visionOS/Prefabs/XRI_SimpleRig.prefab

Note: Ensure that the visionOS sample has been imported from the XR Interaction Toolkit via the Package Manager.
Create a Volume Camera in the scene:
Create an empty GameObject in the scene, and add the VolumeCamera component (Packages/com.unity.polyspatial/Runtime/Public/VolumeCamera.cs) to it.
Set the VolumeWindowConfiguration property on the VolumeCamera component as needed.
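The configuration can also be assigned from a script. A minimal sketch: the `WindowConfiguration` property name is an assumption based on recent PolySpatial versions, so verify it against your installed package; the configuration asset itself is assigned in the Inspector:

```csharp
using Unity.PolySpatial;
using UnityEngine;

// Assigns a bounded volume camera configuration at runtime.
// The WindowConfiguration property name is an assumption; check your
// installed PolySpatial version's VolumeCamera API.
public class VolumeCameraSetup : MonoBehaviour
{
    [SerializeField] VolumeCameraWindowConfiguration boundedConfiguration;

    void Awake()
    {
        var volumeCamera = GetComponent<VolumeCamera>();
        volumeCamera.WindowConfiguration = boundedConfiguration;
    }
}
```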
Note:
Canvas-based UI is best used with Bounded_VolumeCameraConfiguration. To use this configuration, you must install the Unity PolySpatial Samples package included with PolySpatial. The demo project (XRCollabDemo) comes with the required configuration pre-installed.
Switch the build target to visionOS:
Go to:
File > Build Profiles
Select visionOS, then switch to it.
Select the XR plug-in provider:
Go to:
Edit > Project Settings > XR Plug-in Management
Select the visionOS (or visionPro) tab.
Check the Apple visionOS checkbox.

Select RealityKit with PolySpatial:
Go to:
Edit > Project Settings > XR Plug-in Management > Apple visionOS
In the App Mode dropdown, select RealityKit with PolySpatial.