Device Setup

The following device settings apply uniformly to every scene that uses user motion tracking for interaction.

Meta Quest Pro/3

  1. In the Hierarchy, delete the existing Main Camera and EventSystem objects.

  2. Add the XR Origin Hands (XR Rig) prefab to the scene.

    1. Navigate to: Assets/Samples/XR Interaction Toolkit/[version]/Hands Interaction Demo/Prefabs

Note:

  • Make sure the Hands Interaction Demo sample has been imported via Package Manager > XR Interaction Toolkit.

  • To estimate the device's height using sensor data, set the Tracking Origin Mode to "Floor" in the XR Origin component of the XR Origin Hands (XR Rig).
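The Tracking Origin Mode can also be verified or enforced from a script. Below is a minimal sketch using the XROrigin API from com.unity.xr.core-utils (the component already present on the XR Origin Hands (XR Rig) prefab); the script itself is a hypothetical helper, not part of the toolkit:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Forces the tracking origin to Floor mode so the rig estimates
// the user's height from sensor data.
public class EnsureFloorTracking : MonoBehaviour
{
    void Awake()
    {
        // XROrigin sits on the root of the XR Origin Hands (XR Rig) prefab.
        var origin = GetComponent<XROrigin>();
        if (origin != null &&
            origin.RequestedTrackingOriginMode != XROrigin.TrackingOriginMode.Floor)
        {
            origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Floor;
        }
    }
}
```

Attach this to the rig root if you want the Floor setting applied regardless of how the prefab was configured in the Inspector.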

  3. Create an XR Interaction Manager object as a child of XR Origin Hands (XR Rig):

    1. Go to GameObject > XR > Interaction Manager

  4. Create an Event System object:

    1. Go to GameObject > XR > UI Event System

    2. Set Layer to Default

    3. Register the required input actions on the UI Event System.
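A quick runtime sanity check for the objects created in the steps above can catch a missing manager or input module early. This helper is hypothetical; the component types (XRInteractionManager, XRUIInputModule) are from the XR Interaction Toolkit:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Logs a warning if the scene is missing the pieces created in steps 3-4.
public class SceneSetupCheck : MonoBehaviour
{
    void Start()
    {
        if (FindObjectOfType<XRInteractionManager>() == null)
            Debug.LogWarning("No XR Interaction Manager found in the scene.");

        var eventSystem = EventSystem.current;
        if (eventSystem == null || eventSystem.GetComponent<XRUIInputModule>() == null)
            Debug.LogWarning("UI Event System with XRUIInputModule not found.");
    }
}
```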


XREAL

  1. In the Hierarchy, delete the default Main Camera and EventSystem objects.

  2. Add the XR Interaction Hands Setup prefab to the scene:

    1. From Packages/com.xreal.xr/Runtime/Prefabs/

  3. In the top menu, go to XREAL > Setup Hand Tracking

  4. Configure hand tracking input actions:

    1. Select the input action asset located at: Assets/Samples/XR Interaction Toolkit/[version]/Starter Assets/XRI Default Input Actions.inputactions

    2. In the Inspector, click the Assign Project-wide Input Actions button

  5. Set the input source to use hand tracking:

    1. Navigate to: Edit > Project Settings > XR Plug-in Management > XREAL

    2. Set Input Source to Hands
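To confirm that hand tracking is actually delivering data once Input Source is set to Hands, you can query the XRHandSubsystem from com.unity.xr.hands. This is a hedged sketch: it assumes the XREAL provider exposes hand tracking through this standard subsystem, and the script itself is a hypothetical helper:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Reports whether a hand-tracking subsystem is present and running,
// i.e. whether the Hands input source is supplying data.
public class HandTrackingStatus : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        if (subsystems.Count == 0)
            Debug.LogWarning("No XRHandSubsystem found; check that Input Source is set to Hands.");
        else
            Debug.Log($"Hand tracking subsystem running: {subsystems[0].running}");
    }
}
```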


Apple Vision Pro

  1. Add the XRI_SimpleRig prefab to the scene:

    1. From: Assets/Samples/XR Interaction Toolkit/[version]/visionOS/Prefabs/XRI_SimpleRig.prefab

Note: Ensure that the visionOS sample has been imported from the XR Interaction Toolkit via the Package Manager.

  2. Create a Volume Camera in the scene:

    1. Create an empty GameObject in the scene, and add the VolumeCamera component (Packages/com.unity.polyspatial/Runtime/Public/VolumeCamera.cs) to it.

    2. Set the VolumeWindowConfiguration property on the VolumeCamera component as needed.

    Note:

    Canvas-based UI is best used with Bounded_VolumeCameraConfiguration. To use this configuration, you must install the Unity PolySpatial Samples package included with PolySpatial. The demo project (XRCollabDemo) comes with the required configuration pre-installed.
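The Volume Camera step can also be scripted. The sketch below assumes PolySpatial's Unity.PolySpatial namespace and a WindowConfiguration property on the VolumeCamera component; both names may differ between PolySpatial versions, so treat them as assumptions and verify against the package source:

```csharp
using Unity.PolySpatial;
using UnityEngine;

// Creates a Volume Camera at runtime and assigns a window configuration.
public class VolumeCameraBootstrap : MonoBehaviour
{
    // Assign a VolumeCameraWindowConfiguration asset (e.g. Bounded) in the Inspector.
    [SerializeField] VolumeCameraWindowConfiguration windowConfiguration;

    void Awake()
    {
        var go = new GameObject("Volume Camera");
        var volumeCamera = go.AddComponent<VolumeCamera>();
        volumeCamera.WindowConfiguration = windowConfiguration; // assumed property name
    }
}
```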

  3. Switch the build target to visionOS:

    1. Go to: File > Build Profiles

    2. Select the visionOS platform, then click Switch Platform.
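The platform switch can also be triggered from an editor script, which is convenient for build automation. A sketch using UnityEditor's build-settings API (BuildTarget.VisionOS is available in recent Unity versions; the menu path is a hypothetical choice):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor utility that switches the active build target to visionOS.
public static class SwitchToVisionOS
{
    [MenuItem("Tools/Switch Build Target to visionOS")]
    static void Switch()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.VisionOS, BuildTarget.VisionOS);
    }
}
#endif
```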

  4. Select the XR plug-in provider:

    1. Go to: Edit > Project Settings > XR Plug-in Management

    2. Select the visionOS (or visionPro) tab.

    3. Check the Apple visionOS checkbox.

  5. Set the App Mode to RealityKit with PolySpatial:

    1. Go to: Edit > Project Settings > XR Plug-in Management > Apple visionOS

    2. In the App Mode dropdown, select RealityKit with PolySpatial.
