Cross-Platform Support Across Various XR Devices

Our video game development company runs independent projects, jointly creates games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms, allowing us to develop a product that matches the customer's vision and players' preferences.

Cross-Platform XR Device Support

Writing a VR game for Quest 3 is one task. Running the same game on Pico 4, HTC Vive XR Elite, and PC via SteamVR without rewriting the interactions is a completely different one. Differences in controller APIs, hand-tracking systems, display resolutions, and GPU performance make cross-platform support one of the most labor-intensive tasks in XR development.

Where cross-platform breaks in practice

The most painful point is input mapping. Meta Quest uses OVRInput with buttons such as PrimaryButton, SecondaryButton, and PrimaryIndexTrigger. SteamVR (OpenVR) works through its Action System with an actions.json file. OpenXR unifies this via the Unity Input System with an InputActionAsset, but specific button paths still differ: /user/hand/left/input/trigger/value is relatively standard in OpenXR, yet grip and pose on different HMDs have different physical meanings and position offsets.
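As a sketch of how one logical action can absorb those differences, the Unity Input System lets several binding paths feed the same InputAction; the action name and fallback binding below are illustrative, not a shipped API:

```csharp
// Sketch, assuming the Unity Input System package plus the OpenXR plugin
// are installed. One logical "Grab" action; each binding path is resolved
// (or ignored) by the active runtime, so game code never branches on device.
using UnityEngine.InputSystem;

public static class GrabActionFactory
{
    public static InputAction Create()
    {
        var grab = new InputAction("Grab", InputActionType.Button);

        // Generic XR controller layout path, mapped by the OpenXR plugin.
        grab.AddBinding("<XRController>{RightHand}/gripPressed");

        // Keyboard fallback, handy inside the XR Device Simulator.
        grab.AddBinding("<Keyboard>/g");

        return grab;
    }
}
```

Game logic then subscribes to `grab.performed` once; the binding set, not the code, is what varies per platform.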

The second case is hand tracking. Quest tracks hands via the OVR Hand API with joints such as OVRSkeleton.BoneId.Hand_Index1. Pico uses its own SDK with different naming and a different joint hierarchy. XR Hands (the com.unity.xr.hands package) solves this through the XRHand abstraction with XRHandJointID, but device support depends on the package version and the native provider.
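A minimal sketch of reading a joint through that abstraction, assuming com.unity.xr.hands plus a native provider (Meta or Pico) is active in the project:

```csharp
// Sketch: reading a fingertip pose via the XR Hands abstraction,
// independent of which vendor SDK supplies the data underneath.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class IndexTipReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Start()
    {
        // Grab whichever hand-tracking provider the platform registered.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0) m_Subsystem = subsystems[0];
    }

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running) return;

        XRHand hand = m_Subsystem.leftHand;
        if (!hand.isTracked) return;

        // Same joint ID on every device; no OVRSkeleton or Pico-specific names.
        XRHandJoint tip = hand.GetJoint(XRHandJointID.IndexTip);
        if (tip.TryGetPose(out Pose pose))
            Debug.Log($"Left index tip: {pose.position}");
    }
}
```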

The third case is spatial anchors and Scene Understanding. The Meta Spatial Anchors API, ARKit anchors, and the OpenXR spatial-entity extension (XR_EXT_spatial_entity) are three different APIs at different levels of maturity. If an application saves positions in the real world, the architecture needs an anchor-API abstraction from the start; otherwise the later refactoring takes weeks.

Architectural approach to XR cross-platform

The foundation is OpenXR plus Unity XR Plugin Management. OpenXR covers Quest (via Meta OpenXR), Pico (PICO OpenXR), SteamVR, and Windows Mixed Reality. Each platform connects its own OpenXR feature package, but the interaction logic stays common.

Stack:

  • XR Interaction Toolkit: high-level interaction components (XRGrabInteractable, XRRayInteractor, XRDirectInteractor)
  • Unity Input System with an InputActionAsset: unified input mapping, with separate bindings per platform
  • XR Hands: if hand tracking is needed
  • AR Foundation: for AR features on mobile devices
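
This stack maps onto a Packages/manifest.json along these lines; the package IDs are the real Unity package names, but the version numbers here are illustrative and should be pinned per project:

```json
{
  "dependencies": {
    "com.unity.xr.management": "4.4.1",
    "com.unity.xr.openxr": "1.10.0",
    "com.unity.xr.interaction.toolkit": "2.5.2",
    "com.unity.inputsystem": "1.7.0",
    "com.unity.xr.hands": "1.4.0",
    "com.unity.xr.arfoundation": "5.1.0"
  }
}
```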

The key pattern is a Device Abstraction Layer: wrap all native SDK calls in interfaces (IHandTrackingProvider, IAnchorService, IHapticFeedback). This allows plugging in platform implementations via DI without changing game logic.
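A sketch of that layer, using the IHapticFeedback interface named above; the class names and the DI wiring are illustrative, not a shipped API:

```csharp
// Sketch of the Device Abstraction Layer idea. Game code depends only on
// the interface; a DI container (or a simple factory) picks the
// implementation per platform at startup.
using UnityEngine.XR.Interaction.Toolkit;

public interface IHapticFeedback
{
    // amplitude 0..1, duration in seconds
    void Pulse(float amplitude, float duration);
}

// Standard path: works on any OpenXR runtime via the XR Interaction Toolkit.
public class OpenXRHaptics : IHapticFeedback
{
    readonly XRBaseController m_Controller;

    public OpenXRHaptics(XRBaseController controller) => m_Controller = controller;

    public void Pulse(float amplitude, float duration) =>
        m_Controller.SendHapticImpulse(amplitude, duration);
}

// Gameplay consumer: never touches OVRInput, PicoXR, or OpenXR directly.
public class GrabFeedback
{
    readonly IHapticFeedback m_Haptics;

    public GrabFeedback(IHapticFeedback haptics) => m_Haptics = haptics;

    public void OnGrab() => m_Haptics.Pulse(0.6f, 0.1f);
}
```

Swapping Pico or Meta-specific haptics in later means adding one class, not touching gameplay code.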

A concrete case: in a project supporting Quest 2/3 and Pico 4, a haptic-feedback problem arose. The Meta OVR SDK supports vibration amplitude and frequency (OVRInput.SetControllerVibration(frequency, amplitude)), but the standard OpenXR path via XRControllerWithRumble lacked the needed precision on Meta hardware. We solved it with feature detection: on startup, check for the Meta-specific OpenXR extension XR_FB_haptic_amplitude_envelope; if it is available, use the native path, otherwise fall back to standard OpenXR.
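The startup check itself is small; OpenXRRuntime.IsExtensionEnabled is part of Unity's OpenXR plugin, while the helper class here is only a sketch:

```csharp
// Sketch of the feature-detection step described above: returns true only
// when the runtime negotiated the Meta-specific haptics extension, so the
// caller can choose the native path over the standard OpenXR one.
using UnityEngine.XR.OpenXR;

public static class HapticsBootstrap
{
    public static bool MetaEnvelopeAvailable()
    {
        return OpenXRRuntime.IsExtensionEnabled("XR_FB_haptic_amplitude_envelope");
    }
}
```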

Testing on multiple devices

Full testing requires physical devices, but a significant portion of iteration can be covered with:

  • XR Device Simulator in Unity: basic logic checks without an HMD
  • Link/Air Link for Quest: PC mode for fast iteration cycles
  • OpenXR runtime switching: toggling between the SteamVR and Oculus runtimes on PC to compare behavior

The test matrix captures: HMD OS version, OpenXR runtime version, tracking mode (6DoF/3DoF), with/without hand tracking, and performance (FPS, thermals, GPU time).

Work stages

1. Target platform analysis. Define the device list and required features (hand tracking, passthrough, spatial anchors, multiplayer). This determines the SDK choice and the scope of work.

2. Audit of the existing project (if a codebase exists). Find platform-dependent calls and estimate the refactoring volume.

3. Abstraction architecture. Design the Device Abstraction Layer; choose specific packages and versions.

4. Implementation and Input Actions setup. Build mappings for each platform and test them in the XR Device Simulator.

5. Hardware testing. Run on each target HMD and profile GPU/CPU.

| Task scope | Estimated timeline |
| --- | --- |
| Porting a Quest project to Pico (inputs only) | 1–2 weeks |
| Quest + Pico + SteamVR support from scratch | 1–2 months |
| Complete XR platform with hand tracking and anchors | 2–4 months |

The cost is determined individually after a project audit and agreement on the target device list.