Functional Testing of Controllers Across Headset Models

tags: [vr-ar]

Functional testing of controllers for different headset models

Touch Pro on Quest Pro, Touch Plus on Quest 3, Touch on Quest 2, Valve Index Knuckles, Reverb G2: each controller has a different set of buttons, axes, triggers, and haptics. An app developed for Quest 2 can unexpectedly lose its grab functionality when running on Quest Pro, because Touch Pro has a different layout and a different haptic API.

Functional controller testing is not "we checked it on one headset and it works". It is matrix coverage of all supported devices, with every case documented.

Input mapping: where things most often break

OpenXR standardizes input via interaction profiles. In Unity, with the XR Interaction Toolkit and the OpenXR Plugin, all controller actions are mapped through an InputActionAsset with several binding profiles: Oculus Touch Controller Profile, Valve Index Controller Profile, HTC Vive Controller Profile, and Windows Mixed Reality Controller Profile.

A common problem: a binding configured for only one profile. A developer adds a binding for the path /user/hand/right/input/trigger/value without specifying the interaction profile, so the binding applies only to the "default" controller and doesn't work on Index Knuckles or WMR controllers.

The correct setup: for each action in the InputActionAsset, add bindings for all supported profiles. The XR Interaction Toolkit provides a Default Input Actions asset with pre-configured multi-profile bindings, which is a good starting point, but it needs an audit for each specific project.
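A binding audit like the one above can be automated. Below is a minimal sketch, assuming the action-to-profile mapping has already been exported into a plain data structure (the `missing_bindings` helper and the sample action names are illustrative, not part of any Unity API):

```python
# Sketch: verify that every input action has a binding for every
# supported interaction profile. Profile names are taken from the
# article; the data layout is an assumption for illustration.
SUPPORTED_PROFILES = {
    "Oculus Touch Controller Profile",
    "Valve Index Controller Profile",
    "HTC Vive Controller Profile",
    "Windows Mixed Reality Controller Profile",
}

# action -> set of profiles it currently has bindings for
bindings = {
    "Select": set(SUPPORTED_PROFILES),             # fully covered
    "Grab":   {"Oculus Touch Controller Profile"}, # missing three profiles
}

def missing_bindings(bindings, profiles):
    """Return sorted (action, profile) pairs that have no binding."""
    return sorted(
        (action, profile)
        for action, bound in bindings.items()
        for profile in profiles - bound
    )

for action, profile in missing_bindings(bindings, SUPPORTED_PROFILES):
    print(f"MISSING: {action} has no binding for {profile}")
```

Run as part of the build pipeline, this turns "forgot the Index profile" from a device-on-head discovery into a failed check.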

Meta Quest Pro specifics: Touch Pro controllers have additional sensors (a stylus pointer, capacitive touch on the face buttons). If the app uses OVRInput directly instead of OpenXR actions, Touch Pro needs to be handled explicitly as a separate controller type; otherwise some buttons return wrong values.

Hand Tracking as alternative input

Quest 2, Quest 3, and Quest Pro support Hand Tracking without controllers. If the app claims hand-tracking support, it must be tested separately: Hand Tracking has a different input pipeline and different limitations.

The XR Hands package (com.unity.xr.hands) provides XRHandSubsystem with data for 26 joints per hand. Gestures are implemented via gesture components or custom logic. A known problem: the pinch gesture, the primary interaction without a controller, fires with a delay and produces false positives on fast finger movements.

Test cases for Hand Tracking differ from controller ones: check detection correctness in poor lighting, with overlapping hands, and during fast gestures. Document the pinch false-positive rate during normal finger activity (typing or scratching your nose shouldn't trigger game actions).
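One common mitigation for pinch false positives is a frame-count debounce: the pinch condition must hold for several consecutive frames before the gesture fires. The sketch below shows the idea in plain Python; the threshold, frame count, and `PinchDetector` class are illustrative assumptions, not part of the XR Hands API:

```python
# Sketch: debounce pinch detection to reduce false positives.
# Tuning values below are assumptions for illustration.
PINCH_DISTANCE_M = 0.02   # thumb-index tip distance treated as "pinched"
CONFIRM_FRAMES = 5        # consecutive frames the condition must hold

class PinchDetector:
    def __init__(self, threshold=PINCH_DISTANCE_M, confirm=CONFIRM_FRAMES):
        self.threshold = threshold
        self.confirm = confirm
        self._held = 0  # consecutive frames below threshold

    def update(self, tip_distance: float) -> bool:
        """Feed one frame's thumb-index distance; True once pinch is confirmed."""
        if tip_distance < self.threshold:
            self._held += 1
        else:
            self._held = 0  # any open frame resets the streak
        return self._held >= self.confirm

det = PinchDetector()
# A one-frame spike (fingers crossing quickly) does not trigger:
assert not any(det.update(d) for d in [0.01, 0.05, 0.05])
# A sustained pinch does:
assert any(det.update(0.01) for _ in range(5))
```

The same structure works for measuring the false-positive rate in a test session: log every frame where `_held` starts a streak that never reaches `confirm`.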

Controller testing matrix

Functional testing is conducted as a matrix: devices × test cases. A minimum matrix for a Quest-first project:

| Test case | Quest 2 | Quest 3 | Quest Pro | Index |
| --- | --- | --- | --- | --- |
| Trigger — grab object | | | | |
| Grip — hold | | | | |
| A/B/X/Y buttons | | | | |
| Thumbstick locomotion | | | | |
| Haptic feedback on interaction | | | | |
| Hand Tracking — pinch select | | | | |
| Edge cases (dead controller battery) | | | | |

Each cell holds Pass / Fail / Not Applicable, plus a description for every Fail. This is a living document, updated with each build.
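Keeping the matrix as data rather than a static table makes per-build reporting trivial. A minimal sketch, assuming the cells are stored as (case, device) keys with a status and note (the sample entries mirror the article's examples and are illustrative):

```python
# Sketch: the controller test matrix as data, so each build can emit
# a status summary and a list of Fail cases. Entries are illustrative.
matrix = {
    ("Trigger — grab object", "Quest 2"):       ("pass", ""),
    ("Trigger — grab object", "Quest Pro"):     ("fail", "grab lost: Touch Pro layout"),
    ("Hand Tracking — pinch select", "Index"):  ("n/a", "no hand tracking"),
}

def report(matrix):
    """Return (status counts, list of fail cases with notes)."""
    counts, fails = {}, []
    for (case, device), (status, note) in matrix.items():
        counts[status] = counts.get(status, 0) + 1
        if status == "fail":
            fails.append((case, device, note))
    return counts, fails

counts, fails = report(matrix)
for case, device, note in fails:
    print(f"FAIL [{device}] {case}: {note}")
```

The Fail list, sorted by priority, is exactly the deliverable described at the end of this page.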

Haptics: testing tactile feedback

Touch Pro and Touch Plus use the TruTouch haptic system: more precise vibration with amplitude and frequency control. Touch on Quest 2 offers basic vibration with a single intensity parameter.

The APIs differ: OVRHaptics for native Meta, XRBaseController.SendHapticImpulse(amplitude, duration) for OpenXR. On Touch Pro, the Meta XR SDK exposes OVRInput.SetControllerVibration with extended parameters. Using only the basic OpenXR haptic API, TruTouch on Pro controllers behaves like ordinary vibration, without its hardware advantages.
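The device differences above suggest a single haptic routing layer: game code requests one haptic event, and a per-device capability table decides which parameters actually get sent. A minimal sketch of that idea (the capability table, device keys, and `haptic_params` helper are assumptions for illustration, not any SDK's API):

```python
# Sketch: route one game haptic event to device-appropriate parameters.
# Per the article, Touch Pro / Touch Plus support frequency control,
# while Quest 2 Touch has a single intensity. Table is illustrative.
HAPTIC_CAPS = {
    "quest2":   {"frequency": False},
    "quest3":   {"frequency": True},
    "questpro": {"frequency": True},
}

def haptic_params(device: str, amplitude: float, frequency_hz: float) -> dict:
    """Return the parameter set to send for this device."""
    if HAPTIC_CAPS[device]["frequency"]:
        return {"amplitude": amplitude, "frequency_hz": frequency_hz}
    # Fallback for single-intensity hardware: frequency is dropped
    return {"amplitude": amplitude}

print(haptic_params("quest2", 0.8, 160.0))
print(haptic_params("questpro", 0.8, 160.0))
```

A layer like this also makes the haptics test below mechanical: the same event must produce a non-empty parameter set on every supported device.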

The test: compare haptics on Quest 2 and Quest 3/Pro for the same game events. A sword hit with different intensity? Expected. Haptics completely absent on one device? A bug; fix it.

Timeline and work format

Functional controller testing requires physical access to the devices. If the client has no test headsets, we discuss using our devices or renting them.

| Testing scope | Estimated timeline |
| --- | --- |
| One headset, basic matrix | 2–4 days |
| 2–3 headset models, full matrix | 1–2 weeks |
| Full multi-platform testing | 2–4 weeks |

The deliverable is a test matrix with documented Fail cases, priorities, and fix recommendations. Cost is calculated after we receive the list of supported devices and the functionality scope.