Game Interface Layout for Spatial Interaction

tags: [vr-ar]

UI in VR is not a screen the user sees in front of them; it is a set of objects in space that the user interacts with using hands or rays. The familiar Screen Space Canvas won't work here. A World Space Canvas is the starting point, but that is exactly where the complications begin.

A standard mistake: take the 2D game UI, switch the Canvas to World Space, and place it in front of the player. The result is a panel that is either too close (at 30 cm it is hard to read in VR), too far away, or doesn't respond to controllers at all.

Distance and angular size of elements

In VR everything is measured in angular degrees, not pixels or meters. The comfortable reading zone is 1.5–2 meters from the user's eyes. At this distance, text must occupy at least 0.5° of the field of view for confident reading — roughly 1.7 cm of character height at 2 m.

Buttons need to be even larger for confident ray interaction: a minimum physical size of 3×3 cm at 2 meters, which is about 0.86°. UI elements that look fine on a monitor become unusable in VR.
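The conversions between angular and physical size are plain trigonometry. A minimal sketch (the class and method names are illustrative, not from any library):

```csharp
using System;

static class AngularSize
{
    // Physical size (meters) an element must have to span 'degrees'
    // of the field of view at 'distance' meters from the eyes.
    public static double SizeFor(double degrees, double distance) =>
        2.0 * distance * Math.Tan(degrees * Math.PI / 180.0 / 2.0);

    // Angular size (degrees) of an element of 'size' meters at 'distance' meters.
    public static double DegreesFor(double size, double distance) =>
        2.0 * Math.Atan(size / (2.0 * distance)) * 180.0 / Math.PI;
}

// AngularSize.SizeFor(0.5, 2.0)    ≈ 0.0175 m (~1.7 cm of text height)
// AngularSize.DegreesFor(0.03, 2)  ≈ 0.86°   (the 3 cm button from above)
```

These two formulas are how the numbers in this section were derived.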

The interface should sit in the "comfort zone": within ±35° of the gaze direction horizontally and ±15° vertically. A panel off to the side or below forces constant head rotation and tires the user quickly.
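One way to enforce the comfort zone is a "lazy follow" panel that is pulled back into view whenever it drifts outside the angular limits. A sketch, assuming a Unity scene with an HMD camera transform; the smoothing speed and default distance are assumptions to tune:

```csharp
using UnityEngine;

public class ComfortZoneFollower : MonoBehaviour
{
    public Transform head;          // main camera / HMD transform
    public float distance = 1.8f;   // inside the 1.5–2 m reading zone
    public float maxYaw = 35f;      // ±35° horizontal comfort limit
    public float maxPitch = 15f;    // ±15° vertical comfort limit
    public float followSpeed = 4f;  // assumption: smoothing factor, tune to taste

    void LateUpdate()
    {
        // Direction from head to panel, expressed as head-local yaw/pitch.
        Vector3 local = head.InverseTransformDirection(
            (transform.position - head.position).normalized);
        float yaw = Mathf.Atan2(local.x, local.z) * Mathf.Rad2Deg;
        float pitch = -Mathf.Asin(Mathf.Clamp(local.y, -1f, 1f)) * Mathf.Rad2Deg;

        // Clamp the angles into the comfort zone and rebuild the target position.
        yaw = Mathf.Clamp(yaw, -maxYaw, maxYaw);
        pitch = Mathf.Clamp(pitch, -maxPitch, maxPitch);
        Vector3 dir = head.TransformDirection(
            Quaternion.Euler(pitch, yaw, 0f) * Vector3.forward);
        Vector3 target = head.position + dir * distance;

        // Drift smoothly instead of snapping, and keep the panel facing the user.
        transform.position = Vector3.Lerp(
            transform.position, target, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```

The rotation points the canvas forward away from the head, which is the orientation at which World Space UI text reads correctly.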

XR Interaction Toolkit: Ray Interactor and Direct Interactor

XR Interaction Toolkit (XRI) is the standard for VR interactions in Unity. For UI it provides TrackedDeviceGraphicRaycaster (a replacement for GraphicRaycaster on World Space Canvases) and the XR Ray Interactor on the controller.

Basic setup: on the Canvas, add TrackedDeviceGraphicRaycaster and remove the standard GraphicRaycaster. On the controller, an XR Ray Interactor with Line Type = Straight Line, or Projectile Line for a more organic look. UI Press Threshold controls ray interaction sensitivity — usually 0.1–0.2 for trigger-based interaction.
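The canvas side of this setup can be done in code as well as in the Inspector. A minimal sketch assuming XRI 2.x (the helper class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

public static class CanvasXRSetup
{
    // Converts an existing Canvas to XRI ray interaction at runtime.
    public static void MakeXRInteractable(Canvas canvas)
    {
        // Spatial UI requires World Space rendering.
        canvas.renderMode = RenderMode.WorldSpace;

        // Remove the screen-space raycaster if present...
        var standard = canvas.GetComponent<GraphicRaycaster>();
        if (standard != null)
            Object.Destroy(standard);

        // ...and add the XRI raycaster so XR Ray Interactors can hit this canvas.
        if (canvas.GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            canvas.gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
    }
}
```

In practice this is most useful when canvases are instantiated from prefabs that were originally built for flat screens.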

A common problem with the EventSystem in XRI: the standard StandaloneInputModule conflicts with XRUIInputModule. The scene should have exactly one EventSystem, driven by XRUIInputModule. Importing the XRI package sometimes configures this automatically, but with multiple canvases conflicts can still arise — especially if part of the UI was created before XRI integration.
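A simple runtime check catches most of these misconfigurations early. A sketch assuming XRI 2.x (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

public class XRUIEventSystemCheck : MonoBehaviour
{
    void Awake()
    {
        var systems = FindObjectsOfType<EventSystem>();

        // Rule 1: exactly one EventSystem per scene.
        if (systems.Length > 1)
            Debug.LogWarning($"Scene has {systems.Length} EventSystems; keep only one.");

        foreach (var es in systems)
        {
            // Rule 2: XRUIInputModule drives VR UI, not StandaloneInputModule.
            if (es.GetComponent<StandaloneInputModule>() != null)
                Debug.LogWarning(
                    $"{es.name} uses StandaloneInputModule; replace it with XRUIInputModule.", es);
            if (es.GetComponent<XRUIInputModule>() == null)
                Debug.LogWarning($"{es.name} is missing XRUIInputModule.", es);
        }
    }
}
```

Dropping this on any object in the scene turns a silent "buttons don't react" bug into an explicit console warning.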

World Space Canvas and texture resolution

A World Space Canvas renders in world coordinates, scaled via the CanvasScaler. The canvas texture resolution depends on the physical Canvas size × Reference Pixels Per Unit. The default Reference Pixels Per Unit is 100, so a 1×0.5 m Canvas gets only 100×50 logical units — which looks blurry when rendered in VR.

Correct setup: the CanvasScaler's Scale With Screen Size mode doesn't apply to World Space — for a World Space Canvas the scaler operates in World mode, where Reference Pixels Per Unit and Dynamic Pixels Per Unit control texture density. Render text with TextMeshPro — its signed distance field rendering holds up far better than standard Unity Text when scaled in space.

Additionally: for a World Space Canvas in VR, make sure Pixel Perfect is disabled (it breaks spatial rendering) and check Dynamic Pixels Per Unit on the CanvasScaler — for VR you usually need a value of 2–4 for sharp text.
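These two settings can be applied in one place. A minimal sketch (the helper name and default value of 3 are assumptions within the 2–4 range from the text):

```csharp
using UnityEngine;
using UnityEngine.UI;

public static class VRCanvasSharpness
{
    public static void Apply(Canvas canvas, float dynamicPpu = 3f)
    {
        // Pixel snapping is meaningless for spatial UI and can break rendering.
        canvas.pixelPerfect = false;

        // Higher Dynamic Pixels Per Unit rasterizes dynamic elements
        // (primarily text) at a denser resolution for sharpness in VR.
        var scaler = canvas.GetComponent<CanvasScaler>();
        if (scaler != null)
            scaler.dynamicPixelsPerUnit = dynamicPpu;
    }
}
```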

Forearm interface: "wristwatch" pattern

One of the most widely adopted patterns in VR UI is an interface attached to the forearm of the non-dominant hand. The player raises the left hand and turns the palm toward the face — a panel with the inventory, map, or settings appears.

Implementation: the Canvas is a child of the LeftHandController with a local offset and rotation. Activation via palm-up detection: take Vector3.Dot(leftHand.up, Camera.main.transform.forward) — when the value rises above a threshold (~0.7), the hand is turned palm toward the face, so show the UI.

Important: the UI must appear smoothly (lerp alpha + scale over 0.15–0.2 s), otherwise an instant appearance right in front of the eyes is startling. It also shouldn't appear accidentally during combat movements — require a minimum pose hold time (0.3–0.5 s).
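Putting the detection, hold threshold, and smooth fade together, a sketch of the pattern (the 0.7 dot threshold and the 0.4 s / 0.18 s timings sit inside the ranges above; field names are illustrative):

```csharp
using UnityEngine;

public class WristMenu : MonoBehaviour
{
    public Transform leftHand;        // left controller / hand transform
    public CanvasGroup panel;         // the wrist UI, a child of the hand
    public float dotThreshold = 0.7f; // palm-facing threshold from the text
    public float holdTime = 0.4f;     // within the 0.3–0.5 s range
    public float fadeTime = 0.18f;    // within the 0.15–0.2 s range

    float heldFor;

    void Update()
    {
        // Palm facing the player: the hand's up axis aligns with camera forward.
        float dot = Vector3.Dot(leftHand.up, Camera.main.transform.forward);

        // Hold-time threshold filters out accidental flicks during combat.
        heldFor = dot > dotThreshold ? heldFor + Time.deltaTime : 0f;
        bool visible = heldFor >= holdTime;

        // Smooth appearance: move alpha and scale together instead of popping in.
        float target = visible ? 1f : 0f;
        panel.alpha = Mathf.MoveTowards(panel.alpha, target, Time.deltaTime / fadeTime);
        panel.transform.localScale = Vector3.one * Mathf.Lerp(0.8f, 1f, panel.alpha);

        // Don't catch rays while (mostly) invisible.
        panel.blocksRaycasts = panel.alpha > 0.5f;
    }
}
```

Driving scale from alpha keeps the two animations in lockstep with a single state variable.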

Haptic feedback and visual confirmation

In VR there is no mouse click sound and no physical button travel. Without feedback, the player doesn't know whether the ray hit the button and whether the press registered.

The minimum set: a visual hover state (the button changes color when the ray aims at it), a visual pressed state (the button "sinks" by 0.05–0.1 units along Z), and a haptic pulse on press via XRBaseController.SendHapticImpulse(0.3f, 0.05f). Fifty milliseconds of light vibration is enough for confirmation. Without haptics, buttons in VR feel "dead".
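All three feedback channels can hang off XRI's interaction callbacks. A sketch assuming the XRI 2.x event API, for a 3D button driven by an XR Simple Interactable or similar (component wiring and the sink depth are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class ButtonFeedback : MonoBehaviour
{
    public Renderer visual;
    public Color normal = Color.white;
    public Color hover = Color.cyan;
    public float sinkDepth = 0.005f; // assumption: adjust to your button's scale

    // Wire these to the interactable's Hover Entered / Exited events.
    public void OnHoverEntered(HoverEnterEventArgs args) => visual.material.color = hover;
    public void OnHoverExited(HoverExitEventArgs args) => visual.material.color = normal;

    // Wire these to Select Entered / Exited.
    public void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Pressed state: the button "sinks" slightly along its local Z.
        transform.localPosition += new Vector3(0f, 0f, -sinkDepth);

        // Haptic pulse: 0.3 amplitude for 50 ms, as in the text.
        if (args.interactorObject is XRBaseControllerInteractor interactor)
            interactor.xrController.SendHapticImpulse(0.3f, 0.05f);
    }

    public void OnSelectExited(SelectExitEventArgs args)
    {
        transform.localPosition -= new Vector3(0f, 0f, -sinkDepth);
    }
}
```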

| Interface type | Estimated timeline |
| --- | --- |
| Basic World Space panel (Ray Interactor) | 3–5 days |
| Set of UI components (menu, inventory, HUD) | 1–3 weeks |
| Complex interface (wrist, gestures, zoning) | 3–6 weeks |

The cost is calculated after design mockups are ready and the interaction types in the project are understood.