3D Diegetic UI Design for AR/VR Games

Our video game development company runs independent projects, co-develops games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms and lets us build a product that matches the customer's vision and players' preferences.

Design of 3D Diegetic UI for AR/VR Games

Screen-space UI is a pain in VR. A Canvas in Overlay or Screen Space – Camera mode renders on top of everything, doesn't participate correctly in stereo projection, and creates depth conflicts for the user. A classic HUD with an HP bar in the corner of the screen reads in VR like a sticker glued to the headset lens, destroying immersion in seconds.

Diegetic UI solves this radically: the interface becomes part of the world. Health is shown on a weapon display, the map lives on a wrist device, and dialogue options appear as holograms in front of the NPC. All of these are objects in world space, with position, depth, and physics.

Technical problems when moving to a World Space Canvas

In Unity, the standard path is to switch the Canvas to World Space mode and place it in the scene. It sounds simple; in practice there are several pitfalls.

Text readability under scaling. A World Space Canvas with TextMeshPro needs calibration: a wrong Font Size to Canvas Scale ratio makes text either blurry (rasterized too finely) or forces a huge polygon budget for SDF rendering across a large area. Text also reads differently in VR depending on IPD, so it must be tested in a real headset, not in Play Mode.
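The calibration above can be sketched in a small setup component. A minimal sketch, assuming a panel authored at 1000 × 600 UI units that should end up 0.30 × 0.18 m in the world; the numbers and component name are illustrative, not a fixed recipe.

```csharp
// Sketch: sizing a World Space Canvas so TextMeshPro stays sharp.
// All concrete numbers here are illustrative assumptions.
using UnityEngine;
using TMPro;

public class WorldSpacePanelSetup : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // 1000 UI units should span 0.30 m  =>  scale = 0.0003
        var rt = canvas.GetComponent<RectTransform>();
        rt.sizeDelta = new Vector2(1000f, 600f);
        rt.localScale = Vector3.one * 0.0003f;

        // With this scale, fontSize 36 yields roughly 10 mm glyphs in
        // world units -- comfortably above the legibility floor at 0.5 m.
        foreach (var tmp in GetComponentsInChildren<TMP_Text>())
            tmp.fontSize = Mathf.Max(tmp.fontSize, 36f);
    }
}
```

Working in world units first (meters), then deriving the Canvas scale, keeps the font size budget predictable instead of tweaking it per panel.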

Occlusion and depth fighting. A UI panel positioned in space conflicts with scene geometry. Either use a separate layer with custom depth writes, or go with a transparent shader and a correctly set Render Queue. A common error: the UI disappears behind walls or flickers where it meets a surface.
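One way to implement the Render Queue approach is to nudge the UI material between the opaque and transparent passes. A sketch under assumptions: the queue value and the built-in "UI" layer are conventions to verify against your render pipeline.

```csharp
// Sketch: keep a world-space UI panel from z-fighting with scene geometry.
// The queue value (2995) is an assumption -- check your pipeline's shaders.
using UnityEngine;

public class UIRenderQueueSetup : MonoBehaviour
{
    [SerializeField] Material uiMaterial;

    void Awake()
    {
        // Geometry = 2000, Transparent = 3000. Rendering at 2995 draws the
        // panel after opaque walls but before standard transparents, so it
        // wins ties at surface junctions instead of flickering.
        uiMaterial.renderQueue = 2995;

        // Alternative: put the panel on its own layer and render it in a
        // separate pass with custom depth handling.
        gameObject.layer = LayerMask.NameToLayer("UI");
    }
}
```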

Interactivity through the XR Interaction Toolkit. The standard Event System doesn't work with VR controllers out of the box. You need TrackedDeviceGraphicRaycaster instead of GraphicRaycaster, an XRUIInputModule configured on the Event System, and hover/select logic for each interactive element. In XR Interaction Toolkit 2.x this is handled through the IXRInteractable interface and XRSimpleInteractable on UI components, but it still requires explicit interaction layer setup.
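The minimal wiring for controller-driven UI looks roughly like this. The component names are from the XR Interaction Toolkit; the runtime setup order is an assumption based on a typical project layout, and in practice this is usually configured in the Inspector rather than in code.

```csharp
// Sketch: swap the screen-space raycaster for the tracked-device one
// on a world-space canvas (XR Interaction Toolkit 2.x).
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

public class XRUICanvasSetup : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // GraphicRaycaster only understands screen-space pointers,
        // not tracked controllers -- replace it.
        var standard = GetComponent<GraphicRaycaster>();
        if (standard != null) Destroy(standard);
        gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();

        // Separately, the scene's EventSystem must use XRUIInputModule
        // (not StandaloneInputModule) for ray-based hover/click to fire.
    }
}
```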

How 3D interface design is built

First comes art direction. A diegetic UI must fit organically into the game world: a sci-fi shooter, a fantasy RPG, and a medical VR simulator require fundamentally different visual logic.

Design starts with information architecture: what the player always sees, what appears on request, and what shows up only in a specific context. This determines UI positioning in space: always-visible elements are attached to the hand or weapon, while contextual ones appear in the world when the player interacts with an object.

Then comes a greybox model in a 3D editor (Blender or Maya) with exact sizes in meters. In VR, measurement units matter: a 0.3 × 0.2 m panel at 0.5 m from the player reads one way; the same panel at 1.5 m reads completely differently.

Final assets are exported to FBX or glTF for engine import. Interactive panels get separate components for hover states, press animation, and feedback visualization.

In AR (via AR Foundation + ARKit/ARCore) the logic is slightly different: the UI must anchor correctly to AR planes or objects via ARAnchor, account for occlusion by the real world (if depth occlusion is enabled), and behave sanely under unstable tracking.
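Anchoring a panel to a detected plane can be sketched as follows. A minimal sketch using AR Foundation's raycast and anchor managers; the prefab reference, method name, and hit filtering are illustrative assumptions.

```csharp
// Sketch: anchor a diegetic panel to a detected AR plane so it stays
// stable as tracking refines, instead of drifting with the session origin.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class AnchoredPanelSpawner : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] GameObject panelPrefab; // assumed panel prefab

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    public void PlaceAt(Vector2 screenPoint)
    {
        if (!raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
            return;

        // AttachAnchor ties the panel to the plane's tracking data.
        var hit = hits[0];
        var plane = hit.trackable as ARPlane;
        var anchor = anchorManager.AttachAnchor(plane, hit.pose);
        if (anchor != null)
            Instantiate(panelPrefab, anchor.transform);
    }
}
```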

Typical design errors in Diegetic UI

  • Fonts that are too small. The minimum comfortable size in VR is around 0.5° of angular height; at 0.5 m that works out to roughly 4–5 mm symbol height in world units.
  • Elements too close to the camera (< 0.3 m). This causes discomfort from the vergence–accommodation conflict.
  • UI without physical feedback. In VR, a button press without a tactile or visual response feels broken, even if it technically works.
  • No state machine for elements. A button needs explicit states: Normal, Highlighted, Pressed, Disabled, with animated transitions between them.
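The last point can be sketched as a simple enum-driven state machine. The component, field, and color choices are illustrative assumptions; a shipped version would animate transitions and pair them with controller haptics.

```csharp
// Sketch: explicit states for a diegetic button (names are illustrative).
using UnityEngine;

public class DiegeticButton : MonoBehaviour
{
    public enum State { Normal, Highlighted, Pressed, Disabled }

    [SerializeField] Renderer visual;
    [SerializeField] Color normal = Color.white;
    [SerializeField] Color highlighted = new Color(0.8f, 0.9f, 1f);
    [SerializeField] Color pressed = new Color(0.5f, 0.7f, 1f);
    [SerializeField] Color disabled = Color.gray;

    State current = State.Normal;

    public void SetState(State next)
    {
        // Disabled ignores everything except an explicit re-enable.
        if (current == State.Disabled && next != State.Normal) return;
        current = next;
        visual.material.color = next switch
        {
            State.Highlighted => highlighted,
            State.Pressed     => pressed,
            State.Disabled    => disabled,
            _                 => normal,
        };
    }
}
```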
Task Timeline
  • Design of one UI block (wrist interface / panel): 3–6 work days
  • Complete Diegetic UI system for one scene: 2–4 weeks
  • AR Foundation anchor integration and occlusion: 1–2 weeks

Cost is quoted individually after analysis of the requirements and target platform.