VR/AR Graphics Optimization

Our video game development company runs independent projects, co-develops games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms, letting us build a product that matches the customer's vision and players' preferences.
Complexity: Complex
Timeline: from 3 business days to 2 weeks

Graphics Optimization for VR/AR Devices

In VR every frame is rendered twice, once for the left eye and once for the right. On Quest 2 that is 72 fps × 2 eyes = 144 rendered views per second. If the project has not moved to Single Pass Instanced Rendering and draws more than 100 batches per frame, the stereoscopic scene simply will not fit into the GPU time slot. Headaches and nausea in users are a direct consequence of dropped frames.
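The arithmetic above can be sketched as a quick back-of-envelope check. The 72 Hz refresh rate and the 100-batch threshold come from the text; the rest is simple derivation:

```python
# Stereo frame budget for Quest 2 at 72 Hz.
REFRESH_HZ = 72
EYES = 2

frame_budget_ms = 1000 / REFRESH_HZ      # ~13.9 ms to render BOTH eyes
views_per_second = REFRESH_HZ * EYES     # 144 rendered views every second

# Multi Pass stereo submits every batch once per eye; Single Pass
# Instanced submits it once and instances the geometry across eyes.
batches = 100                            # the threshold mentioned above
multi_pass_submissions = batches * EYES  # 200 submissions per frame
single_pass_submissions = batches        # 100 submissions per frame

print(f"{frame_budget_ms:.1f} ms budget, {views_per_second} views/s")
print(f"{multi_pass_submissions} vs {single_pass_submissions} submissions/frame")
```

The key takeaway: the budget of roughly 13.9 ms covers both eyes, which is why halving the submission count with Single Pass Instanced matters so much.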

AR projects on ARCore/ARKit add camera capture, plane detection, and occlusion meshes on top of that. The device is already loaded with CPU tracking work before your renderer issues its first draw call.

Why Standard "Reduce Polygons" Advice Doesn't Work in XR

Polygon reduction is the last thing we do. First we look at what actually kills performance on XR devices.

Overdraw in mobile VR. On Adreno and Mali GPUs overdraw is disproportionately expensive: a tiled renderer does not cope well with many semi-transparent objects stacked on top of each other. Standard particle systems with Additive blending over an HDR skybox are a typical frame-rate killer on Quest. Unity's Frame Debugger shows this instantly: look for the red zones in the overdraw view.
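Why stacked transparency hurts can be shown with rough numbers. The per-eye resolution below is Quest 2's default eye-buffer size; the four-layer overdraw figure is hypothetical:

```python
# Fragment work scales with average overdraw: fragments shaded = pixels x layers.
EYE_W, EYE_H = 1832, 1920                # Quest 2 default per-eye buffer

def shaded_fragments(avg_overdraw: float) -> int:
    """Approximate fragment-shader invocations for one eye."""
    return int(EYE_W * EYE_H * avg_overdraw)

opaque = shaded_fragments(1.0)           # every pixel shaded exactly once
particles = shaded_fragments(4.0)        # 4 stacked additive layers (hypothetical)
print(particles // opaque)               # 4x the fragment work for the same image
```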

Foveated Rendering misconfigured or not enabled at all. Fixed Foveated Rendering in the Meta XR SDK cuts GPU load by 15-30% with no noticeable loss of image quality, but only if the right level (Low/Medium/High) is chosen for the specific content. In dynamic scenes with fast camera movement the High level produces artifacts at the periphery.
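A quick sanity check for whether FFR brings a scene back under budget. The 15-30% range is from the text; mapping specific percentages to the Low/Medium/High levels, and the 17 ms starting GPU time, are assumptions for illustration:

```python
# Does enabling Fixed Foveated Rendering bring GPU time back under the 72 Hz budget?
FRAME_BUDGET_MS = 1000 / 72              # ~13.9 ms

def gpu_time_with_ffr(gpu_ms: float, savings: float) -> float:
    """GPU time after a fractional FFR saving (savings in 0..1)."""
    return gpu_ms * (1.0 - savings)

measured_ms = 17.0                       # hypothetical pre-FFR GPU time
# The level-to-savings mapping below is an assumption within the 15-30% range.
for level, savings in [("Low", 0.15), ("Medium", 0.22), ("High", 0.30)]:
    t = gpu_time_with_ffr(measured_ms, savings)
    status = "OK" if t <= FRAME_BUDGET_MS else "over budget"
    print(f"{level}: {t:.1f} ms -> {status}")
```

With these assumed numbers Low is still over budget while Medium already fits, which is why level selection is content-specific rather than "always High".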

Single Pass Instanced breaks with custom shaders. If the project contains even one shader without UNITY_VERTEX_INPUT_INSTANCE_ID and UNITY_SETUP_INSTANCE_ID, the whole render path falls back automatically to Multi Pass, doubling the load. Check via XR Plug-in Management → Rendering Stats.
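A small helper script for hunting down the offending shaders. This is a sketch, not part of any official tooling: it is a plain substring check, and surface shaders or shared includes can provide the macros indirectly, so treat every hit as a candidate for manual review:

```python
from pathlib import Path

REQUIRED_MACROS = ("UNITY_VERTEX_INPUT_INSTANCE_ID", "UNITY_SETUP_INSTANCE_ID")

def shaders_missing_instancing(root: str) -> list[str]:
    """Flag .shader files under root that mention neither instancing macro."""
    flagged = []
    for path in sorted(Path(root).rglob("*.shader")):
        text = path.read_text(errors="ignore")
        if not any(macro in text for macro in REQUIRED_MACROS):
            flagged.append(str(path))
    return flagged
```

Run it over the project's Assets folder and review the flagged files before assuming they actually break Single Pass Instanced.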

How We Work with XR Projects

We start with profiling on the target hardware: not in the Editor, but on the device. RenderDoc for Android, Xcode Instruments for iOS/visionOS, OVR Metrics Tool for Meta. An emulator will not show the real memory and bandwidth latencies.

A typical case from practice. A Quest 3 project: architectural visualization, 8 rooms, PBR materials. First build: 45 fps in the centre of the scene, 28 fps looking toward the window. OVR Metrics Tool analysis showed 340 Draw Calls and 4 layers of overdraw on the window glass. The fix: GPU Instancing for repeating furniture (24 chairs → 1 Draw Call), replacing the Standard Transparent glass with a custom shader using Surface Type Opaque plus alpha clipping, and enabling Fixed Foveated Rendering at the Medium level. The result: a stable 72 fps and no more thermal throttling.
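The numbers in that case translate directly into frame-time terms. Only 45, 28 and 72 fps, 340 draw calls and the 24 chairs come from the case; everything else is derived:

```python
# Frame times behind the fps figures, plus the draw-call saving from
# instancing the 24 chairs.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

print(f"{frame_time_ms(45):.1f} ms")   # centre of the scene
print(f"{frame_time_ms(28):.1f} ms")   # looking at the window
print(f"{frame_time_ms(72):.1f} ms")   # the 72 fps target

calls_after_chairs = 340 - (24 - 1)    # 24 chair draws collapse into 1
print(calls_after_chairs)              # 317 after instancing the chairs alone
```

Instancing the chairs alone removes only 23 of the 340 calls; the bulk of the win in this case came from fixing the glass overdraw and enabling FFR, not from draw-call count.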

For AR projects we also work on occlusion separately: AR Foundation's Environment Depth requires a correct depth setup in shaders, otherwise virtual objects "shine through" real surfaces.

Tools we use:

  • Unity Profiler + Frame Debugger (GPU Usage module)
  • RenderDoc (Android Vulkan/OpenGL ES)
  • Meta Quest Developer Hub + OVR Metrics Tool
  • XR Interaction Toolkit Profiling Guidelines
  • ARM Mobile Studio (Streamline) for Adreno/Mali deep-dive

We optimize Shader Graph by hand: we check the instruction count in the Preview window, remove unnecessary texture sample operations, and move computation from the Fragment to the Vertex stage wherever interpolation is acceptable.
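Why moving work to the vertex stage pays off comes down to invocation counts. The per-eye resolution is Quest 2's default; the mesh size and screen coverage below are hypothetical:

```python
# Per-frame invocation counts: fragment work vastly outnumbers vertex work.
EYE_W, EYE_H = 1832, 1920             # per-eye buffer (Quest 2 default)
coverage = 0.5                        # mesh covers half the buffer (hypothetical)

fragment_invocations = int(EYE_W * EYE_H * coverage)
vertex_invocations = 10_000           # a mid-poly mesh (hypothetical)

# Every operation moved from Fragment to Vertex runs this many times fewer:
print(fragment_invocations // vertex_invocations)
```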

Stages of XR Graphics Optimization Work

First we make a build in the Release configuration and capture baseline metrics: fps, GPU time per frame, Draw Calls, memory footprint. Without a baseline it is impossible to assess the result.
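A minimal sketch of how such a baseline might be recorded and compared. The field set is a suggestion, and all values except the 45 fps / 340 draw calls / 72 fps figures from the case above are hypothetical:

```python
from dataclasses import dataclass, asdict

@dataclass
class Baseline:
    """Metrics captured on-device from a Release build."""
    fps: float
    gpu_time_ms: float
    draw_calls: int
    memory_mb: float

def delta(before: Baseline, after: Baseline) -> dict:
    """Per-metric change between two measurement passes."""
    b, a = asdict(before), asdict(after)
    return {k: round(a[k] - b[k], 1) for k in b}

first_build = Baseline(fps=45.0, gpu_time_ms=22.2, draw_calls=340, memory_mb=1350.0)
optimized = Baseline(fps=72.0, gpu_time_ms=13.4, draw_calls=120, memory_mb=1100.0)
print(delta(first_build, optimized))
```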

Then comes a scene audit: object hierarchy, unique material count, lighting setup (static/dynamic), presence of real-time shadows (almost always forbidden on mobile VR), LOD groups.

After the audit we prepare an optimization plan prioritized by impact versus effort. Implementation proceeds in iterations with intermediate measurements: it is important not to lose the baseline and to understand exactly which change produced the improvement.
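Impact-versus-effort prioritization can be sketched as a simple ratio ranking. The tasks and 1-10 scores below are hypothetical examples, not a real plan:

```python
# Rank optimization tasks by impact/effort ratio (scores are hypothetical 1-10).
plan = [
    ("Enable Single Pass Instanced", 9, 2),        # (name, impact, effort)
    ("GPU Instancing for repeated props", 6, 3),
    ("Bake lighting, drop real-time shadows", 8, 5),
    ("Reduce polygons on hero assets", 3, 7),
]
ranked = sorted(plan, key=lambda t: t[1] / t[2], reverse=True)
for name, impact, effort in ranked:
    print(f"{impact / effort:4.1f}  {name}")
```

Even with made-up scores the ranking reproduces the point from earlier in the article: polygon reduction lands at the bottom of the list.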

The final stage is thermal testing: 20-30 minutes on the device under warm conditions. Thermal throttling on mobile chips (Snapdragon XR2) kicks in earlier than you would expect.

Task scale                                     | Estimated timeline
Audit + report, no fixes                       | 2–4 days
Optimization of one scene (up to 500 objects)  | 1–2 weeks
Full project (5–15 scenes, custom shaders)     | 3–6 weeks
Porting PC VR → standalone Quest               | 4–8 weeks

The cost is calculated individually after a project audit and a review of the target platforms.

Common XR Graphics Preparation Mistakes

Real-time shadows on mobile VR. Cascaded Shadow Maps with 4 cascades on Quest guarantee dropped frames. Replace them with baked lightmaps or a blob shadow (a simple projector).

MSAA above 4x left enabled. On tiled GPUs (Adreno) 8x MSAA wrecks performance. In the XR Project Settings set 4x at most, and 2x in complex scenes.
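On a tiled GPU the MSAA samples live in on-chip tile memory, so sample storage (and resolve work) scales linearly with the sample count. A rough sketch; the tile size is illustrative and varies by GPU:

```python
# Tile-memory color storage per MSAA level on a tiled GPU (illustrative).
TILE_PIXELS = 32 * 32             # pixels per tile; actual size varies by GPU
BYTES_PER_SAMPLE = 4              # RGBA8 color

for msaa in (1, 2, 4, 8):
    tile_bytes = TILE_PIXELS * BYTES_PER_SAMPLE * msaa
    print(f"{msaa}x MSAA: {tile_bytes} bytes of tile color storage")
```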

Textures without mipmaps. In VR, objects sit at different distances from the camera simultaneously. Without mipmaps the GPU samples the full-resolution texture even for distant objects, and bandwidth grows for no reason.
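The bandwidth difference is easy to quantify. The texture and mip sizes below are illustrative:

```python
# Texture data touched for a distant object, with and without mipmaps.
def texture_bytes(w: int, h: int, bpp: int = 4) -> int:
    """Uncompressed size of one texture level (bpp = bytes per pixel)."""
    return w * h * bpp

full_res = texture_bytes(2048, 2048)   # always sampled when mipmaps are off
far_mip = texture_bytes(64, 64)        # the mip level actually needed at distance
print(full_res // far_mip)             # 1024x more data pulled through the bus
```

The mip chain itself costs only about a third more texture memory, which is a cheap trade against this kind of bandwidth waste.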

Physics colliders more complex than necessary. Mesh Colliders on interior objects where a Box or Capsule would be enough. In VR the physics tick also eats into frame time.