VR/AR Development Services for Games & Applications

Our video game development company runs independent projects, co-develops games with clients, and provides additional operational services. Our team's expertise covers every gaming platform, allowing us to deliver a product that matches the customer's vision and players' preferences.

VR and AR Development

The first time a team launches a project in a VR headset, they face a common issue: technically everything works, but users get motion sickness, hand tracking lags, or the scene judders at the edges of vision. These aren't bugs in the traditional sense—they result from the fact that VR/AR development requires a fundamentally different approach to rendering, interaction, and UX from the project's inception.

Platforms and SDKs

We work with the entire current stack:

Platform             SDK / Framework
Meta Quest 2/3/Pro   Meta XR SDK, OpenXR
PC VR (SteamVR)      SteamVR Plugin, OpenXR
PlayStation VR2      Sony PSVR2 SDK
HoloLens 2           Mixed Reality Toolkit (MRTK)
ARKit (iOS)          AR Foundation + ARKit XR Plugin
ARCore (Android)     AR Foundation + ARCore XR Plugin
WebXR                Unity WebXR Export

OpenXR is our baseline wherever possible—it provides cross-platform compatibility between Meta, Valve Index, HP Reverb, and other PC VR devices. On top of OpenXR, we build with XR Interaction Toolkit (Unity) or VR Expansion Plugin (Unreal).

VR Interaction: Grab, Throw, Teleport

This is the most underestimated part of VR development. Clients often see it as "just hand animation," but it's actually a complex system where physical correctness, responsiveness, and comfort conflict. Let's break it down.

Grab (Object Picking)

XR Interaction Toolkit provides three Interactable types for grabbing:

  • XRGrabInteractable — standard grab; object follows the controller via physics joint or direct position/rotation
  • XRSimpleInteractable — for objects without physical movement (buttons, levers)
  • Custom Interactable via inheritance from XRBaseInteractable

The key decision when implementing grab: Kinematic vs Physics-based movement:

Kinematic (trackPosition/trackRotation via Transform): object instantly follows the hand. Responsive but unrealistic—objects pass through walls. Works for most casual VR and experience projects.

Physics-based (via Rigidbody + Joint): object is held by a physics joint. Realistic interaction with the environment; objects properly collide with tables and walls. The problem: with fast movements, the joint can "stretch," causing jitter or objects flying out of hands. Fixed via velocity damping, max joint force, and a breakage detector for extreme speeds.
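The breakage detector mentioned above can be sketched engine-agnostically. This is an illustrative Python model (the project code itself would be Unity C#); the thresholds and function name are assumptions, not values from a real SDK:

```python
import math

# Illustrative thresholds; in practice these are tuned per project.
BREAK_DISTANCE = 0.25   # meters the joint may stretch before we intervene
MAX_HAND_SPEED = 6.0    # m/s above which the joint is no longer trusted

def should_break_grab(hand_pos, object_pos, hand_speed):
    """Return True when a physics-based grab should release (or snap the
    object back to the hand) instead of letting the stretched joint jitter."""
    stretch = math.dist(hand_pos, object_pos)
    return stretch > BREAK_DISTANCE or hand_speed > MAX_HAND_SPEED
```

Called once per physics step while an object is held; on True, the grab either detaches or teleports the object back to the Attach Transform.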

Attach Transform — often overlooked. Each Interactable needs a properly configured Attach Transform (the point where the hand "sticks"). Without it, a gun grip ends up at the center of the mesh instead of where it should be held.

For two-handed weapons and tools, a separate two-hand grab system: the lead hand controls position, the second controls orientation. XR Interaction Toolkit supports this by setting the XRGrabInteractable's Select Mode to Multiple, or via custom logic with two Attach Points.
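The orientation half of a two-hand grab reduces to simple vector math: the weapon aims along the direction from the grip hand to the support hand. A minimal Python sketch of that idea (illustration only; the actual implementation would be Unity C# working with Quaternions):

```python
import math

def two_hand_aim(lead_pos, second_pos):
    """Direction a two-handed weapon should point: from the lead (grip)
    hand toward the second (support) hand, normalized. Position always
    comes from the lead hand; only orientation uses the second hand."""
    d = [s - l for l, s in zip(lead_pos, second_pos)]
    length = math.sqrt(sum(c * c for c in d))
    if length < 1e-6:            # hands coincide: keep previous orientation
        return None
    return tuple(c / length for c in d)
```

In the engine, this direction would feed a look-rotation each frame while both hands are selecting the interactable.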

Throw (Object Release)

Physically correct throwing in VR is non-trivial. The issue: Rigidbody.velocity at controller release reflects the instantaneous velocity, which is often incorrect due to tracking discretization. The user flicks their wrist—but the object flies half as fast as expected.

Solution: velocity smoothing over the last N frames (typically 5-10 frames, ~80-160 ms at 60 Hz) before release. XR Interaction Toolkit exposes this through XRGrabInteractable's throw smoothing settings (throwSmoothingDuration, throwSmoothingCurve). Additionally, apply a velocity scaling multiplier (throwVelocityScale): a small speedup (1.2-1.5x) makes throws subjectively more satisfying.

Angular velocity (for objects that should spin in flight) is smoothed the same way.
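The smoothing itself is just an average over a ring buffer of per-frame velocities, scaled on release. A Python sketch of the technique (class name and defaults are illustrative, not an SDK API; real projects would use the built-in XRGrabInteractable settings):

```python
from collections import deque

class VelocitySmoother:
    """Averages controller velocity over the last N frames, then applies a
    small scaling multiplier on release (values from the text: 5-10 frames,
    1.2-1.5x). Angular velocity can be smoothed with the same structure."""
    def __init__(self, window=8, scale=1.3):
        self.samples = deque(maxlen=window)  # ring buffer of frame velocities
        self.scale = scale

    def add_frame(self, prev_pos, pos, dt):
        # Per-frame velocity derived from tracked controller positions.
        self.samples.append(tuple((b - a) / dt for a, b in zip(prev_pos, pos)))

    def release_velocity(self):
        n = len(self.samples)
        if n == 0:
            return (0.0, 0.0, 0.0)
        return tuple(self.scale * sum(v[i] for v in self.samples) / n
                     for i in range(3))
```

Averaging suppresses the single-frame tracking spikes that make raw release velocity unreliable.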

Teleport (Movement)

Locomotion is the primary cause of motion sickness for inexperienced VR users. Teleportation is the standard navigation method when smooth movement is undesirable.

Components from XR Interaction Toolkit: TeleportationArea, TeleportationAnchor, TeleportationProvider. Basic implementation works out-of-the-box, but for production we refine:

  • Teleport arc (XRRayInteractor with a Projectile Curve line type): an arc looks more natural than a straight ray and is easier for users to aim
  • Valid landing zone: visual indicator changes color on obstacles—red/green
  • Fade transition: smooth screen fade (black fade) before teleport reduces disorientation
  • Rotation snapping: after teleport, offer snap-rotation at 45° or 90° instead of smooth—reduces motion sickness risk

For projects requiring smooth locomotion (action games, simulators), use comfort settings: vignette during movement, reduced FOV during acceleration. Settings are exposed to users in the menu—different people have different sensitivity thresholds.
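The vignette logic is typically a simple ramp from movement speed to vignette strength. An illustrative Python sketch (thresholds are hypothetical defaults, since the text notes they must be user-adjustable):

```python
def vignette_strength(speed, threshold=1.0, full_at=4.0):
    """Comfort vignette: 0 below a walking-speed threshold, ramping linearly
    to 1 (maximum FOV reduction) at full_at m/s. Both values are illustrative
    and would be exposed in the settings menu per user."""
    if speed <= threshold:
        return 0.0
    return min(1.0, (speed - threshold) / (full_at - threshold))
```

The returned value would drive a shader or post-process effect that darkens the periphery of each eye's frame.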

AR: Plane Tracking and Environmental Interaction

AR introduces a different class of problems—working with real, unpredictable environments.

AR Foundation — a cross-platform layer over ARKit and ARCore. Most basic features (plane detection, raycasting, image tracking, face tracking) are available through a single API.

Plane Detection

ARPlaneManager detects horizontal and vertical planes. Practical considerations:

  • Initialization takes time — the user must scan the room while the system builds a map. Explicit onboarding is needed with instructions like "slowly sweep the camera across surfaces"
  • Planes are unstable — their bounds and position update as data accumulates. Objects placed on a plane should be parented to the ARPlane transform rather than fixed in world coordinates
  • Plane merging — two detected floor segments can merge into one, shifting the anchor. For critical anchors, use ARAnchor instead of direct plane attachment
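Why parenting matters can be shown with translation-only math: store the object's offset relative to the plane, and when the plane's estimate shifts, the object follows automatically. A simplified Python illustration (real AR planes also rotate, which Unity's transform parenting handles for free):

```python
def to_local(plane_origin, world_pos):
    """Object position expressed relative to the plane's origin
    (translation only, for simplicity)."""
    return tuple(w - p for p, w in zip(plane_origin, world_pos))

def to_world(plane_origin, local_pos):
    """Recompute the world position after the plane's origin has shifted."""
    return tuple(p + l for p, l in zip(plane_origin, local_pos))
```

If the plane estimate moves, re-evaluating to_world with the new origin keeps the placed object glued to the detected surface instead of floating at a stale world coordinate.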

Image Tracking and Object Tracking

ARTrackedImageManager — for markers. Important: tracking quality depends directly on reference image quality. Images with high detail frequency and contrasting edges (think: QR code, but polished) track more reliably than smooth logos.

ARCore Geospatial API — for outdoor AR with real-world coordinate binding. Uses VPS (Visual Positioning System) based on Google street data. Accuracy up to 10 cm in well-mapped areas.

VR Optimization: Frame Rate and Comfort

VR demands a stable high frame rate. Frame rate drops below the target feel far more uncomfortable to users than in regular games.

Device        Target Hz           Critical threshold
Meta Quest 2  72 / 90 Hz          < 72 Hz noticeable
Meta Quest 3  90 / 120 Hz         < 90 Hz noticeable
Valve Index   90 / 120 / 144 Hz   < 90 Hz noticeable
PSVR2         90 / 120 Hz         < 90 Hz noticeable

Single Pass Instanced Rendering

The main rendering optimization for VR. Without it, the scene renders twice (once per eye), doubling draw calls. Single Pass Instanced renders both eyes in one pass via instancing: geometry is processed once, the shader gets two view/projection matrices via GPU instancing.

Enabled in Unity via XR Plug-in Management > Rendering Mode: Single Pass Instanced. Important: custom shaders must support SPI—standard URP/HDRP shaders do, custom HLSL requires fixes (UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX and related macros).

Foveated Rendering

On Meta Quest, Fixed Foveated Rendering (FFR) is available—reduced resolution at the frame's periphery where visual acuity is lower. Configured via OVRManager or Meta XR SDK:

OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High;
OVRManager.useDynamicFixedFoveatedRendering = true;

Dynamic FFR automatically raises the level if frame rate drops—more convenient than fixed in scenes with variable load.

IPD and Comfort Settings

IPD (Inter-Pupillary Distance) — the distance between pupils; affects depth perception and comfort during extended wear. At the software level, most devices only allow reading the current IPD (for example, from the separation of the left and right eye anchors); physical adjustment is done on the headset itself. For applications requiring precise positioning (medical simulators, training) we account for IPD in scale calculations.

Haptics

Tactile feedback is underrated. Even simple vibration feedback on object grab or hit significantly increases the sense of presence.

XR Haptics via OpenXR:

InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
if (device.TryGetHapticCapabilities(out HapticCapabilities caps) && caps.supportsImpulse)
    device.SendHapticImpulse(0, amplitude: 0.5f, duration: 0.1f);

For complex patterns (tactile "texture" when touching a surface, ramping vibration when drawing a bowstring), use Meta Haptics Studio — lets you design haptic clips visually.
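The bowstring example boils down to an amplitude envelope driven by draw distance. A hedged Python sketch of that envelope (the numbers and function name are illustrative; a production clip would be authored in Meta Haptics Studio rather than computed ad hoc):

```python
def bowstring_amplitude(draw_fraction, base=0.1, max_amp=0.8):
    """Ramping haptic amplitude for drawing a bowstring: a light baseline
    buzz at first touch, growing linearly with draw distance. Values are
    illustrative defaults, clamped to the valid 0..1 draw range."""
    draw = min(max(draw_fraction, 0.0), 1.0)
    return base + (max_amp - base) * draw
```

Each frame while the string is held, the result would be sent as the amplitude of a short haptic impulse on the drawing hand's controller.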

What Affects Cost and Timeline

VR/AR projects cost more than regular games of similar scope for several reasons:

  • Slower iteration — every change must be tested in the headset; emulation doesn't capture real experience
  • Motion sickness — some conceptual decisions need rework after first in-hardware playtesting
  • Optimization consumes significant time, especially for mobile VR (Quest)
  • QA requires physical hardware; bugs can't be reproduced in screenshots

For Quest projects, we begin optimization in the first sprint, not at the end—retrofitting VR optimization to a finished project costs far more than proper architecture from the start.