Multiplayer Module Development for Shared VR Experience

tags: [vr-ar]

Developing multiplayer modules for shared VR experience

Multiplayer in VR is not just synchronizing positions and states. It is synchronizing two hands, a head, gaze, grabs, and physical objects, all simultaneously, with latency under 50–80 ms; beyond that, users see "jerky" avatars and lose the sense of shared presence. And all of this must run on Quest's mobile hardware with limited bandwidth.

Each of the ready-made networking solutions (Photon Fusion, Photon PUN2, Mirror, Netcode for GameObjects (NGO)) comes with its own compromises for VR. The network stack chosen at the start determines architectural decisions for the whole project.

Photon Fusion and VR: why Server Mode is better than Shared Mode

Photon Fusion supports two modes: Shared Mode (peer-to-peer with one host) and Server Mode (dedicated Photon server). For VR, Server Mode is preferable, even for small projects.

In Shared Mode one player acts as the host, and all traffic goes through them. In VR this means: when the host moves their hands (sampled 72–90 times per second), their own data is processed locally, while other players receive it only after the host's round-trip time. With an unstable host connection the entire session suffers. In Server Mode there is no single point of failure; Photon Cloud handles relay and authoritative state processing.

Specific setup: NetworkRunner with GameMode.Server, FixedUpdateNetwork instead of Update for deterministic physics. For hand transforms use NetworkTransform with InterpolationDataSource.Predicted — client-side prediction reduces perceived latency.
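A minimal sketch of that setup, assuming the Photon Fusion 2 API; the class names `SessionBootstrap` and `HandSync`, the session name, and the `controllerAnchor` field are illustrative, not part of Fusion:

```csharp
using Fusion;
using UnityEngine;

// Sketch: starting a session in Server Mode. Clients would connect
// with GameMode.Client and the same session name.
public class SessionBootstrap : MonoBehaviour
{
    private async void Start()
    {
        var runner = gameObject.AddComponent<NetworkRunner>();
        await runner.StartGame(new StartGameArgs
        {
            GameMode = GameMode.Server,
            SessionName = "vr-room" // assumed name for illustration
        });
    }
}

// Sketch: replicating a hand transform from the simulation tick.
public class HandSync : NetworkBehaviour
{
    [SerializeField] private Transform controllerAnchor; // local XR controller

    [Networked] public Vector3 HandPosition { get; set; }
    [Networked] public Quaternion HandRotation { get; set; }

    public override void FixedUpdateNetwork()
    {
        // Only the owning player writes hand state; everyone else
        // interpolates the replicated values.
        if (HasStateAuthority)
        {
            HandPosition = controllerAnchor.position;
            HandRotation = controllerAnchor.rotation;
        }
    }
}
```

Writing state in `FixedUpdateNetwork` rather than `Update` keeps the simulation deterministic across ticks, which is what the prediction and interpolation layers rely on.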

Avatar synchronization: IK and hiding own body

The core technical problem of VR avatars in multiplayer is asymmetry: other players see a full-body animated avatar, while the local player sees only their hands from first person. Only the head and two hand positions are synchronized, and the full body pose must be reconstructed for other players from those three points.

The solution is full-body IK driven by a limited set of control points. In Unity this means the Animation Rigging package, with TwoBoneIKConstraint for the arms and MultiParentConstraint for the torso. The chain: head (HMD position) → torso via a heuristic downward offset (~0.3 m) → arms via IK to the controller positions. It is not physically accurate, but it looks convincing under normal movements.
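The chain above can be sketched as a pose solver that feeds the rig constraints. This assumes the Unity Animation Rigging setup described in the text; the class name, field names, and the exact 0.3 m constant are illustrative:

```csharp
using UnityEngine;

// Sketch: derive a torso pose from the replicated HMD transform and
// feed controller positions into the Animation Rigging IK targets.
public class AvatarPoseSolver : MonoBehaviour
{
    [SerializeField] private Transform head;          // replicated HMD transform
    [SerializeField] private Transform leftHand;      // replicated controllers
    [SerializeField] private Transform rightHand;
    [SerializeField] private Transform torsoTarget;   // MultiParentConstraint source
    [SerializeField] private Transform leftIKTarget;  // TwoBoneIKConstraint targets
    [SerializeField] private Transform rightIKTarget;

    private const float HeadToTorsoOffset = 0.3f; // heuristic drop below the HMD

    private void LateUpdate()
    {
        // Torso hangs below the head; its yaw follows the head's
        // horizontal facing so the body turns with the player.
        torsoTarget.position = head.position + Vector3.down * HeadToTorsoOffset;
        Vector3 flatForward = Vector3.ProjectOnPlane(head.forward, Vector3.up);
        if (flatForward.sqrMagnitude > 0.001f)
            torsoTarget.rotation = Quaternion.LookRotation(flatForward, Vector3.up);

        // The TwoBoneIKConstraints pull the arm bones toward these targets.
        leftIKTarget.SetPositionAndRotation(leftHand.position, leftHand.rotation);
        rightIKTarget.SetPositionAndRotation(rightHand.position, rightHand.rotation);
    }
}
```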

Network traffic: three transforms (head + two hands) × 7 floats (position + rotation quaternion) × 4 bytes × 90 Hz = 7,560 B/s, roughly 7.5 KB/s per player without compression. With NetworkTransform and quantization in Photon Fusion this drops to 1.5–2 KB/s, which is manageable for 4–8 players.

Physical object synchronization

Picked-up items, thrown objects, doors: all are physical rigid bodies. The fundamental problem is that the physics simulation runs independently for each player, and the results diverge. One player throws a cube at a wall: for them it bounces right, for the other it bounces left.

Approach 1: authoritative physics on a single owner. Each Rigidbody is simulated only by its StateAuthority (in Photon Fusion terms, whoever grabbed the object); other players interpolate its position. On grab, the object is "transferred" to the new owner via RequestStateAuthority. The downside is visible teleportation during the transfer if positions have diverged.
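A sketch of the authority hand-off, assuming Photon Fusion's `Object.RequestStateAuthority()`; `OnGrabbed` is an assumed hook from an interaction toolkit, not a Fusion callback:

```csharp
using Fusion;
using UnityEngine;

// Sketch: move StateAuthority to the grabbing player so their local
// physics simulation becomes the source of truth for this object.
public class GrabbableNetworkObject : NetworkBehaviour
{
    private Rigidbody _rb;

    public override void Spawned()
    {
        _rb = GetComponent<Rigidbody>();
    }

    // Assumed to be called by the local interaction system on grab.
    public void OnGrabbed()
    {
        if (!HasStateAuthority)
            Object.RequestStateAuthority();
    }

    public override void FixedUpdateNetwork()
    {
        // Non-authoritative copies must not fight the replicated state:
        // they follow the network, so local physics is switched off.
        _rb.isKinematic = !HasStateAuthority;
    }
}
```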

Approach 2: client-side physics with reconciliation. Each client simulates physics locally, and the server periodically broadcasts the authoritative state. When divergence exceeds a threshold, the object is softly shifted toward the authoritative position via Lerp. This looks better, but it is harder to implement without artifacts.
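The soft-shift logic might look like the following sketch. The thresholds, the correction speed, and the source of `serverPosition` (in practice a replicated `[Networked]` property) are all assumptions:

```csharp
using UnityEngine;

// Sketch: blend a rigid body toward the authoritative position instead
// of snapping, spreading the correction over several physics steps.
public class SoftReconciliation : MonoBehaviour
{
    [SerializeField] private float snapThreshold = 1.0f;        // metres: hard snap beyond this
    [SerializeField] private float divergenceThreshold = 0.05f; // metres: ignore error below this
    [SerializeField] private float correctionSpeed = 8f;        // higher = faster convergence

    // Call from the physics step with the latest authoritative position.
    public void Reconcile(Rigidbody rb, Vector3 serverPosition)
    {
        float error = Vector3.Distance(rb.position, serverPosition);

        if (error > snapThreshold)
        {
            // Too far gone: accept the visible teleport.
            rb.position = serverPosition;
        }
        else if (error > divergenceThreshold)
        {
            // Move a fraction of the remaining error each step so the
            // correction is smooth rather than a visible pop.
            rb.position = Vector3.Lerp(
                rb.position, serverPosition,
                correctionSpeed * Time.fixedDeltaTime);
        }
    }
}
```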

In practice, VR games with physical interactions typically use approach 1, augmented with a "ghost" object: a thin, semi-transparent copy showing the authoritative position while the main mesh interpolates.
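One way to sketch the ghost-object idea; the component layout and the lerp constants are illustrative, and in a real project the authoritative pose would come from the replicated network state:

```csharp
using UnityEngine;

// Sketch: the ghost snaps to the authoritative pose each frame, while
// the visible mesh eases toward it, hiding small divergence.
public class AuthoritativeGhost : MonoBehaviour
{
    [SerializeField] private Transform visibleMesh; // smoothly interpolated copy
    [SerializeField] private Transform ghostMesh;   // thin semi-transparent copy
    [SerializeField] private float lerpSpeed = 12f;

    // Assumed to be called each frame with the latest replicated pose.
    public void UpdatePose(Vector3 authoritativePos, Quaternion authoritativeRot)
    {
        // Ghost shows the truth immediately...
        ghostMesh.SetPositionAndRotation(authoritativePos, authoritativeRot);

        // ...while the visible mesh converges on it over a few frames.
        visibleMesh.position = Vector3.Lerp(
            visibleMesh.position, authoritativePos, lerpSpeed * Time.deltaTime);
        visibleMesh.rotation = Quaternion.Slerp(
            visibleMesh.rotation, authoritativeRot, lerpSpeed * Time.deltaTime);
    }
}
```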

Development process

The multiplayer module is a separate task best planned at the architecture stage, not bolted onto a finished single-player game. Retrofitting multiplayer onto a finished single-player VR project usually costs 50–70% more effort than multiplayer-first development.

Stages: choose network stack and architecture → basic transform synchronization → avatars with IK → physical object synchronization → game logic (scores, states, rounds) → load testing → traffic optimization.

| Task scope | Estimated timeline |
| --- | --- |
| Basic multiplayer (2–4 players, transforms only) | 2–4 weeks |
| Full avatars with IK + object physics | 6–10 weeks |
| Large-scale multiplayer (8+ players, custom logic) | 3–6 months |

Cost is calculated after a requirements analysis: number of players, interaction types, and platform choice (Quest standalone, PCVR, cross-platform).