Mixed Reality Capture Promotional Video Rendering

Our video game development company runs independent projects, jointly creates games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms, letting us develop a product that matches the customer's vision and players' preferences.

An MRC video places a real person wearing a VR headset inside the game world. It is technically complex: you need to synchronize the position of the physical camera with the game scene in real time, key virtual content over a green-screen background with correct depth, and ensure virtual objects properly occlude the player (foreground layers). The result is promotional material that sells the feel of gameplay far better than a standard in-headset recording.

Two approaches: hardware MRC and software MRC

Hardware MRC (for Meta Quest 2/3/Pro) uses a separate physical camera synchronized with the headset via the Mixed Reality Capture app, or a custom setup with an Elgato 4K60 Pro and OBS. Meta provides an official pipeline: the external camera connects to the PC, a dedicated Quest application sends the headset position over Wi-Fi, and the engine (Unity via OVRMixedReality or Unreal via the Oculus MRC plugin) renders the foreground object layer.

The key complexity is calibration. The physical camera must be precisely calibrated for its position in space relative to the Guardian boundary on Quest; use the OVRExternalComposition mode in the Oculus PC SDK. Even a 5 mm camera-position error causes visible misalignment between the virtual player's hands and the real hands in frame, creating a "floating gloves" effect that destroys immersion.
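How much a small calibration offset matters can be estimated with a pinhole-camera back-of-envelope calculation. The focal length and distances below are illustrative assumptions, not measured values from any specific rig:

```python
# Back-of-envelope estimate (pinhole camera model) of how a small
# calibration offset shows up as on-screen misalignment.
# Assumed values: 4K capture with roughly an 1800 px focal length --
# substitute your actual camera intrinsics.

def misalignment_px(offset_m: float, depth_m: float, focal_px: float) -> float:
    """Lateral calibration error projected into image pixels.

    offset_m: camera position error in metres (e.g. 0.005 for 5 mm)
    depth_m:  distance from camera to the player in metres
    focal_px: camera focal length expressed in pixels
    """
    return focal_px * offset_m / depth_m

# A 5 mm error with the player 1.5 m from the camera:
print(round(misalignment_px(0.005, 1.5, 1800), 1))  # -> 6.0
```

A handful of pixels of hand misalignment at 4K is exactly the "floating gloves" artifact described above, and it grows as the player moves closer to the camera.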

Software MRC involves post-production compositing: record the game view with an alpha channel (if the engine supports it) plus green-screen footage of the player, then composite in After Effects or DaVinci Resolve Fusion. This gives more flexible control over the output, but there is no real-time synchronization, so player poses and virtual content must be aligned manually.
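As an illustration of the keying step, here is a deliberately naive green-dominance key and composite in NumPy. A real keyer (Keylight, Fusion's Delta Keyer) adds edge softening, spill suppression, and garbage mattes on top of this basic idea; the threshold value is an assumption for the toy example:

```python
import numpy as np

def key_and_composite(player_rgb, game_rgb, spill=1.1):
    """Naive green-screen key: treat pixels where green strongly dominates
    red and blue as background, then composite the keyed player over the
    game frame. A hard 0/1 matte -- production keyers produce soft mattes."""
    r, g, b = player_rgb[..., 0], player_rgb[..., 1], player_rgb[..., 2]
    # Matte = 1 where the pixel is the player, 0 where it is green screen.
    matte = (g <= spill * np.maximum(r, b)).astype(np.float32)[..., None]
    return matte * player_rgb + (1.0 - matte) * game_rgb

# 2x2 toy frames: top row is green screen, bottom row is the "player".
player = np.array([[[0.1, 0.9, 0.1], [0.1, 0.9, 0.1]],
                   [[0.8, 0.5, 0.4], [0.8, 0.5, 0.4]]], dtype=np.float32)
game = np.zeros_like(player)
out = key_and_composite(player, game)
# Top row becomes the game frame (black); bottom row keeps the player.
```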

Foreground layer: the most challenging part

In hardware MRC, the main technical issue is the foreground layer. By default, the player renders on top of everything. But if the game has objects that should occlude the player (a wall, a table, a virtual character), you need to output two layers: background virtual content plus a foreground pass. This is implemented via the stencil buffer: foreground objects write to the stencil, and a separate pass renders the foreground mask.

In Unity with URP, this is configured via an additional camera output texture with a custom Renderer Feature. The render pass draws only geometry tagged ForegroundLayer and writes the result to a RenderTexture, which is then passed to the MRC compositor. An incorrect render-pass order causes the player to "fall through" virtual objects, a classic artifact that is immediately visible in a promo video.
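The layer ordering itself is engine-agnostic and can be sketched outside Unity (the actual Renderer Feature would be C#; `fg_mask` below stands in for the stencil mask the foreground pass writes, and the pixel values are toy numbers):

```python
def composite_mrc_frame(background, player, player_matte, foreground, fg_mask):
    """Sketch of MRC layer ordering: game background at the bottom, the
    keyed player over it, and the foreground pass last so that objects
    tagged to occlude the player actually cover them.

    Swapping the last two steps reproduces the 'player renders on top of
    everything' artifact described above."""
    frame = player_matte * player + (1.0 - player_matte) * background
    frame = fg_mask * foreground + (1.0 - fg_mask) * frame
    return frame

# One-pixel grayscale toy values to show occlusion:
bg, pl, fg = 0.2, 0.8, 0.5
# Player present (matte = 1) AND a foreground object covers that pixel:
print(composite_mrc_frame(bg, pl, 1.0, fg, 1.0))  # -> 0.5 (player occluded)
```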

Pipeline for final render

After capturing the raw footage: chroma key in After Effects (Keylight 1.2 plus Screen Matte), color matching between the virtual content and the real scene's lighting, and motion blur on the virtual layers to match the real camera (a camera-blur effect with parameters taken from the EXIF metadata).
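A rough heuristic for matching that blur: the streak length on a virtual layer should approximate the on-screen distance an object travels during the real camera's exposure. The speed and shutter values below are assumptions for illustration:

```python
def blur_length_px(screen_speed_px_s: float, exposure_s: float) -> float:
    """Approximate motion-blur streak length for a virtual layer, matched
    to the real camera's exposure time (read from EXIF, e.g.
    ExposureTime = 1/60). screen_speed_px_s is the object's on-screen
    speed in pixels per second."""
    return screen_speed_px_s * exposure_s

# An object crossing the frame at 600 px/s, shot with a 1/60 s shutter:
print(blur_length_px(600, 1 / 60))  # -> 10.0
```

If the virtual layer's blur length is visibly shorter than this, the CG content reads as "pasted on" next to the naturally blurred live-action plate.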

Audio: if Quest controllers with haptics were used, it is worth converting haptic events into sound accents on the track, which enhances the perception of interactivity.
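A minimal sketch of that conversion, assuming a hypothetical `(timestamp, amplitude)` haptic event log exported from the capture session (this is not a Quest SDK API, just an illustration of debouncing a burst of rumble events into discrete audio markers):

```python
def haptic_events_to_markers(events, min_gap_s=0.1, min_amp=0.2):
    """Turn haptic events (seconds, amplitude 0..1) into audio marker
    times, merging bursts closer than min_gap_s so one rumble does not
    spawn a machine-gun of sound accents."""
    markers = []
    for t, amp in sorted(events):
        if amp > min_amp and (not markers or t - markers[-1] >= min_gap_s):
            markers.append(t)
    return markers

# A double-tap rumble at t=0 collapses to one marker; the weak tail event is dropped.
print(haptic_events_to_markers([(0.00, 0.9), (0.03, 0.8),
                                (0.50, 0.5), (0.52, 0.1)]))  # -> [0.0, 0.5]
```

The resulting marker list can be imported into the DAW or editor timeline to place the sound accents.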

Export: H.264 or H.265, 4K 60 fps for YouTube/Meta. For a Steam trailer, H.264 per Valve's requirements (a maximum of 30 fps for the trailer thumbnail, 60 fps for gameplay video).
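A sketch of building the export command with ffmpeg; the flags used are standard ffmpeg options, but the quality settings are assumptions and each store's current trailer specs should be checked before delivery:

```python
import shlex

def export_cmd(src: str, dst: str, codec: str = "libx264",
               fps: int = 60, crf: int = 18) -> str:
    """Assemble an ffmpeg command line for a promo-video export.
    codec: libx264 (H.264) or libx265 (H.265)."""
    args = [
        "ffmpeg", "-i", src,
        "-c:v", codec,
        "-r", str(fps),         # output frame rate
        "-crf", str(crf),       # quality: lower = better / larger file
        "-pix_fmt", "yuv420p",  # broad player compatibility
        "-c:a", "aac", "-b:a", "320k",
        dst,
    ]
    return shlex.join(args)

print(export_cmd("mrc_master.mov", "trailer_4k60.mp4"))
```

Running the same function with `codec="libx265"` covers the H.265 delivery case.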

Capture format | Estimated timeline
Software MRC, 1–2 minutes of finished video | 3–5 business days
Hardware MRC setup + capture + editing | 5–10 business days
Complete trailer with MRC + gameplay footage + music | 2–4 weeks

Cost is calculated after discussion of capture format, available equipment, and final video requirements.