Plane Tracking and Object Anchoring Setup in AR Games


AR Foundation in Unity provides access to ARPlaneManager and ARAnchorManager – and this is where most problems begin. A plane is detected, an object is placed, the player moves – and after three steps the virtual table is hanging half a meter above the surface. Or worse: the ARPlane has updated, but the anchor (ARAnchor) has stayed at the old coordinates, and the object begins "drifting" through the room.

This is not an AR Foundation bug. It is incorrect scene architecture.

Why anchors detach from planes

The main mistake is attaching an object directly to ARPlane.transform rather than to an ARAnchor bound to that plane. An ARPlane updates its geometry on every tracking refresh: its center shifts and its normal is recalculated. An object attached as a child of the plane simply follows this chaos.

The correct chain: call ARAnchorManager.AttachAnchor(plane, pose), get back an ARAnchor with its own stable transform, and attach the game object to that. The anchor system updates independently of the plane geometry – its position is corrected using IMU data and re-estimation of the spatial map, not just recalculated along with the plane mesh.
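A minimal sketch of this chain, assuming AR Foundation 4.x; `towerPrefab` and the component name are placeholders for illustration:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: place a prefab on a detected plane through an ARAnchor,
// never as a direct child of the ARPlane itself.
public class AnchoredPlacer : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] GameObject towerPrefab;

    public void PlaceOnPlane(ARPlane plane, Pose hitPose)
    {
        // AttachAnchor creates an anchor tied to the plane; its transform
        // stays stable even when the plane's mesh and center are re-estimated.
        ARAnchor anchor = anchorManager.AttachAnchor(plane, hitPose);
        if (anchor == null)
        {
            Debug.LogWarning("Failed to attach anchor – plane may be subsumed or tracking lost.");
            return;
        }

        // Parent the game object to the anchor, not to plane.transform.
        Instantiate(towerPrefab, anchor.transform.position,
                    anchor.transform.rotation, anchor.transform);
    }
}
```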

The second layer of the problem is TrackingState. An anchor can enter TrackingState.Limited or even None if the device loses its visual markers. You need to subscribe to ARAnchorManager.anchorsChanged and handle the state transitions: hide the object, show a "tracking lost" indicator, and smoothly restore the position when TrackingState returns to Tracking. Most projects don't – the user sees a frozen object at its last known position, unresponsive to movement.
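One way to wire this up, assuming AR Foundation 4.x (where the `anchorsChanged` event exists); `trackingLostIndicator` is a hypothetical UI object:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: react to anchor updates and hide anchored content while
// tracking is Limited/None, instead of leaving a frozen object behind.
public class AnchorStateWatcher : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] GameObject trackingLostIndicator;

    void OnEnable()  => anchorManager.anchorsChanged += OnAnchorsChanged;
    void OnDisable() => anchorManager.anchorsChanged -= OnAnchorsChanged;

    void OnAnchorsChanged(ARAnchorsChangedEventArgs args)
    {
        foreach (var anchor in args.updated)
        {
            bool tracked = anchor.trackingState == TrackingState.Tracking;

            // Toggle visibility of everything parented to the anchor.
            foreach (var r in anchor.GetComponentsInChildren<Renderer>())
                r.enabled = tracked;

            trackingLostIndicator.SetActive(!tracked);
        }
    }
}
```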

Plane detection: what actually works

ARKit on iOS and ARCore on Android use different plane detection algorithms, and the difference is visible in behavior. ARKit extends planes more aggressively – even with only partial visibility of a surface, it reconstructs hypothetical boundaries. ARCore is more conservative: a plane grows only where there is real feature-point coverage.

For game scenarios this means: if the game requires placing an object before the user has walked the full surface, Android needs either a lower minimum-plane-size threshold (AR Foundation exposes no built-in setting for this, so the filter lives in your placement code) or forced placement with a visual warning. Using the same detection thresholds on both platforms guarantees bad UX on one of them.
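A per-platform size filter might look like this; the 0.2/0.05 m² thresholds are illustrative values, not recommendations:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: platform-dependent minimum plane area before allowing placement.
// ARCore grows planes conservatively, so Android gets the lower threshold.
public static class PlaneFilter
{
    public static bool IsLargeEnough(ARPlane plane)
    {
#if UNITY_IOS
        const float minArea = 0.2f;   // ARKit extends planes aggressively
#else
        const float minArea = 0.05f;  // ARCore: unlock placement earlier
#endif
        Vector2 size = plane.size;    // extents of the plane's bounding rect
        return size.x * size.y >= minArea;
    }
}
```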

A separate topic is vertical planes (PlaneDetectionMode.Vertical). On most Android devices, vertical tracking is significantly less stable than horizontal. Games that assume objects will be placed on walls must be tested on real hardware, not the ARCore Emulator – the simulator does not reproduce IMU noise and real-world drift.
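One possible policy, sketched under the assumption that wall placement has only been validated on iOS so far:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: enable vertical plane detection per platform. Here Android stays
// horizontal-only until wall tracking is verified on real devices.
public class DetectionModeConfig : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void Start()
    {
#if UNITY_IOS
        planeManager.requestedDetectionMode =
            PlaneDetectionMode.Horizontal | PlaneDetectionMode.Vertical;
#else
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
#endif
    }
}
```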

How we do this in practice

A typical case: a mobile AR game where the player places towers on the floor of a room. The main problem at the prototype stage was that towers "floated" when the camera moved. The reason: objects were attached to the ARPlane via SetParent, the tracked plane updated its mesh, and the child objects shifted along with the new center.

The solution was a transition to an ARAnchor-based architecture. Each tower gets its own anchor via AttachAnchor at placement time. The anchor is added to a Dictionary&lt;ARAnchor, TowerObject&gt; – when anchorsChanged.removed fires, the tower is notified and transitions to an "unreliable" state with a visual indicator. On added (if the anchor is known from a saved session), it recovers.
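A sketch of that registry, assuming AR Foundation 4.x; `TowerObject` and its two methods are hypothetical stand-ins for the game's own tower component:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical tower component with reliability-state hooks.
public class TowerObject : MonoBehaviour
{
    public void OnAnchorLost()      { /* show "unreliable" indicator */ }
    public void OnAnchorRecovered() { /* restore normal visuals */ }
}

// Sketch: map anchors to towers and route lifecycle events to them.
public class TowerAnchorRegistry : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;
    readonly Dictionary<ARAnchor, TowerObject> towers =
        new Dictionary<ARAnchor, TowerObject>();

    public void Register(ARAnchor anchor, TowerObject tower) => towers[anchor] = tower;

    void OnEnable()  => anchorManager.anchorsChanged += OnAnchorsChanged;
    void OnDisable() => anchorManager.anchorsChanged -= OnAnchorsChanged;

    void OnAnchorsChanged(ARAnchorsChangedEventArgs args)
    {
        foreach (var anchor in args.removed)
            if (towers.TryGetValue(anchor, out var tower))
                tower.OnAnchorLost();       // anchor gone -> "unreliable" state

        foreach (var anchor in args.added)
            if (towers.TryGetValue(anchor, out var tower))
                tower.OnAnchorRecovered();  // e.g. restored from a saved session
    }
}
```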

To persist anchors between sessions, use ARWorldMap (ARKit) or the Cloud Anchors API (ARCore/ARKit via AR Foundation Samples). Cloud Anchors let an anchor survive an app restart and even work across different devices – the foundation for multiplayer AR.

Work stages

  1. Audit the current architecture – check how objects are attached to planes, how TrackingState is handled, and how the scene behaves on tracking loss
  2. Configure ARPlaneManager – detection modes, minimum-size filtering, plane visualization for debugging
  3. Implement the ARAnchor architecture – transition from parent-child attachment to anchors, handle the anchor lifecycle
  4. Test on target devices – ARKit (iPhone 12+) and ARCore (flagship + mid-range Android)
  5. Optional Cloud Anchors integration – if persistence between sessions is needed
Estimated timeline by task scale:
  • Audit + fix of existing architecture: 2–5 days
  • Setup from scratch for a new scene: 3–7 days
  • Cloud Anchors + multiplayer integration: 2–4 weeks

Typical setup errors

Not handling ExcessiveMotion (exposed as NotTrackingReason.ExcessiveMotion in AR Foundation). During fast camera movement, ARCore drops to Limited tracking and objects start "jumping." The correct response is to freeze the object's position during tracking loss, skip physics recalculation, and smoothly restore the pose via Lerp once tracking returns to Tracking.
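The smooth-restore step could be sketched like this; `RecoverTo` would be called from your anchor-update handling, and the 0.25 s duration is an illustrative value:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: glide an object to its anchor's corrected pose after tracking
// recovers, instead of snapping it into place.
public class SmoothRecovery : MonoBehaviour
{
    public void RecoverTo(Vector3 targetPosition, Quaternion targetRotation)
    {
        StopAllCoroutines();
        StartCoroutine(LerpToPose(targetPosition, targetRotation, 0.25f));
    }

    IEnumerator LerpToPose(Vector3 pos, Quaternion rot, float duration)
    {
        Vector3 startPos = transform.position;
        Quaternion startRot = transform.rotation;
        for (float t = 0f; t < 1f; t += Time.deltaTime / duration)
        {
            transform.position = Vector3.Lerp(startPos, pos, t);
            transform.rotation = Quaternion.Slerp(startRot, rot, t);
            yield return null; // advance one frame
        }
        transform.position = pos;
        transform.rotation = rot;
    }
}
```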

ARRaycastManager used without checking the hit type. ARRaycastHit.trackable can be an ARPlane or an ARPoint (feature point). For placing objects on surfaces, you need to filter for TrackableType.PlaneWithinPolygon only – otherwise the user "places" objects in mid-air on feature points.
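The filtered raycast might look like this:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: raycast restricted to points inside detected plane polygons,
// so taps on bare feature points never place objects in mid-air.
public class PlacementRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    public bool TryGetPlacementPose(Vector2 screenPoint, out Pose pose)
    {
        if (raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
        {
            pose = hits[0].pose; // hits are sorted by distance from the camera
            return true;
        }
        pose = default;
        return false;
    }
}
```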

An overly visible ARPlane mesh left in production. Showing all plane geometry is convenient for debugging, but the final game should either hide the plane mesh completely or replace it with a decorative variant. The standard ARFeatheredPlaneMeshVisualizer from AR Foundation Samples is a good starting point.