AI Virtual Furniture Try-On in Mobile Apps
Placing a virtual sofa in a user's room comes down to three linked tasks: detecting the horizontal floor plane, placing the 3D model with correct scale and lighting, and making the furniture look like part of the room rather than a pasted-on image.
Plane Detection and Positioning
On iOS this is ARKit with horizontal plane detection (planeDetection = [.horizontal]). ARKit 4+ detects a plane in 1–3 seconds on a well-textured floor. Problems arise with uniform surfaces such as white carpet or plain dark parquet: detection takes longer or fails entirely.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
config.environmentTexturing = .automatic // for realistic material reflections
sceneView.session.run(config, options: [.resetTracking, .removeExistingAnchors])

// ARSCNViewDelegate: called when ARKit adds a node for a newly detected anchor
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          planeAnchor.alignment == .horizontal else { return }
    DispatchQueue.main.async {
        self.placeFurnitureNode(on: planeAnchor, parentNode: node)
    }
}
On Android the equivalent is ARCore's Plane.Type.HORIZONTAL_UPWARD_FACING. The logic is similar, but plane detection on Android is less stable due to hardware fragmentation: Qualcomm Snapdragon devices generally behave well, while MediaTek mid-range devices often show plane jitter.
3D Furniture Models: Formats and Optimization
Catalog models usually arrive as OBJ or FBX with tens of thousands of polygons and 4K textures. For mobile AR this is too heavy: rendering performance tanks on 2020–2021 devices.
Mobile AR optimization:
| Parameter | Source Model | AR-Optimized Target |
|---|---|---|
| Polygons | 50,000–200,000 | 5,000–15,000 |
| Textures | 4K (4096×4096) | 1K–2K (1024×1024–2048×2048) |
| Materials | PBR multi-layer | PBR single-layer |
| Format | OBJ/FBX | USDZ (iOS), glTF (Android) |
On iOS the native format is USDZ, rendered via RealityKit or SceneKit. On Android it is glTF 2.0, rendered via Filament (the rendering engine used by Sceneform) or a custom OpenGL/Vulkan renderer.
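As a sketch of the load path on iOS (the asset name and the container-node helper below are illustrative, not a fixed API):

import SceneKit

func loadFurnitureModel(named name: String) -> SCNNode? {
    // Load a USDZ scene bundled with the app (or previously downloaded from the catalog).
    guard let url = Bundle.main.url(forResource: name, withExtension: "usdz"),
          let scene = try? SCNScene(url: url, options: nil) else { return nil }

    // Wrap the model's nodes in a single container so the whole piece of furniture
    // can be positioned, rotated, and scaled as one unit.
    let containerNode = SCNNode()
    scene.rootNode.childNodes.forEach { containerNode.addChildNode($0) }
    return containerNode
}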
Lighting: Why Furniture Looks "Plastic"
The main reason for an unrealistic appearance is a lighting mismatch between the AR object and the real environment. ARKit addresses this via environmentTexturing = .automatic: the system builds an environment map from the camera feed and uses it for image-based lighting (IBL) on PBR materials.
With RealityKit this happens automatically. With SceneKit, pass the environment map explicitly:
// Feed the current camera frame to SceneKit as a rough environment map;
// PBR materials pick it up for reflections. Intensity 1.0 keeps a neutral exposure.
sceneView.scene.lightingEnvironment.contents = sceneView.session.currentFrame?.capturedImage
sceneView.scene.lightingEnvironment.intensity = 1.0
A step further is directional light estimation. ARKit's ARDirectionalLightEstimate provides the main light direction, but only in face-tracking sessions; in a world-tracking session you get ARLightEstimate (ambient intensity and color temperature) and typically add a directional light plus a shadow-catching floor plane yourself. When the furniture's shadow falls in the same direction as the shadows of real objects, the try-on becomes convincing.
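A minimal SceneKit sketch of that manual setup; the light angle and shadow opacity are illustrative values to tune per scene:

import SceneKit
import UIKit

func addShadowCastingLight(to scene: SCNScene) {
    let light = SCNLight()
    light.type = .directional
    light.castsShadow = true
    light.shadowMode = .deferred
    light.shadowColor = UIColor.black.withAlphaComponent(0.4)
    light.shadowRadius = 8 // soft edges look closer to real indoor shadows

    let lightNode = SCNNode()
    lightNode.light = light
    // Tilt the light downward; ideally match the room's dominant light direction.
    lightNode.eulerAngles = SCNVector3(-Float.pi / 3, 0, 0)
    scene.rootNode.addChildNode(lightNode)
}

// An invisible floor plane that only receives shadows, placed under the furniture node.
func makeShadowPlane(width: CGFloat, length: CGFloat) -> SCNNode {
    let plane = SCNPlane(width: width, height: length)
    let material = SCNMaterial()
    material.lightingModel = .shadowOnly // renders nothing but the received shadow (iOS 13+)
    plane.materials = [material]

    let node = SCNNode(geometry: plane)
    node.eulerAngles.x = -Float.pi / 2 // lay the plane flat on the floor
    return node
}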
Gestures: Move, Rotate, Scale
The user needs to move the furniture across the floor, rotate it, and resize it. The standard gesture set (rotation and pinch handlers are sketched after the pan example below):
- Pan gesture: movement; cast a ray from the touch point onto the ARPlane and move the node to the intersection point
- Rotation gesture (two fingers): rotate around the vertical axis
- Pinch gesture: scale, clamped to a limited range (0.5x–2.0x of the real size)
@objc func handlePan(_ gesture: UIPanGestureRecognizer) {
    let location = gesture.location(in: sceneView)

    // Ray cast against existing plane geometry only, not the entire world.
    guard let query = sceneView.raycastQuery(from: location,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else { return }

    // Move the furniture node to the intersection point on the floor plane.
    let position = result.worldTransform.columns.3
    furnitureNode.simdWorldPosition = SIMD3<Float>(position.x, position.y, position.z)
}
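The rotation and pinch handlers follow the same pattern. A minimal sketch, assuming furnitureNode and a stored currentScale property live on the same view controller as handlePan:

@objc func handleRotation(_ gesture: UIRotationGestureRecognizer) {
    // Screen rotation is clockwise-positive; rotation around the Y axis is counterclockwise-positive.
    furnitureNode.eulerAngles.y -= Float(gesture.rotation)
    gesture.rotation = 0 // reset so each callback delivers an incremental delta
}

@objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
    // gesture.scale is relative to the start of the gesture; clamp to 0.5x–2.0x of real size.
    let newScale = min(max(currentScale * Float(gesture.scale), 0.5), 2.0)
    furnitureNode.scale = SCNVector3(newScale, newScale, newScale)
    if gesture.state == .ended { currentScale = newScale }
}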
AI Component: Interior Style Matching
"AI" in service name — recommendation system: app analyzes room color palette through AVCaptureSession and suggests furniture that matches stylistically. Technically — dominant color clustering via k-means or ready API (Google Vision Dominant Colors), then catalog matching by color attributes.
A more complex option is a Core ML model that classifies the interior style (Scandinavian, loft, classic) and filters the catalog down to compatible collections.
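A sketch of running such a classifier through Vision; InteriorStyleClassifier is a hypothetical Xcode-generated model class and the label names are assumptions:

import Vision
import CoreML

func classifyInteriorStyle(pixelBuffer: CVPixelBuffer,
                           completion: @escaping (String?) -> Void) {
    guard let coreMLModel = try? InteriorStyleClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top classification, e.g. "scandinavian", "loft", "classic".
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}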
Workflow
- Catalog audit: model formats, SKU count, and how much geometry and texture optimization is needed.
- AR layer implementation: plane detection, model placement, control gestures.
- Lighting and shadow tuning for a realistic appearance.
- Catalog integration: load models over the network or preload a set of popular items.
- Optional: AI matching based on the room's color palette.
Timeline Estimates
A basic AR try-on for iOS with ready-made USDZ models takes about one week. A cross-platform implementation with catalog optimization, control gestures, and color matching takes 2–4 weeks. Cost depends on catalog size and on whether the 3D models need conversion.