Developing PBR materials for realistic graphics in AR
In AR, an object must exist in the real world. Not simply overlay the camera image, but literally exist: reflect the same lighting as nearby real objects, cast plausible shadows, and react to ambient light changes as the user moves.
This makes the PBR pipeline in AR a fundamentally different task compared to VR or regular games.
Why PBR in AR doesn't work "out of the box"
In VR or a regular game, lighting is controlled by the team. The artist knows the light source, its color, and its intensity, and materials are tuned to it.
In AR, the lighting is the real world. ARKit (iOS) and ARCore (Android) provide Environment Probes and Light Estimation: data about color temperature, intensity, and the approximate direction of the main light source. But these are estimates, not exact measurements. ARCore's Light Estimation is accurate enough that an object doesn't look foreign, but not accurate enough for photorealistic matching.
The task of PBR materials in AR is to create a physically correct base that works well across a wide range of lighting conditions: from bright daylight through a dimmed office to an evening interior. A material calibrated for one specific condition looks wrong as soon as the environment changes.
Metallic vs. Specular workflow. In AR Foundation with URP, the Metallic workflow is standard. But metallic surfaces are critically dependent on Environment Probe quality: high metalness combined with a poor or inaccurate Reflection Probe produces a flat gray highlight that destroys realism. For AR it is better to cap metallic at 0.7–0.85 for "metallic" surfaces instead of 1.0 and compensate with roughness.
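The capping above can be applied programmatically at load time. A minimal sketch, assuming the standard URP/Lit shader property names `_Metallic` and `_Smoothness` (runs inside Unity; the `ArMaterialTuning` class name and the exact limits are illustrative):

```csharp
using UnityEngine;

// Sketch: clamp metalness and smoothness on URP/Lit materials so that
// imperfect AR reflection probes don't produce flat gray highlights.
public static class ArMaterialTuning
{
    const float MaxMetallic = 0.85f;   // upper bound suggested above
    const float MaxSmoothness = 0.8f;  // keeps roughness >= 0.2

    public static void ClampForAR(Material material)
    {
        if (material.HasProperty("_Metallic"))
            material.SetFloat("_Metallic",
                Mathf.Min(material.GetFloat("_Metallic"), MaxMetallic));

        if (material.HasProperty("_Smoothness"))
            material.SetFloat("_Smoothness",
                Mathf.Min(material.GetFloat("_Smoothness"), MaxSmoothness));
    }
}
```

Calling `ArMaterialTuning.ClampForAR(renderer.material)` once after instantiating an asset keeps authoring freedom in the source files while enforcing the AR-safe range at runtime.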
Setting up materials for AR Foundation + URP
Reflection Probes in AR work differently: a static cubemap does not match the real environment. The correct approach is to use AREnvironmentProbeManager from AR Foundation, which generates a cubemap from the camera feed and updates the Reflection Probes in the scene automatically. This requires proper setup in Unity: AREnvironmentProbeManager is added to the AR Session Origin, Automatic Placement is enabled, and Environment Texture Filter Mode is set to Trilinear for smooth interpolation.
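The same setup can be done from script instead of the Inspector. A sketch assuming current AR Foundation property names (`automaticPlacementRequested`, `environmentTextureFilterMode`); verify them against your package version:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: configure automatic environment probes on the AR Session Origin.
public class EnvironmentProbeSetup : MonoBehaviour
{
    void Start()
    {
        var probeManager = GetComponent<AREnvironmentProbeManager>()
                           ?? gameObject.AddComponent<AREnvironmentProbeManager>();

        // Let AR Foundation place probes automatically from camera data.
        probeManager.automaticPlacementRequested = true;

        // Trilinear filtering smooths transitions between cubemap updates.
        probeManager.environmentTextureFilterMode = FilterMode.Trilinear;
    }
}
```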
Occlusion. For an AR object to be correctly hidden behind real objects (furniture, the user's hands), Depth Occlusion via AROcclusionManager is needed. This affects materials: objects with transparent or cutout materials require separate shader handling, otherwise the occlusion mask is applied incorrectly.
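Requesting depth occlusion is a small component setup on the AR camera. A sketch, assuming AROcclusionManager sits next to ARCameraManager on the camera object; the available depth modes depend on the device and AR Foundation version:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: request depth-based occlusion so virtual objects
// hide behind real-world geometry.
public class OcclusionSetup : MonoBehaviour
{
    void Start()
    {
        var occlusion = GetComponent<AROcclusionManager>()
                        ?? gameObject.AddComponent<AROcclusionManager>();

        // Environment depth occludes objects behind real geometry.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;

        // Human segmentation (hands, people) where the platform supports it.
        occlusion.requestedHumanDepthMode = HumanSegmentationDepthMode.Best;
    }
}
```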
PBR parameter configuration for AR:
- Albedo: neutral tones without an additional dark or light bias. The AR engine corrects exposure itself, so the source texture should stay neutral.
- Roughness: 0.3–0.6 for most materials. Very smooth surfaces (roughness < 0.2) expose Reflection Probe inaccuracy.
- Normal Map: moderate intensity. In AR, strong normal detail under incorrect lighting easily turns into an artifact.
- Emission: used for self-illuminated elements that are independent of external lighting (screens, LED indicators). In AR, emission often helps visually "ground" an object.
Practice: a material for mobile AR on Android
On Android, ARCore's Light Estimation API provides second-order spherical harmonics for ambient lighting. In Unity AR Foundation this arrives through ARLightEstimationData.mainLightDirection and ambientSphericalHarmonics. A custom URP Shader Graph can consume this data directly and apply it to the material, giving more accurate ambient lighting than the standard URP Lit.
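The data flow above can be sketched as a component that subscribes to camera frames and forwards the estimate to the scene. Assumes an ARCameraManager on the same GameObject and a platform that supplies main light direction and L2 spherical harmonics; the `LightEstimationFeed` class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: feed per-frame ARCore light estimation into the scene.
public class LightEstimationFeed : MonoBehaviour
{
    [SerializeField] Light mainLight;   // directional light driven by the estimate
    ARCameraManager cameraManager;

    void OnEnable()
    {
        cameraManager = GetComponent<ARCameraManager>();
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;

        if (estimate.mainLightDirection.HasValue)
            mainLight.transform.rotation =
                Quaternion.LookRotation(estimate.mainLightDirection.Value);

        if (estimate.mainLightColor.HasValue)
            mainLight.color = estimate.mainLightColor.Value;

        // L2 spherical harmonics drive ambient lighting for all PBR materials,
        // even without a custom Shader Graph.
        if (estimate.ambientSphericalHarmonics.HasValue)
            RenderSettings.ambientProbe = estimate.ambientSphericalHarmonics.Value;
    }
}
```

Writing the harmonics to `RenderSettings.ambientProbe` affects every standard material; a custom Shader Graph would instead read the same values via global shader properties for finer control.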
For iOS, ARKit provides an Environment Texture (HDR cubemap) updated every few frames: the quality is higher, though some latency remains. Materials in an iOS project can therefore be tuned more aggressively toward specular.
Final testing of PBR materials for AR must be done on a physical device under different lighting conditions: a bright sunny day, artificial light, partial shade. An emulator never replaces this.
Timeline for PBR material development: from 2–4 hours for a simple prop to 2–3 days for a complex asset with multiple material zones and a custom shader. Cost is calculated individually.