Custom Post-Processing Shader Development

Our video game development company runs independent projects, jointly creates games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms, allowing us to develop a product that matches the customer's vision and players' preferences.
Complexity: Complex. Timeline: from 2 business days to 2 weeks.
Latest works
  • Game development for Mortal Motors
  • A turn-based strategy game set in a fantasy setting, With Fire and Sword
  • Game development for the company Second term
  • 3D animation teaser for the game Phoenix 2

Custom Post-Processing Shader Development for Graphics

Built-in Volume effects in URP and HDRP (Bloom, Color Grading, Vignette) cover basic needs. But as soon as a non-standard visual effect is required, such as heat haze over asphalt, chromatic aberration with a custom curve, pixel-art posterization, or outline detection from the depth buffer, you enter the territory of custom Renderer Features and fullscreen blit shaders.

In Unity 2022+, the architecture of custom post-processing has changed. The old approach through OnRenderImage and CommandBuffer.Blit is obsolete. The new standard is ScriptableRendererFeature + ScriptableRenderPass with Blitter.BlitCameraTexture.

How ScriptableRendererFeature Works in URP

ScriptableRendererFeature is the entry point for adding a custom render pass to the URP pipeline. It registers a ScriptableRenderPass that executes at a specified RenderPassEvent (BeforeRenderingPostProcessing, AfterRenderingPostProcessing, etc.).

Inside ScriptableRenderPass.Execute(), the correct pattern in Unity 6 / URP 17:

Blitter.BlitCameraTexture(cmd, source, destination, material, passIndex);

This replaces the old cmd.Blit(src, dst, mat). The difference is not just syntactic: Blitter.BlitCameraTexture correctly handles VR single-pass instanced rendering and avoids UV flip artefacts in WebGL.
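A minimal sketch of the Feature + Pass + Blitter pattern described above (URP 14+ API; the class names, the tint material, and the ping-pong through a temporary target are illustrative, not a production implementation):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative sketch: a Renderer Feature that blits the camera color
// through a custom fullscreen material. "TintFeature" is a placeholder name.
public class TintFeature : ScriptableRendererFeature
{
    public Material material; // fullscreen blit material (Fullscreen Shader Graph)

    class TintPass : ScriptableRenderPass
    {
        readonly Material m_Material;
        RTHandle m_TempColor;

        public TintPass(Material material)
        {
            m_Material = material;
            renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
        }

        public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
        {
            var desc = renderingData.cameraData.cameraTargetDescriptor;
            desc.depthBufferBits = 0; // color-only intermediate target
            RenderingUtils.ReAllocateIfNeeded(ref m_TempColor, desc, name: "_TempColor");
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            var cmd = CommandBufferPool.Get("Tint");
            var source = renderingData.cameraData.renderer.cameraColorTargetHandle;
            // Ping-pong: source -> temp with the effect, then temp -> source.
            Blitter.BlitCameraTexture(cmd, source, m_TempColor, m_Material, 0);
            Blitter.BlitCameraTexture(cmd, m_TempColor, source);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }

        public void Dispose() => m_TempColor?.Release();
    }

    TintPass m_Pass;
    public override void Create() => m_Pass = new TintPass(material);

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (material != null) renderer.EnqueuePass(m_Pass);
    }

    protected override void Dispose(bool disposing) => m_Pass?.Dispose();
}
```

Note that cameraColorTargetHandle is read inside Execute, not in AddRenderPasses, because the handle is only valid during pass execution.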

The blit material uses a Full Screen Shader Graph (the Fullscreen shader type in URP ShaderGraph). It exposes the URP Sample Buffer node, which reads Color, Depth, Normal, or Motion Vectors from the renderer's buffers. This is key: without this node you cannot build depth-driven effects (outline by depth, custom depth of field) in ShaderGraph without writing HLSL.

Outline by Depth and Normal

Outline detection is a common request for stylized games. The naive approach, a second pass with vertices extruded outward (an inverted hull), works for solid convex objects, but breaks on complex geometry and cannot produce an outline of the scene silhouette.

The correct approach is fullscreen edge detection through Depth and Normal buffers:

Depth-based edge: read depth in the current pixel and neighbors (4 or 8 neighbors through Sample Texture 2D with offset). If depth difference exceeds threshold — pixel is on edge. Roberts Cross or Sobel operator — standard filters. In ShaderGraph: four Sample Texture nodes with manual UV offsets (+1/-1 pixel through Texel Size node), compute gradient magnitude.

Normal-based edge: similarly by Normal buffer. Gives lines where geometry sharply changes normal direction — these are edges and surface folds that depth edge does not see. Combination of depth + normal edge is the standard for cel-shading outline.

Depth outline problem: depth in the buffer is not linear; it is stored nonlinearly, and as reversed-Z on most modern platforms. Comparing raw depth values directly gives uneven line thickness with distance. Linearize through LinearEyeDepth(depth, _ZBufferParams); in ShaderGraph this is the Linear Eye Depth node.
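As a sketch, the depth edge detection above can be written as a Custom Function node body (assumes the graph includes URP's DeclareDepthTexture.hlsl, which provides SampleSceneDepth; the threshold parameter and function name are illustrative):

```hlsl
// Illustrative HLSL for a Custom Function node: Roberts Cross edge
// detection on linearized scene depth.
// Requires: #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"
void DepthEdge_float(float2 uv, float2 texelSize, float threshold, out float edge)
{
    // Sample and linearize depth at the pixel and its diagonal neighbours.
    float d00 = LinearEyeDepth(SampleSceneDepth(uv), _ZBufferParams);
    float d11 = LinearEyeDepth(SampleSceneDepth(uv + texelSize), _ZBufferParams);
    float d10 = LinearEyeDepth(SampleSceneDepth(uv + float2(texelSize.x, 0)), _ZBufferParams);
    float d01 = LinearEyeDepth(SampleSceneDepth(uv + float2(0, texelSize.y)), _ZBufferParams);

    // Roberts Cross: gradient magnitude from the two diagonal differences.
    float gx = d11 - d00;
    float gy = d10 - d01;
    float g  = sqrt(gx * gx + gy * gy);

    edge = g > threshold ? 1.0 : 0.0;
}
```

The same structure extends to a normal-based edge by sampling the normals buffer and comparing dot products of neighbouring normals instead of depth differences.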

Heat Wave and Distortion Effects

Screen-space distortion for a heat wave or a magic portal: a blit shader that shifts the UVs of the current frame by a normal map (a flowing, animated normal texture). The idea is simple, but it hides a render-order problem.

The distortion shader must sample the Opaque Texture, a copy of the screen captured before transparent objects are drawn; in URP this is _CameraOpaqueTexture. If the effect is applied as a fullscreen pass, it distorts the entire screen. If localized distortion is needed (only above a specific object, like hot asphalt), you need a Distortion Renderer Feature that renders distortion objects to a separate render texture, then applies it as a screen-space mask in the final blit.
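The core of such a shader is a few lines in the fragment stage. A hedged sketch (assumes URP's DeclareOpaqueTexture.hlsl for SampleSceneColor; _NoiseNormal, _Strength, and _ScrollSpeed are hypothetical material properties):

```hlsl
// Illustrative distortion fragment: offsets the opaque-texture UVs by a
// scrolling normal map to fake refraction.
// Requires: #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareOpaqueTexture.hlsl"
float4 frag(Varyings input) : SV_Target
{
    float2 uv = input.texcoord;

    // Scroll the normal map over time to animate the shimmer.
    float2 noiseUV = uv * 4.0 + float2(0, _Time.y * _ScrollSpeed);
    float2 offset = UnpackNormal(SAMPLE_TEXTURE2D(_NoiseNormal, sampler_NoiseNormal, noiseUV)).xy;

    // Shift the screen UV by the normal's XY, scaled by strength.
    float3 scene = SampleSceneColor(uv + offset * _Strength);
    return float4(scene, 1);
}
```

Keep _Strength small (on the order of 0.01 in UV space); large offsets sample far-away pixels and make the fake refraction obvious.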

For HDRP, distortion is a built-in Distortion render queue (separate from Transparent). Objects with Distortion enabled automatically enter the distortion accumulation pass. This is simpler than URP's custom approach.

Typical Mistakes When Developing Custom Post-Effects

Wrong RenderPassEvent. A post-effect placed at BeforeRenderingPostProcessing runs before Bloom, so Bloom is applied on top of the custom effect; for most effects it should run after. The event order in URP: Opaques → Skybox → Transparents → Post Processing → UI. The choice of RenderPassEvent determines where in this queue the custom pass is placed.

Missing Depth Priming Mode. On some mobile GPUs (especially Mali), depth buffer may be unavailable when reading in a custom pass if Depth Priming Mode in URP Renderer is set to Auto instead of Forced. This leads to black screen or incorrect depth values without any console errors.

VR incompatibility. A custom post-processing shader written without XR support breaks in VR: it renders for only one eye or creates a double-vision artefact. Blitter.BlitCameraTexture handles this automatically, but Custom Function nodes with manual UV calculation require the UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX macro.

Timeline by post-effect type:
  • Simple fullscreen effect (color correction, blur): 1–3 days
  • Outline by depth/normal + setup: 3–5 days
  • Distortion effect (heatwave, portal): 3–6 days
  • Complex composite effect (multiple passes): 1–2 weeks

Cost is calculated individually. We need to know the render pipeline, the target platform, and whether VR support is required.