Depth-aware AR edge blending
date: January 12, 2021
Problem: Raw AR overlays look fake. Virtual objects sit on top of the camera feed with hard edges. A pipe behind a wall renders on top of it. Virtual edges stay sharp even when they should fade.
Solution: A multi-pass shader pipeline that analyzes both virtual geometry and the real camera feed, then blends them based on depth relationships and environmental cues.
Multi-pass depth strategy
Shell objects are environmental surfaces: walls, floors, ceilings. Inside objects are BIM elements: pipes, ducts, equipment. Two depth buffers enable comparison. Pixel-by-pixel, the shader knows whether a virtual pipe is in front of or behind a virtual wall.
RenderTexture depthShellRT = RenderTexture.GetTemporary(
    source.width >> 1, source.height >> 1, 16, RenderTextureFormat.Depth);
RenderTexture depthInsideRT = RenderTexture.GetTemporary(
    source.width >> 1, source.height >> 1, 16, RenderTextureFormat.Depth);

// Render shell objects depth
manual.cullingMask = settings.shellObjects;
manual.targetTexture = depthShellRT;
manual.RenderWithShader(depthShader, string.Empty);

// Render inside objects depth
manual.cullingMask = settings.insideObjects;
manual.targetTexture = depthInsideRT;
manual.RenderWithShader(depthShader, string.Empty);
Depth buffers render at half resolution in each dimension. Depth varies smoothly across neighboring pixels, so full resolution adds little; quartering the pixel count saves 75% of memory bandwidth. The depth-only shader skips all material evaluation: no lighting, no textures, just vertex transformation and a depth write.
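A quick sanity check on the bandwidth claim (a minimal sketch; the 1920x1080 resolution is illustrative, not from the pipeline):

```python
# Right-shifting both dimensions by 1 (Unity's `>> 1`) quarters the pixel
# count, so a half-res depth buffer uses 25% of the full-res bandwidth.
def depth_buffer_pixels(width: int, height: int, downshift: int = 1) -> int:
    """Pixel count after halving both dimensions `downshift` times."""
    return (width >> downshift) * (height >> downshift)

full = depth_buffer_pixels(1920, 1080, downshift=0)   # 2,073,600 pixels
half = depth_buffer_pixels(1920, 1080, downshift=1)   # 518,400 pixels
savings = 1 - half / full                             # 0.75
```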
Environment edge detection
Real-world edges provide cues for blending. Where the camera feed has an edge, the composite brightens nearby, creating the impression of light reflecting off real surfaces onto virtual objects.
The pipeline extracts luminance from the camera feed, then runs Sobel filtering in two passes (horizontal and vertical). Additive blending combines both into a single edge texture. The performance trick is pre-calculating texture coordinates in the vertex shader rather than the fragment shader.
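The Sobel stage can be sketched on the CPU as follows (a NumPy reference of the two-pass filter and additive combine, not the shader itself; the function name and test image are illustrative):

```python
import numpy as np

def sobel_edges(luma: np.ndarray) -> np.ndarray:
    """CPU reference for the two Sobel passes: horizontal and vertical
    gradients over a luminance image, combined additively into one
    edge-strength texture."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
    ky = kx.T  # vertical pass is the transposed kernel
    pad = np.pad(luma, 1, mode="edge")
    h, w = luma.shape
    gx = np.zeros_like(luma)
    gy = np.zeros_like(luma)
    for y in range(h):
        for x in range(w):
            window = pad[y:y + 3, x:x + 3]
            gx[y, x] = np.sum(window * kx)
            gy[y, x] = np.sum(window * ky)
    # Additive blend of both passes into a single edge texture
    return np.abs(gx) + np.abs(gy)

# A vertical brightness step responds strongly at the step, zero elsewhere
luma = np.zeros((8, 8), dtype=np.float32)
luma[:, 4:] = 1.0
edges = sobel_edges(luma)
```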
Depth-aware composition
half4 frag (v2f i) : SV_Target
{
    // Linearize both depth buffers so they compare in eye space
    half insideDepth = LinearEyeDepth(tex2D(_DepthInsideTex, i.uv).r);
    half shellDepth = LinearEyeDepth(tex2D(_DepthShellTex, i.uv).r);
    half4 cameraColor = tex2D(_MainTex, i.uv);
    half4 virtualColor = tex2D(texPipe, i.uv);
    half edge = min(1.0, tex2D(texEdge, i.uv).r * 20.0);

    half diff = insideDepth - shellDepth;
    if (diff > 0 && virtualColor.a > 0.5)
    {
        // Virtual element sits behind the shell surface: brighten the
        // camera feed near real-world edges, hue untouched
        if (edge > 0)
        {
            half3 hsv = rgb2hsv(cameraColor.rgb);
            hsv.b *= (1 + edge * 0.25);
            cameraColor.rgb = hsv2rgb(hsv);
        }
        // Fade the occluded element toward the camera feed with depth
        return lerp(virtualColor, cameraColor, clamp(diff * scale + edge, lower, upper));
    }
    else
    {
        // Element in front of the shell: plain alpha blend over the camera
        return lerp(cameraColor, virtualColor, virtualColor.a);
    }
}
A positive depth difference means the virtual element is farther from the camera than the shell surface: the pipe is behind the wall. For occluded elements, brightness is adjusted in HSV space while the blend factor scales with depth distance; changing value without touching hue keeps the perceived color stable. For elements in front of the shell, a simple alpha blend suffices.
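The per-pixel logic can be mirrored on the CPU for clarity (a sketch using `colorsys`; the `scale`, `lower`, and `upper` defaults are illustrative stand-ins for the shader's actual uniforms):

```python
import colorsys

def composite_pixel(camera_rgb, virtual_rgba, inside_depth, shell_depth, edge,
                    scale=0.5, lower=0.2, upper=0.9):
    """CPU sketch of the depth-aware blend from the fragment shader."""
    r, g, b = camera_rgb
    vr, vg, vb, va = virtual_rgba
    diff = inside_depth - shell_depth
    if diff > 0 and va > 0.5:
        if edge > 0:
            # Brighten the camera feed in HSV space; hue is preserved
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            v = min(1.0, v * (1 + edge * 0.25))
            r, g, b = colorsys.hsv_to_rgb(h, s, v)
        # Blend factor grows with depth distance, clamped to [lower, upper]
        t = min(max(diff * scale + edge, lower), upper)
        return tuple(vc + (cc - vc) * t
                     for vc, cc in zip((vr, vg, vb), (r, g, b)))
    # Virtual element in front: ordinary alpha blend over the camera feed
    return tuple(cc + (vc - cc) * va
                 for cc, vc in zip((r, g, b), (vr, vg, vb)))
```

A red pipe two meters behind the wall fades mostly to the camera color; the same pipe in front of the wall renders at full opacity.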
Occlusion visualization
For elements behind walls, blending toward transparency isn't enough. A specialized shader provides x-ray visualization: Fresnel rim lighting creates an edge glow, a pulsing gradient animates attention toward hidden elements, and distance-based fade reduces clutter.
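The glow terms combine roughly as follows (a minimal sketch of the standard Fresnel rim formulation, `pow(1 - dot(N, V), power)`, plus a sine pulse; `rim_power` and `pulse_hz` are illustrative parameters, not the shader's actual values):

```python
import math

def xray_glow(normal, view_dir, time_s, rim_power=3.0, pulse_hz=1.0):
    """Sketch of the x-ray terms: Fresnel rim glow modulated by a pulse."""
    # Fresnel rim: strongest where the surface grazes the view direction
    ndotv = max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
    rim = (1.0 - ndotv) ** rim_power
    # Pulsing gradient: oscillates in [0, 1] to animate attention
    pulse = 0.5 + 0.5 * math.sin(2 * math.pi * pulse_hz * time_s)
    return rim * pulse
```

Surfaces facing the camera get no glow; grazing edges light up, which is what outlines a hidden pipe against the wall in front of it.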
The result shows users where BIM elements exist behind physical surfaces. On a construction site, this means verifying pipe runs before the drywall goes up.
Performance
Temporary render textures allocate from Unity's pool and release immediately. Zero allocation during steady state. The pool reuses allocations across frames, preventing memory churn and maintaining consistent frame times. On mobile, where thermal throttling kills performance after sustained load, consistent frame times matter more than peak performance.
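The pooling pattern behind `GetTemporary`/`ReleaseTemporary` can be sketched as follows (a minimal stand-in keyed by texture descriptor, not Unity's implementation):

```python
class TexturePool:
    """Minimal get/release pool in the spirit of
    RenderTexture.GetTemporary / ReleaseTemporary."""
    def __init__(self):
        self._free = {}       # descriptor -> list of reusable textures
        self.allocations = 0  # real allocations performed

    def get_temporary(self, width, height, depth_bits):
        key = (width, height, depth_bits)
        bucket = self._free.setdefault(key, [])
        if bucket:
            return bucket.pop()   # steady state: reuse, no allocation
        self.allocations += 1     # cold start: allocate once
        return {"desc": key}      # stand-in for an actual GPU texture

    def release_temporary(self, tex):
        self._free.setdefault(tex["desc"], []).append(tex)

pool = TexturePool()
for _ in range(100):  # 100 frames of the two-depth-buffer pipeline
    shell = pool.get_temporary(960, 540, 16)
    inside = pool.get_temporary(960, 540, 16)
    pool.release_temporary(shell)
    pool.release_temporary(inside)
# Only the first frame allocates; 99 subsequent frames reuse the pool
```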
Standard compositing addresses none of the spatial cues the eye expects. This pipeline analyzes the environment and adapts rendering in real time. The technical complexity serves one goal: make virtual BIM elements feel like they belong in physical space.