
Rendering BIM in AR

Problem: BIM models aren't built for real-time rendering. They're optimized for accuracy and information, not performance. A single pipe might have hundreds of triangles when a dozen would suffice. Try to render a complete building on a mobile phone and you'll get single-digit frame rates.

Solution: A rendering pipeline that bridges the gap between BIM's precision and AR's performance demands. Import geometry, construct optimized meshes, apply materials, organize for filtering, and blend seamlessly with the camera feed.

The goal was 60fps on mobile hardware with filtering, selection, and real-time updates. That meant rethinking how BIM data flows through the system.

Pipeline Overview

Five stages transform BIM data into rendered AR:

1. Import BIM geometry
2. Construct optimized meshes
3. Apply materials
4. Organize the hierarchy for filtering
5. Render and blend with the camera feed

Each stage has a single responsibility. Mesh generation handles geometry. Materials handle appearance. Hierarchy enables filtering. Rendering integrates with AR. This separation means changing one stage doesn't break another.

Mesh Generation from BIM Vertices

BIM elements arrive as raw data: vertex arrays and index lists. Unity needs Mesh objects. The conversion processes each element type once during import.

Each element type contains packed vertex data (position, normal, UV in sequence) and submeshes that define triangle lists with associated colors. A single door might have three submeshes: frame, panel, and glass, each with different materials.

Mesh mesh = new Mesh();
mesh.vertices = ConvertToVector3Array(elementType.vertices);

// One submesh per material slot (e.g. a door's frame, panel, and glass).
mesh.subMeshCount = elementType.submeshes.Length;

for (int i = 0; i < elementType.submeshes.Length; i++)
{
    mesh.SetTriangles(elementType.submeshes[i].indices, i);
}

// BIM normals are sometimes missing or wrong; recalculate once at import.
mesh.RecalculateNormals();
mesh.RecalculateBounds();

Vertices map directly from BIM's coordinate space with the handedness correction transform handling left/right-handed conversion. Normal recalculation happens once during import because BIM data sometimes lacks normals or has incorrect ones. Recalculating ensures consistent lighting across all elements.
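The ConvertToVector3Array helper isn't shown in the source; a minimal sketch, assuming vertices arrive as a flat float array (x, y, z repeating) and that the handedness correction flips the X axis — both layout and axis choice are assumptions:

Vector3[] ConvertToVector3Array(float[] packed)
{
    var result = new Vector3[packed.Length / 3];
    for (int i = 0; i < result.Length; i++)
    {
        result[i] = new Vector3(
            -packed[i * 3],      // handedness flip on X (assumed axis)
            packed[i * 3 + 1],
            packed[i * 3 + 2]);
    }
    return result;
}

Note that flipping one axis reverses triangle winding, so an importer doing this would also reverse each triangle's index order to keep faces pointing outward.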

Material Handling Strategy

BIM elements carry material information through colors and categories. Rather than creating unique materials for every element (which would destroy batching), the system uses template materials.

// The template defines shader and render state; instances only add color.
Material baseMaterial = Resources.Load<Material>("Materials/Template_Mtl");

var materials = new List<Material>();
foreach (Submesh submesh in elementType.submeshes)
{
    Material instanceMaterial = new Material(baseMaterial);
    instanceMaterial.color = submesh.color;
    materials.Add(instanceMaterial);
}

MeshRenderer renderer = prefab.AddComponent<MeshRenderer>();
renderer.materials = materials.ToArray();

The template defines shader and rendering properties. Instance materials inherit these settings and apply element-specific colors. This approach balances flexibility and performance: shared shaders batch effectively while per-element colors enable visual distinction.
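Where per-element color is the only difference, a MaterialPropertyBlock can apply it without allocating a material instance at all; a sketch, assuming the template shader exposes a _Color property and the color is declared as an instanced property so GPU instancing still applies:

var block = new MaterialPropertyBlock();
block.SetColor("_Color", submesh.color);  // "_Color" name is an assumption
renderer.SetPropertyBlock(block);

This trades a small amount of shader setup for zero extra material assets, which matters when a model has thousands of distinctly colored elements.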

Category-Based Organization

Construction professionals don't want to see everything at once. A structural engineer needs to see beams and columns, not every door handle. An MEP coordinator needs mechanical systems visible while walls are hidden. The category system makes this possible.

The importer handles 23 distinct BIM categories using a flags enum:

[Flags]
public enum Category
{
    None = 0,
    Wall = 1 << 1,
    Floor = 1 << 2,
    Door = 1 << 3,
    Window = 1 << 4,
    Structure = 1 << 5,
    InstallationMechanical = 1 << 6,
    // ... 17 more categories
}

The flags pattern enables efficient filtering. Check multiple categories with bitwise operations. Toggle visibility without traversing the entire hierarchy.

public void SetCategoryVisibility(Category categories, bool visible)
{
    foreach (Transform categoryTransform in categoryTransforms.Values)
    {
        // Category GameObjects are named after their enum value,
        // so the name parses back to the flag it represents.
        Category cat = (Category)Enum.Parse(typeof(Category),
            categoryTransform.name);

        // Bitwise AND matches any of the requested flags at once.
        if ((categories & cat) != 0)
        {
            categoryTransform.gameObject.SetActive(visible);
        }
    }
}

This enables practical use cases: show only structure and mechanical systems, hide walls to see interior layout, focus on specific building systems, or compare design alternatives. The filtering happens instantly because it's just toggling GameObjects.
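The flags make those use cases one-liners; a usage sketch, where modelController is an assumed reference to the component exposing SetCategoryVisibility:

// Show only structure and mechanical systems; hide walls.
modelController.SetCategoryVisibility(
    Category.Structure | Category.InstallationMechanical, true);
modelController.SetCategoryVisibility(Category.Wall, false);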

Storey-Based Spatial Organization

Beyond categories, storeys provide spatial filtering. A ten-storey building has ten times the geometry of one floor. If you're on floor three, rendering the other nine floors wastes GPU cycles on geometry you can't even see.

Each storey loads independently with UI controls for toggling visibility per floor. Combined with category filtering, this provides precise control. "Show mechanical systems on ground floor" becomes a simple filter operation rather than a complex spatial query. Two toggles instead of traversing thousands of elements.
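The "two toggles" combination can be sketched as follows, where SetStoreyVisibility is a hypothetical counterpart to the category API (the source does not name the per-storey method):

// "Show mechanical systems on ground floor" as two filter operations.
foreach (int storey in allStoreys)
    modelController.SetStoreyVisibility(storey, storey == 0);
modelController.SetCategoryVisibility(Category.InstallationMechanical, true);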

Layer Management for Rendering Control

Unity's layer system separates objects for rendering and raycasting. Different cameras see different layers. The main camera shows all geometry. A depth camera renders only specific elements. Selection raycasts ignore certain categories so touch input hits only interactive elements.

The depth-aware edge blending system relies entirely on layer separation. Shell objects (walls, floors) render to one depth buffer. Inside objects (pipes, equipment) render to another. The composition shader compares both buffers to determine what's visible through the structure.
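The shell/inside split might be wired up like this; layer names are assumptions and must exist in the project's layer settings:

// Route shell and inside geometry to separate layers so two cameras
// can write separate depth buffers. Unity layers don't propagate to
// children, so the importer assigns them per object.
int shellLayer = LayerMask.NameToLayer("BIM_Shell");
foreach (Transform t in shellRoot.GetComponentsInChildren<Transform>(true))
    t.gameObject.layer = shellLayer;

shellDepthCamera.cullingMask = LayerMask.GetMask("BIM_Shell");
insideDepthCamera.cullingMask = LayerMask.GetMask("BIM_Inside");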

Performance Optimization Strategies

Mobile AR demands efficiency. The phone is already running camera processing, AR tracking, and the regular OS workload. What's left for rendering BIM geometry has to be used carefully.

GPU instancing batches similar elements automatically when they use identical materials and shaders. This is why the type library pattern matters: 500 doors using the same type become one draw call instead of 500.
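Instancing is opt-in per material in Unity, so the template from the materials section would need it enabled; a one-line sketch using the earlier template path:

Material baseMaterial = Resources.Load<Material>("Materials/Template_Mtl");
// With instancing on, identical types sharing one mesh and material
// collapse into instanced draw calls instead of per-object draws.
baseMaterial.enableInstancing = true;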

Occlusion culling prevents rendering hidden geometry. BIM models have many occluded elements; walls hide interior systems. Unity's occlusion system recovers significant performance by simply not drawing what can't be seen.

Level-of-detail management handles complex geometry. Near the camera: full detail. Far from camera: simplified mesh. The transition is imperceptible but the performance impact is substantial.

Accurate bounds enable effective frustum culling. Unity's auto-calculated bounds sometimes overestimate, causing geometry to render when it's actually off-screen. Manual bounds calculation from actual vertex positions improves culling effectiveness.
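A minimal sketch of manual bounds calculation, growing a tight box from the actual vertex positions:

Vector3[] verts = mesh.vertices;
var bounds = new Bounds(verts[0], Vector3.zero);
for (int i = 1; i < verts.Length; i++)
    bounds.Encapsulate(verts[i]);

// Override Unity's auto-calculated bounds with the tight fit.
mesh.bounds = bounds;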

Draw call batching ties everything together. Shared materials enable dynamic batching for small meshes and GPU instancing for larger ones. Material instances use the same shader, maximizing batch opportunities.

Scale Transformation Handling

BIM models use real-world units. Some are in millimeters, some in meters, some in feet. Unity defaults to meters but scale flexibility is essential for working with data from different sources.

A centralized scale factor adapts imported geometry to Unity's coordinate system, enabling work with BIM data in any unit system without modifying source files. Scale affects everything: element positions, camera movement speeds, collision detection thresholds, UI measurements. Centralizing this conversion prevents unit mismatches throughout the application.
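A sketch of what centralizing that conversion might look like; ModelScale is a hypothetical helper, and 0.001 assumes millimeter source data:

public static class ModelScale
{
    // One factor, set once per import (0.001 converts mm to Unity meters).
    public static float Factor = 0.001f;

    public static Vector3 ToUnity(Vector3 bimPosition) => bimPosition * Factor;
    public static float ToBim(float unityMeters) => unityMeters / Factor;
}

Every subsystem that crosses the unit boundary (positions, camera speeds, collision thresholds, measurements) goes through the same two functions, so a unit mismatch can only happen in one place.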

Import Reporting and Debugging

When an import fails on a model with 50,000 elements, you need to know which element caused the problem. The import report system tracks operations hierarchically, nesting tasks by storey and type. Which storey failed? Which element type caused issues? The hierarchical report pinpoints problems without wading through 50,000 log entries.
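A sketch of what a hierarchical report structure could look like; the type and field names are illustrative, not the source's actual API:

public class ImportTask
{
    public string Name;                 // e.g. "Storey 3" or "Door_Type_A"
    public bool Failed;
    public string Error;
    public List<ImportTask> Children = new List<ImportTask>();

    // Walk the tree and yield each failure with its full path,
    // e.g. "/Import/Storey 3/Door_Type_A: missing indices".
    public IEnumerable<string> Failures(string path = "")
    {
        string here = path + "/" + Name;
        if (Failed) yield return here + ": " + Error;
        foreach (var child in Children)
            foreach (var failure in child.Failures(here))
                yield return failure;
    }
}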

Model Controller Integration

The ModelController component provides a clean API for the rest of the application. Other systems reference it to control category visibility, handle element selection, query metadata, and manipulate transforms. They don't need to know the internal structure: hierarchy organization, layer assignments, coordinate corrections. The complexity stays contained.

Mobile AR Considerations

Mobile hardware has constraints that desktop developers rarely think about. Memory pressure is constant: the type library reduces geometry duplication, texture atlasing combines multiple materials, mesh compression reduces storage.

Fill rate limits cap how many pixels you can shade per frame. Depth prepass renders geometry front-to-back. Overdraw is minimized through proper sorting and culling.

Bandwidth constraints matter more than raw compute. Vertex data packing reduces memory bandwidth. Index buffers use 16-bit indices where possible.
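The index-width choice is a per-mesh decision at import time; a sketch:

// 16-bit indices halve index-buffer bandwidth but address at most
// 65,535 vertices; larger meshes fall back to 32-bit.
mesh.indexFormat = vertexCount <= 65535
    ? UnityEngine.Rendering.IndexFormat.UInt16
    : UnityEngine.Rendering.IndexFormat.UInt32;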

Thermal throttling is the silent killer. A phone that starts at 60fps drops to 30fps after five minutes of sustained load. Frame budget monitoring prevents this by adapting quality settings to device capabilities.

These optimizations maintain performance across device generations, from flagship phones to mid-range tablets.

AR Camera Integration

The BIM model renders to texture, not directly to screen. This indirection enables everything that makes AR feel convincing: composition shaders that blend virtual geometry with the camera feed, edge detection, and depth-based occlusion.

Separating BIM rendering from camera rendering provides control. Different cameras can have different field-of-view, different culling masks, different post-processing stacks. The final composition combines all these layers into a coherent AR view.
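The render-to-texture indirection reduces to a few lines of camera setup; a sketch, with texture size, layer names, and the _BimTex property name assumed:

// The BIM camera renders into an offscreen texture that the composition
// shader later blends with the live camera feed.
var bimTexture = new RenderTexture(Screen.width, Screen.height, 24);
bimCamera.targetTexture = bimTexture;
bimCamera.cullingMask = LayerMask.GetMask("BIM_Shell", "BIM_Inside");

compositionMaterial.SetTexture("_BimTex", bimTexture);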

Why This Architecture Works

The pipeline separates concerns: import handles data conversion, mesh generation handles geometry, materials handle appearance, hierarchy handles organization, rendering handles display. Each stage operates independently.

Change import format? Mesh generation unchanged. Adjust materials? Import unaffected. Modify hierarchy? Rendering continues working.

This separation enables different import sources (Revit, IFC, custom formats), alternative rendering strategies (standard, stylized, x-ray), multiple organization schemes (categories, systems, zones), and various optimization techniques (LOD, instancing, streaming).

The architecture accommodates BIM's complexity while maintaining real-time rendering performance. That combination is what enables practical AR-BIM applications on mobile devices.