
From Festival Fields to VR: Building festVR

Problem: Festival production teams design events from 2D floor plans. Top-down maps cannot show whether a structure blocks sightlines, whether a tent fits the terrain slope, or what the experience feels like for 50,000 visitors walking through a gate.

I was Technical Producer at Chasing the Hihat Group, an Amsterdam event production company founded in 2011. The company grew to 16 employees and produced over 150 events a year for clients across the Netherlands and Germany. Every event started with the same workflow: a 2D CAD drawing, a site visit, and a lot of imagination. The gap between the plan on screen and the reality on the ground caused costly mistakes. Stages placed where they blocked emergency routes. Structures that didn't fit the terrain. Vendor areas that felt cramped at capacity.

Solution: Build a VR tool that lets designers walk through a photogrammetry-scanned venue and draw festival layouts at 1:1 scale. Three years of R&D, from LIDAR experiments to a working MVP.

The R&D Journey

The project started in 2015 with LIDAR scanning. We tested terrestrial laser scanners in the Diemerbos forest near Amsterdam. The results were geometrically precise but impractical: the equipment was expensive, capture was slow, and foliage scattered the laser returns in natural environments.

In 2016 we switched to drone photogrammetry. A DJI Phantom 4 Pro with automated flight plans could cover an entire festival ground in a few hours. The output was cheaper, faster, and better suited to the organic terrain of outdoor venues. Green fields, tree lines, waterways: photogrammetry captured what LIDAR struggled with.

2017 was the year we built the processing pipeline into Unity. Raw photos became 3D meshes, and those meshes became walkable VR environments where you could stand at the festival entrance and look across the entire site. By January 2018 we had a working MVP: designers could scan a venue, load it in VR, draw a complete festival layout, and export it to CAD software for engineering.

Architecture

The system has five components connected through a central server. The festVR Server manages scene state: environments, objects, shapes, and file persistence. The HTC Vive client is the "Master Builder" where designers draw layouts in VR. An ARKit viewer lets team members walk through designs on iPad during site visits. A web viewer provides lightweight 3D access for stakeholders. A real-time CAD bridge syncs drawn shapes with professional drafting software.
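The scene state the server manages can be pictured as a small data model. The classes and fields below are an illustrative sketch, not the actual festVR schema:

```csharp
// Hypothetical sketch of the scene state the festVR Server manages.
// Class and field names are assumptions; the real schema is not shown here.
using System.Collections.Generic;
using UnityEngine;

public class SceneState
{
    public string EnvironmentName;       // which scanned venue is loaded
    public List<PlacedObject> Objects;   // stages, tents, fences, vendors
    public List<DrawnShape> Shapes;      // freehand layout geometry
}

public class PlacedObject
{
    public string Id;
    public string PrefabName;
    public Vector3 Position;
    public Quaternion Rotation;
}

public class DrawnShape
{
    public string Id;
    public List<Vector3> Points;   // polyline vertices in world space
    public bool Closed;
    public bool Hatched;           // mirrors the CAD bridge's hatched flag
}
```

Whatever the real schema looks like, all five clients read and write this shared state through the server, which is what keeps the Vive, iPad, web, and CAD views in sync.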

Scene System

The Unity application uses additive scene loading. A persistent Management scene holds all singleton managers (scene control, input, file persistence, networking). Environment scenes load on top of it. Each scanned venue is a separate scene containing the photogrammetry mesh and terrain data.

public enum Scene
{
    None = -1,
    Management = 0,
    Menu = 1,
    Empty = 2,
    Weeze_Airport = 3,
    NDSM = 4,
    Biddinghuizen = 5,
    Test
}

The SceneController handles async loading and unloading. When a designer selects a venue, the current environment unloads and the new one streams in. The Management scene never unloads, so all managers, session state, and drawn objects persist across environment switches.

public IEnumerator Load(Scene scene)
{
    // Stream the environment in additively so the Management scene stays resident.
    AsyncOperation loading = SceneManager.LoadSceneAsync(
        (int)scene, LoadSceneMode.Additive);
    while (!loading.isDone)
    {
        yield return null;
    }

    ActiveScene = scene;

    // Notify subscribers (FileManager, UI) that the environment is ready.
    if (OnSceneLoaded != null)
        InvokeOnLoad();

    // Environment scenes (Empty and above, up to a reserved index range)
    // restore their last saved layout.
    if (scene >= Scene.Empty && (int)scene <= 90)
        FileManager.LoadState(FileManager.Settings.SaveName);
}

On scene load, the controller fires a delegate chain. FileManager.OnSceneLoaded restores the last saved state for that environment, so designers resume exactly where they left off.
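That handshake between SceneController and FileManager can be sketched as follows; the event signature and handler names here are assumptions based on the members mentioned above, not verbatim festVR code:

```csharp
// Sketch of the load/restore delegate chain (signatures assumed).
using System;
using UnityEngine;

public partial class SceneController : MonoBehaviour
{
    public static event Action<Scene> OnSceneLoaded;

    // Called at the end of Load(); fans out to every subscriber.
    private void InvokeOnLoad() => OnSceneLoaded?.Invoke(ActiveScene);
}

public static partial class FileManager
{
    // Subscribed once at startup from the persistent Management scene.
    public static void Register() =>
        SceneController.OnSceneLoaded += HandleSceneLoaded;

    private static void HandleSceneLoaded(Scene scene) =>
        LoadState(Settings.SaveName);   // restore that venue's saved layout
}
```

Because the Management scene never unloads, the subscription survives every environment switch and each venue comes back exactly as it was saved.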

CAD Bridge

The ConnectionManager maintains a real-time link to professional drafting software over TCP using WCF. When a designer finishes drawing a shape or places an object in VR, the bridge serializes it to JSON and sends it to the CAD application through a service contract.

[ServiceContract]
public interface ISendCommand
{
    [OperationContract]
    void DrawCircle(string jsonString, bool hatched);
    [OperationContract]
    void DrawLine(string jsonString, bool closed, bool hatched);
    [OperationContract]
    void InsertBlock(string jsonString);
    [OperationContract]
    void DeleteShape(string id);
}
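On the Unity side, a call through this contract might look like the sketch below. The binding, endpoint address, and the `shape` variable are assumptions for illustration, chosen to match the TCP transport described above:

```csharp
// Hypothetical client-side call over the WCF service contract above.
// Address and port are illustrative; discovery fills them in at runtime.
var binding = new NetTcpBinding();
var address = new EndpointAddress("net.tcp://cad-workstation:9000/festvr");
var factory = new ChannelFactory<ISendCommand>(binding, address);
ISendCommand channel = factory.CreateChannel();

// Serialize a drawn shape to JSON and push it to the CAD application.
string json = JsonUtility.ToJson(shape);   // Unity's built-in serializer
channel.DrawLine(json, closed: true, hatched: false);
```

Keeping the contract down to four coarse operations (circle, line, block, delete) meant the CAD side only had to implement four handlers, each mapping a JSON payload onto native drafting entities.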

The bridge uses UDP broadcast for discovery: the VR application scans the local network, finds the CAD workstation, and establishes the WCF channel automatically. No manual IP configuration. Designers draw in VR, engineers see the result in their CAD tool within seconds.
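A minimal version of that discovery step, assuming the CAD workstation answers a known probe message on a fixed port (the port number and message strings are invented for this sketch):

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;

public static class CadDiscovery
{
    // Hypothetical discovery probe; port and message format are illustrative.
    public static IPAddress DiscoverCadWorkstation(int port = 9050, int timeoutMs = 2000)
    {
        using (var udp = new UdpClient { EnableBroadcast = true })
        {
            // Shout on the local subnet and wait for the first responder.
            byte[] probe = Encoding.UTF8.GetBytes("FESTVR_DISCOVER");
            udp.Send(probe, probe.Length, new IPEndPoint(IPAddress.Broadcast, port));

            udp.Client.ReceiveTimeout = timeoutMs;
            var remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] reply = udp.Receive(ref remote);   // throws on timeout

            return Encoding.UTF8.GetString(reply) == "FESTVR_CAD"
                ? remote.Address
                : null;
        }
    }
}
```

The returned address then seeds the WCF endpoint, which is what removes the manual IP configuration step for designers on site.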

Why It Mattered

Festival production is a high-stakes, low-margin business. A misplaced stage costs tens of thousands of euros to relocate. An emergency route that looks clear on a 2D plan might be blocked by terrain in reality. festVR gave production teams the ability to catch these problems before a single truck arrived on site.

The tool went from experiment to daily use across multiple venue scans. Three environments shipped with the MVP: an airport venue in Germany, an industrial wharf in Amsterdam, and a polder field in Flevoland. Each presented different terrain challenges that validated the photogrammetry approach over LIDAR.

Result: A working VR design tool that connected drone-scanned venues to CAD engineering workflows, built over three years of incremental R&D.

I was Technical Producer at Chasing the Hihat and built the festVR prototype from 2015 to 2018.