F1 VIP Garage Tour
Qualcomm · Mercedes-AMG PETRONAS Formula 1 Team · Lead Developer
AR Foundation · Unity · Unity Cloud · Volcap

An exclusive VIP headset experience deployed for Mercedes at Formula 1 race events on the Lenovo VRX headset. Users are welcomed by Lewis Hamilton, captured in volumetric video, who guides them through highly detailed 1:1 recreations of the Mercedes-AMG PETRONAS F1 garage, the engineering treehouse, and their fuel analysis lab. Interactive hotspots let users explore the Mercedes W14 car and watch exclusive interviews with Toto Wolff, James Allison, and Jodi Hutton.

Overview

This was designed as a showcase activation for event VIPs, blending brand storytelling with immersive technology. The flow takes the user through three recreated team environments — garage, treehouse, and lab — with Hamilton’s volumetric capture serving as their host.

At each station, users encounter points of interest (i.e., hotspots) that could either display a simple label or open supplemental media such as videos and 360° interviews.

My Contributions

I was the lead and sole developer, responsible for everything in-headset.

Technical Highlights & Challenges

The biggest challenge was delivering high visual fidelity on low-power hardware (the Lenovo VRX). Unlike PC VR, this headset had serious performance constraints, but the experience needed to feel premium for Mercedes VIPs.

I leaned heavily on baked lighting, reflection probes, and light probes to achieve a realistic look at a fraction of the performance cost of real-time lighting. Planar reflections (like the polished garage floor) had to be faked convincingly while staying performant. I achieved the effect with the time-honored game dev tradition of "copy the scene and flip it upside down," using low-LOD copies of the scene geometry for the mirrored half. Because the scenes were entirely static, I could push baked lighting quality to the maximum, which made the environments feel cinematic.
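The project itself is Unity/C#, but the core of the "flip it upside down" trick is just reflecting geometry across the floor plane. A minimal, language-agnostic sketch of that math (function name and values are illustrative, not from the actual project):

```python
def mirror_across_floor(position, floor_y=0.0):
    """Reflect a world-space position across a horizontal floor plane.

    The mirrored-scene trick places a duplicate of the scene geometry
    at the reflected position; seen through a glossy floor material,
    the copy reads as a planar reflection at a fraction of the cost of
    a real-time reflection pass.
    """
    x, y, z = position
    return (x, 2.0 * floor_y - y, z)

# A light fixture 2.5 m above a floor at y = 0 mirrors to 2.5 m below it.
print(mirror_across_floor((1.0, 2.5, -3.0)))  # (1.0, -2.5, -3.0)
```

In practice the mirrored copy also needs its triangle winding (or cull mode) flipped and can use aggressively simplified LODs, since it is only ever seen blurred through the floor.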

Volumetric capture (Lewis Hamilton as the tour guide) added complexity, since the capture already contained baked-in lighting. I had to carefully balance the environmental lighting so it matched without looking uncanny. I also created transition "rez-in" and "rez-out" effects for the volcap.

On the architecture side, I designed a station-based content system. Each tour stop (garage, treehouse, lab) followed the same structure: an intro, followed by hotspots where user interaction would open supplemental media. I split the hotspots into a set of shared implementations (label only, video, 360° video, etc.). Building it this way made the app easier to expand and maintain, which mattered because we didn't receive all of the hotspot copy and interviews until late in the project. This modular approach continued to prove useful in later iterations of the project, allowing for quick content updates.
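The shape of that station/hotspot system can be sketched as plain data. This is a hypothetical Python rendering of the structure (the real implementation is Unity/C#, and all names and paths here are invented for illustration):

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional

class HotspotKind(Enum):
    LABEL = auto()      # text callout only
    VIDEO = auto()      # flat video panel
    VIDEO_360 = auto()  # immersive 360 video

@dataclass
class Hotspot:
    title: str
    kind: HotspotKind
    media_path: Optional[str] = None  # None for label-only hotspots

@dataclass
class Station:
    name: str
    intro_clip: str  # volumetric-capture intro for this stop
    hotspots: List[Hotspot] = field(default_factory=list)

# Late-arriving copy and interviews slot in as data, without touching
# the code that drives the tour flow.
garage = Station(
    name="Garage",
    intro_clip="volcap/garage_intro",
    hotspots=[
        Hotspot("W14 front wing", HotspotKind.LABEL),
        Hotspot("Pit stop breakdown", HotspotKind.VIDEO, "video/pitstop"),
    ],
)
```

Because each hotspot kind maps to one shared implementation, adding a new interview or swapping a clip is a content edit rather than a code change, which is what made the late-project updates manageable.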

Outside of the core project workflow, I worked closely with our internal motion and marketing teams to produce high-quality renders of the experience for sizzle videos. This went well beyond a standard screen recording: I set up Cinemachine, authored custom camera pans and zooms, and even built a system to record transform data from my headset that could be replayed in Unity for bespoke 'in-headset' shots.
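The record-and-replay idea boils down to logging timestamped transform samples each frame and interpolating between them on playback. A minimal sketch of the playback side, in Python for illustration (the actual system was Unity-side; the function and data here are hypothetical):

```python
from bisect import bisect_right

def sample(track, t):
    """Linearly interpolate a recorded transform track at time t.

    track: list of (time, (x, y, z)) samples sorted by time -- the kind
    of per-frame data logged from the headset and replayed in-editor to
    drive a virtual camera for 'in-headset' shots.
    """
    times = [s[0] for s in track]
    i = bisect_right(times, t)
    if i == 0:
        return track[0][1]   # before the first sample: clamp
    if i == len(times):
        return track[-1][1]  # after the last sample: clamp
    (t0, p0), (t1, p1) = track[i - 1], track[i]
    a = (t - t0) / (t1 - t0)
    return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))

# Head position recorded at 0 s and 1 s; query the midpoint.
track = [(0.0, (0.0, 1.6, 0.0)), (1.0, (2.0, 1.6, 0.0))]
print(sample(track, 0.5))  # (1.0, 1.6, 0.0)
```

Rotations would be handled the same way with quaternion slerp rather than per-component lerp; clamping at the ends keeps the replay camera stable before and after the recorded take.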

This all culminated in a visually rich, smooth-running experience on constrained hardware, one that impressed Qualcomm's engineers enough that they requested follow-up meetings with our team to learn how we pulled it off.

Results

Reflection

This project showed me just how far you can push static lighting techniques in XR. With fully static environments, I was able to achieve very high fidelity that ran smoothly on limited hardware.

The modular content architecture also proved its value, letting us extend and refine the project in later activations with minimal redevelopment effort.

Related Projects

© Nathan MacAdam, 2025