


A series of experimental XR installations built on Varjo XR-3 and XR-4 headsets. Leveraging chroma keying and a full green-screen stage, these installations blended real-world props, actors, lighting, and practical stage effects with high-fidelity virtual environments for deeply immersive mixed reality showcases. As this was confidential client work, only a high-level description can be shared.
Overview
Working at Trigger, I led development on several experimental XR installations. These projects pushed the limits of mixed reality by combining Varjo’s pass-through and chroma-keying capabilities with large-scale physical stages. Users stepped into bespoke XR experiences where real props, practical effects rigs, lighting, and actors were perfectly synchronized with virtual environments. The goal was to showcase the future of XR technology at the highest level of fidelity possible, running on cutting-edge enterprise headsets and top-tier desktop hardware. Because this was confidential client work, available previews are limited to non-branded and high-level materials.
My Contributions
I worked as lead (and sometimes sole) developer across multiple variations of these projects.
- Integrated the Varjo SDK and implemented mixed reality alignment workflows inside Unity.
- Built a networked secondary proctor interface for operators to guide and control the experience in real time.
- Implemented networked DMX messaging to control stage lighting and practical effects dynamically from the XR application.
- Integrated support for a Motion Systems professional motion platform, enabling synchronized motion feedback (vehicle motion seat buck) as part of the immersive experience.
- Oversaw all on-site testing during development, collaborating with the client and our creative team to align on project needs.
- Developed high-fidelity Unity scenes using HDRP tailored to the power of Varjo XR hardware.
- Explored and prototyped a native C++/GLSL post-processing layer for chroma key green-spill suppression.
Technical Highlights & Challenges
One of the biggest technical hurdles was achieving perfect alignment between real-world stage props and their virtual counterparts. Since the Varjo system relies on chroma keying, we had to precisely match physical green boxes and interactive props with digital assets so that the illusion of mixed reality felt seamless to the user. Any misalignment would instantly break immersion, so I developed custom workflows to consistently calibrate physical and virtual elements.
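Below is a minimal sketch of the kind of alignment helper this involved, written against plain Unity transforms. The component, field names, and single-anchor sampling approach are illustrative assumptions, not the production calibration workflow.

```csharp
using UnityEngine;

// Hypothetical calibration helper: aligns a virtual prop to its physical counterpart
// by sampling a tracked reference (e.g., a controller tip held against a known corner
// of the physical prop) and applying the resulting offset to the digital asset.
public class PropAlignmentCalibrator : MonoBehaviour
{
    [SerializeField] Transform trackedReference; // tracked pose used for sampling
    [SerializeField] Transform virtualProp;      // the digital twin of the physical prop
    [SerializeField] Transform propAnchorPoint;  // child of virtualProp marking the matching corner

    // Call while the tracked reference is physically touching the prop's reference corner.
    public void Calibrate()
    {
        // Translate the prop so its anchor lands exactly on the sampled physical point.
        Vector3 positionError = trackedReference.position - propAnchorPoint.position;
        virtualProp.position += positionError;

        // Correct yaw only, keeping the prop upright on the stage floor;
        // rotating around the anchor preserves the position fix above.
        float yawError = Mathf.DeltaAngle(propAnchorPoint.eulerAngles.y,
                                          trackedReference.eulerAngles.y);
        virtualProp.RotateAround(propAnchorPoint.position, Vector3.up, yawError);
    }
}
```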
Another major challenge was dealing with green-spill from the stage itself. Because the Varjo lacks built-in green-spill suppression, the cameras would pick up unwanted green light bounce from the physical set. I prototyped a native C++ headset application to apply post-processing corrections, which showed promising results, though ultimately the approach wasn’t used in production due to conflicts with Unity’s rendering pipeline.
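A typical way to suppress spill is a despill limiter that keeps the green channel from exceeding a reference built from red and blue. The sketch below shows that idea in C# for readability only; the actual prototype ran as a native C++/GLSL pass on the pass-through frames, and its exact correction isn't reproduced here.

```csharp
using UnityEngine;

// Conceptual sketch of a standard green-spill "despill" limiter.
public static class GreenDespill
{
    // Clamp the green channel so it never exceeds the average of red and blue,
    // removing green bounce light without strongly shifting non-green hues.
    public static Color Apply(Color c, float strength = 1f)
    {
        float limit = (c.r + c.b) * 0.5f;           // spill reference
        float despilled = Mathf.Min(c.g, limit);    // suppressed green value
        c.g = Mathf.Lerp(c.g, despilled, strength); // blend by suppression strength
        return c;
    }
}
```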
Demoing custom green-spill suppression. All of the changes on screen are happening on the headset using chroma keying. Note the reduction in green bounce light on my hand after the effect is applied.
Beyond visual fidelity, these projects required integrating external hardware systems into the XR pipeline. One of the most interesting was a Motion Systems professional motion platform, typically used in enterprise simulation. I integrated their SDK into the experiences to synchronize the motion platform with XR content, so users felt physical feedback that reinforced the virtual environments. Combined with DMX-controlled lighting and effects, the installation blended virtual and physical cues in a way that maximized immersion.
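On the DMX side, a common approach is to send ArtDMX frames over UDP to an Art-Net node that drives the fixtures. The sketch below assumes that setup; the actual protocol, universe layout, and fixture mapping used in the installations aren't public, so the class and addresses are illustrative.

```csharp
using System.Net.Sockets;
using System.Text;

// Hypothetical sketch of networked DMX output, assuming Art-Net over UDP.
public class ArtNetDmxSender
{
    const int ArtNetPort = 6454;              // standard Art-Net UDP port
    readonly UdpClient udp = new UdpClient();
    readonly string nodeAddress;
    byte sequence;

    public ArtNetDmxSender(string nodeAddress) => this.nodeAddress = nodeAddress;

    // Sends one DMX frame (up to 512 channels) to the given universe as an ArtDMX packet.
    public void SendFrame(ushort universe, byte[] channels)
    {
        var packet = new byte[18 + channels.Length];
        Encoding.ASCII.GetBytes("Art-Net\0").CopyTo(packet, 0); // packet ID
        packet[8] = 0x00; packet[9] = 0x50;                      // OpDmx opcode, little-endian
        packet[11] = 14;                                         // protocol version
        packet[12] = ++sequence;                                 // frame sequence
        packet[14] = (byte)(universe & 0xFF);                    // universe low byte
        packet[15] = (byte)(universe >> 8);                      // universe high byte
        packet[16] = (byte)(channels.Length >> 8);               // data length, big-endian
        packet[17] = (byte)(channels.Length & 0xFF);
        channels.CopyTo(packet, 18);
        udp.Send(packet, packet.Length, nodeAddress, ArtNetPort);
    }
}
```

In use this boils down to something like `new ArtNetDmxSender("10.0.0.50").SendFrame(0, frame)` fired from gameplay events, with the XR app owning the lighting cues rather than a separate lighting desk.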
Across these projects, I created a variety of visual effects for scene transitions. We used asynchronous scene loading alongside full-view transition effects to seamlessly move the user between virtual environments.
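The underlying pattern is straightforward in Unity: preload the next scene with activation held off, cover the user's view with the transition effect, then activate. A simplified sketch follows; the Animator-driven effect stands in for the actual full-view VFX.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Simplified sketch of the transition pattern: preload the next environment
// asynchronously, play a full-view transition effect, then activate the scene
// once the view is covered so the user never sees a loading hitch.
public class SceneTransitioner : MonoBehaviour
{
    [SerializeField] Animator transitionEffect; // placeholder for the full-view VFX driver

    public IEnumerator TransitionTo(string sceneName)
    {
        AsyncOperation load = SceneManager.LoadSceneAsync(sceneName);
        load.allowSceneActivation = false;      // hold activation until we're covered

        transitionEffect.SetTrigger("FadeOut"); // start the full-view effect

        // Async loads report ~0.9 progress while activation is held back.
        while (load.progress < 0.9f)
            yield return null;

        yield return new WaitForSeconds(1f);    // let the effect fully cover the view
        load.allowSceneActivation = true;       // swap environments behind the effect
    }
}
```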
Example 360° transition effects
Additionally, I created a secondary proctor interface so operators could dynamically control the flow of each installation. This required a lightweight networking solution to communicate between the main XR app and the control tool, giving the proctor the ability to trigger scene changes, synchronize effects, and respond to the pacing of the user’s interaction, while keeping everything separated from the application running the XR experience.
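A lightweight UDP command channel is enough for this kind of proctor link. Below is a minimal sketch of the listener on the XR side; the port number and command strings are illustrative, as the real message schema and transport aren't public.

```csharp
using System;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

// Minimal sketch of a proctor-to-XR command channel over UDP.
public class ProctorCommandListener
{
    readonly UdpClient udp;

    public event Action<string> OnCommand; // e.g. "NEXT_SCENE", "TRIGGER_FX:storm"

    public ProctorCommandListener(int port = 9050)
    {
        udp = new UdpClient(port);
        _ = ListenLoop();
    }

    async Task ListenLoop()
    {
        while (true)
        {
            UdpReceiveResult result = await udp.ReceiveAsync();
            string command = Encoding.UTF8.GetString(result.Buffer);
            // The XR app maps commands to scene changes and effect triggers;
            // in Unity, await continuations return to the main thread.
            OnCommand?.Invoke(command);
        }
    }
}
```

The proctor tool simply sends matching UDP strings, which keeps the control surface fully decoupled from the application rendering the XR experience.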
Finally, these projects represented a rare case where hardware limitations were not the primary concern. Running on dedicated high-end PCs with bleeding-edge enterprise-grade HMDs, I was able to push Unity’s HDRP to deliver visuals far beyond what is typically possible in mobile or standalone XR projects. That freedom meant exploring advanced lighting, detailed assets, and visual polish that highlighted the raw power of the platform.
Results
The installations successfully demonstrated Varjo’s XR capabilities and left a strong impression on users and clients. By blending physical stagecraft with digital immersion, the experiences showcased a vision of what high-end XR can achieve.