Kittch AR Cooking
Qualcomm · Lead Developer
Unity · AR Foundation · Snapdragon Spaces

A proof-of-concept AR glasses cooking demo built to showcase Snapdragon Spaces features. First unveiled as part of the AWE 2023 Keynote and later demoable at the Snapdragon Spaces booth, the experience was created in partnership with cooking livestream platform Kittch. The app guided users through recipe steps using spatial panels for video, instructions, and timers — blending practical cooking utility with next-gen AR glasses UX.

Overview

AR Cooking was created to demonstrate Snapdragon Spaces at trade shows and conferences, including its debut during the AWE 2023 XR Keynote. Attendees could go hands-on with the demo at the Snapdragon Spaces booth, experiencing how AR glasses can make an everyday task like cooking feel futuristic yet useful.

The app ran on a Motorola phone tethered to Lenovo ThinkReality A3 glasses, using Snapdragon Spaces' Dual Render Fusion to split rendering and input between the phone and the glasses. It also demonstrated hand tracking and spatial anchors, two technologies unique to Snapdragon Spaces. The result was a practical, user-friendly look at how AR could enhance everyday cooking.

Users began by selecting a recipe on the phone. Once the glasses were on, AR content could be anchored to real-world surfaces via gaze targeting and phone confirmation. After placement, the cooking experience unfolded entirely in AR via hand gestures — freeing the user’s hands for cooking.
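For illustration, here's a minimal sketch of how that placement flow can be wired up with AR Foundation's raycast and anchor APIs. The `GazePlacement` class, `contentPrefab` field, and `ConfirmPlacement` hook are hypothetical names, not the shipped code: a ray from the head pose is cast against detected planes, and the hit pose is anchored once the phone confirms.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: raycast the user's gaze against detected planes and
// world-lock the cooking content there once the phone confirms placement.
public class GazePlacement : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject contentPrefab; // the panel layout to place

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    // Wired to the confirmation button on the tethered phone.
    public void ConfirmPlacement()
    {
        Transform head = Camera.main.transform;
        var gaze = new Ray(head.position, head.forward);

        if (raycastManager.Raycast(gaze, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            GameObject content = Instantiate(contentPrefab, pose.position, pose.rotation);
            content.AddComponent<ARAnchor>(); // keep it fixed to the surface
        }
    }
}
```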

In AR, the cooking UI is structured into three panels:

- A video panel playing the Kittch recipe walkthrough
- An instruction panel showing the current recipe step
- A timer panel for tracking cook times

My Contributions

As the sole developer, I implemented the entire application, including:

- The phone-to-glasses flow built on Dual Render Fusion, from recipe selection to in-AR cooking
- Gaze-targeted spatial anchor placement with phone confirmation
- Hand-gesture interaction for navigating recipe steps hands-free
- The video, instruction, and timer panels
- The reusable axis-aligned, orbital, and dockable UI components described below

Technical Highlights & Challenges

The main challenge was working with bleeding-edge Snapdragon Spaces features that weren’t publicly available yet. Because this app doubled as a showcase for upcoming tech, I had to integrate preview builds directly with Qualcomm engineers’ support.

Snapdragon Spaces feature highlights:

- Dual Render Fusion, splitting rendering and input between the phone and glasses
- Hand tracking, enabling controller-free interaction while cooking
- Spatial anchors, world-locking content to kitchen surfaces

I also worked closely with our UX and creative team to prototype the UI. We wanted panels that could move naturally around the kitchen without being locked to the user's face, so I built a set of generic components for axis-aligned panels, orbitally aligned panels, and draggable/dockable content, letting us evaluate what felt best. We have since reused these components across multiple other projects.

Example UI/UX Implementations

Axis-bound UI slides along a line in space while staying aligned with the camera
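A minimal sketch of that axis-bound behaviour, assuming a Unity component configured with two endpoint transforms (all names here are illustrative): the camera position is projected onto the segment, and the panel eases toward the projected point while billboarding toward the user.

```csharp
using UnityEngine;

// Hypothetical sketch of an axis-bound panel: the panel slides along a
// world-space line segment to stay near the user while facing the camera.
public class AxisBoundPanel : MonoBehaviour
{
    [SerializeField] Transform lineStart;   // endpoints defining the axis
    [SerializeField] Transform lineEnd;
    [SerializeField] float followSpeed = 4f;

    Transform cam;

    void Start() => cam = Camera.main.transform;

    void LateUpdate()
    {
        // Project the camera's position onto the line segment.
        Vector3 axis = lineEnd.position - lineStart.position;
        float t = Vector3.Dot(cam.position - lineStart.position, axis) / axis.sqrMagnitude;
        Vector3 target = lineStart.position + Mathf.Clamp01(t) * axis;

        // Slide toward the projected point and billboard toward the user.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - cam.position, Vector3.up);
    }
}
```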

Orbitally-bound UI slides along an arc in space while staying aligned with the camera
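The orbital variant is the same idea projected onto an arc instead of a line: clamp the user's heading around a pivot to an angular range, then place the panel on the circle at that clamped angle. Again a sketch; `pivot`, `radius`, and the angle limits are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch of an orbit-bound panel: the panel slides along an arc
// around a pivot (e.g. the anchored workspace) while facing the camera.
public class OrbitBoundPanel : MonoBehaviour
{
    [SerializeField] Transform pivot;
    [SerializeField] float radius = 1.2f;
    [SerializeField] float minAngle = -60f, maxAngle = 60f; // arc limits in degrees
    [SerializeField] float followSpeed = 4f;

    Transform cam;

    void Start() => cam = Camera.main.transform;

    void LateUpdate()
    {
        // Measure the camera's yaw around the pivot, flattened to the horizontal
        // plane, and clamp it to the allowed arc.
        Vector3 fwd = Vector3.ProjectOnPlane(pivot.forward, Vector3.up);
        Vector3 toCam = Vector3.ProjectOnPlane(cam.position - pivot.position, Vector3.up);
        float angle = Mathf.Clamp(Vector3.SignedAngle(fwd, toCam, Vector3.up), minAngle, maxAngle);

        // Place the panel on the arc at the clamped angle, facing the user.
        Vector3 target = pivot.position + Quaternion.AngleAxis(angle, Vector3.up) * (fwd.normalized * radius);
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - cam.position, Vector3.up);
    }
}
```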

Dockable UI can be attached to an axis or orbital arc, but breaks off after a certain threshold
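And a sketch of the dock/undock threshold, assuming the panel carries one of the constraint components above and a hand-tracking grab feeds it drag positions each frame:

```csharp
using UnityEngine;

// Hypothetical sketch: while docked, the constraint component drives the
// panel; dragging past a threshold breaks it free for free-floating placement.
public class DockablePanel : MonoBehaviour
{
    [SerializeField] Behaviour constraint;          // AxisBoundPanel or OrbitBoundPanel
    [SerializeField] float undockThreshold = 0.25f; // meters of pull before undocking

    public bool Docked { get; private set; } = true;

    // Called each frame while a hand-tracking pinch is dragging the panel.
    public void OnDrag(Vector3 grabPosition)
    {
        if (Docked && Vector3.Distance(grabPosition, transform.position) > undockThreshold)
        {
            Docked = false;
            constraint.enabled = false; // stop the axis/orbit follow behaviour
        }

        if (!Docked)
            transform.position = grabPosition; // follow the hand freely
    }
}
```

Re-docking can reverse the check: when the panel is released within the threshold of its constrained position, re-enable the constraint and it snaps back onto the axis or arc.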

Another challenge was the hardware: the A3 glasses were enterprise-oriented, with a limited field of view and modest specs. That made optimization and thoughtful UX critical so the app still felt practical and high-quality despite the constraints. The Dual Render Fusion tethering setup also made in-editor testing difficult, so most feature work required frequent on-device iteration.

In short, AR Cooking was as much about pioneering glasses-ready UX as it was about cooking. The constraints of tethered hardware and early SDKs forced creative technical solutions, but also set the groundwork for interaction patterns I’ve reused on later AR projects.

Results

Reflection

AR Cooking taught me how to:

- Integrate preview SDK builds in close collaboration with the vendor's engineers
- Design hands-free, glasses-ready UX within tight FOV and performance budgets
- Iterate on-device when a tethered setup can't be replicated in the editor

I’m most proud of the UX systems built here, which influenced later projects, and I’d love to port this concept to more consumer-friendly AR glasses when the hardware is ready. The combination of recipe guidance, video walkthroughs, and spatial utilities seems like a genuinely useful everyday AR product.

Related Projects

© Nathan MacAdam, 2025