


A proof-of-concept AR glasses cooking demo built to showcase Snapdragon Spaces features. First unveiled as part of the AWE 2023 Keynote and later available for hands-on demos at the Snapdragon Spaces booth, the experience was created in partnership with cooking livestream platform Kittch. The app guided users through recipe steps using spatial panels for video, instructions, and timers, blending practical cooking utility with next-gen AR glasses UX.
Overview
AR Cooking was created to demonstrate Snapdragon Spaces at trade shows and conferences, including its debut during the AWE 2023 XR Keynote. Attendees could go hands-on with the demo at the Snapdragon Spaces booth, experiencing how AR glasses can make an everyday task like cooking feel futuristic yet useful.
The app ran on a Motorola phone tethered to Lenovo ThinkReality A3 glasses, using Snapdragon Spaces' Dual Render Fusion to split rendering and input between the phone and the glasses. It also showcased hand tracking and spatial anchors, two features unique to Snapdragon Spaces. The demo presents a practical, user-friendly look at how AR could enhance everyday cooking.
Users began by selecting a recipe on the phone. Once the glasses were on, AR content could be anchored to real-world surfaces via gaze targeting and confirmation on the phone. After placement, the cooking experience unfolded entirely in AR, driven by hand gestures so the user's hands stayed free for cooking.
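As a rough sketch of how that gaze-targeted placement can work (assuming a Unity implementation; the component, the `ConfirmPlacement` hook, and the plane layer are illustrative, not the shipped code):

```csharp
using UnityEngine;

// Illustrative sketch: gaze-targeted placement of the recipe panels.
// Assumes detected planes carry colliders on a dedicated layer; all names
// here are hypothetical, not the actual project code.
public class GazePlacementController : MonoBehaviour
{
    [SerializeField] Transform previewPanels;   // ghosted panel layout shown while aiming
    [SerializeField] LayerMask planeMask;       // layer containing detected plane colliders
    bool placed;

    void Update()
    {
        if (placed) return;

        // Cast from the glasses' viewpoint (head pose) straight ahead.
        var head = Camera.main.transform;
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit, 5f, planeMask))
        {
            // Follow the gaze target while the user lines up a spot.
            previewPanels.SetPositionAndRotation(
                hit.point,
                Quaternion.LookRotation(-head.forward, Vector3.up));
        }
    }

    // Called when the user confirms placement on the tethered phone.
    public void ConfirmPlacement()
    {
        placed = true;
        // From here the experience switches to hands-free, gesture-driven control.
    }
}
```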
In AR, the cooking UI is structured into three panels:
- Left: Ingredients, highlighting those relevant to the current step.
- Right: Step instructions + a timer utility, allowing world-anchored timers (e.g. hovering above pots/pans).
- Center: Step-segmented video of a chef from Kittch cooking alongside the user.
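To make the step structure concrete, here is a minimal sketch of how the center panel's step-segmented playback could be modeled (using Unity's VideoPlayer; the `RecipeStep` fields and controller are illustrative assumptions, not the shipped data model):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Illustrative data model for a recipe step; field names are assumptions.
[System.Serializable]
public class RecipeStep
{
    public string instruction;        // text shown on the right panel
    public int[] ingredientIndices;   // ingredients to highlight on the left panel
    public double videoStart;         // segment start in the chef video (seconds)
    public double videoEnd;           // segment end (seconds)
}

// Plays only the video segment belonging to the current step on the center panel.
public class StepVideoController : MonoBehaviour
{
    [SerializeField] VideoPlayer player;
    [SerializeField] RecipeStep[] steps;
    int current;

    public void GoToStep(int index)
    {
        current = Mathf.Clamp(index, 0, steps.Length - 1);
        player.time = steps[current].videoStart;  // seek to the segment start
        player.Play();
    }

    void Update()
    {
        // Pause at the end of the segment instead of running into the next step.
        if (player.isPlaying && player.time >= steps[current].videoEnd)
            player.Pause();
    }
}
```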
My Contributions
As the sole developer, I implemented the entire application, including:
- App flow and state management: recipe selection → preview → AR setup & placement → immersive cooking experience (see the state-machine sketch after this list).
- Integrated Snapdragon Spaces SDK features, including Dual Render Fusion, plane detection, spatial anchors, and hand tracking.
- Developed the UI logic and interaction scripts, including gesture-based controls designed for cooking (hands-free when your hands are dirty).
- Developed a custom suite of reusable XR UX components: axis-aligned panels, orbital alignment, and draggable/dockable elements.
- Built utility features like world-space timers (sketched after this list) and video playback segmented by recipe steps for step-by-step instruction.
- Collaborated with UX and creative teams to realize spatial UI designs in a glasses-friendly format.
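For reference, a minimal sketch of the app flow as a simple state machine (a Unity-style illustration; the `AppFlowController` class is an assumption that mirrors the flow above, not the actual project code):

```csharp
using UnityEngine;

// Illustrative state machine for the overall app flow; the states mirror
// the sequence above, but this class is a sketch, not the shipped code.
public enum AppState { RecipeSelection, RecipePreview, ArPlacement, Cooking }

public class AppFlowController : MonoBehaviour
{
    public AppState State { get; private set; } = AppState.RecipeSelection;

    // Each transition enables the UI that belongs to the new state:
    // phone screens for selection/preview, glasses content for placement/cooking.
    public void GoTo(AppState next)
    {
        Debug.Log($"App flow: {State} -> {next}");
        State = next;

        switch (next)
        {
            case AppState.RecipeSelection:
            case AppState.RecipePreview:
                // Phone-first UI: browse recipes, preview steps and ingredients.
                break;
            case AppState.ArPlacement:
                // Glasses take over: gaze targeting plus phone confirmation.
                break;
            case AppState.Cooking:
                // Fully hands-free: panels, timers, and video driven by gestures.
                break;
        }
    }
}
```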
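And a similarly hedged sketch of the world-space timer utility, e.g. a countdown pinned above a pot that keeps facing the user (`WorldSpaceTimer` and its fields are illustrative):

```csharp
using TMPro;
using UnityEngine;

// Illustrative world-space countdown timer that can be pinned above a pot or pan.
public class WorldSpaceTimer : MonoBehaviour
{
    [SerializeField] TextMeshPro label;   // world-space text on the timer widget
    float remaining;
    bool running;

    public void StartTimer(float seconds, Vector3 worldPosition)
    {
        transform.position = worldPosition;  // anchor the widget where the user placed it
        remaining = seconds;
        running = true;
    }

    void Update()
    {
        if (!running) return;

        remaining = Mathf.Max(0f, remaining - Time.deltaTime);
        int m = (int)(remaining / 60f);
        int s = (int)(remaining % 60f);
        label.text = $"{m}:{s:00}";

        // Keep the readout facing the user while staying pinned in place.
        transform.rotation = Quaternion.LookRotation(transform.position - Camera.main.transform.position);

        if (remaining <= 0f) running = false;  // a real build would also cue an alert
    }
}
```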
Technical Highlights & Challenges
The main challenge was working with bleeding-edge Snapdragon Spaces features that weren’t publicly available yet. Because this app doubled as a showcase for upcoming tech, I had to integrate preview builds directly with Qualcomm engineers’ support.
Snapdragon Spaces feature highlights:
- Dual Render Fusion: balancing simultaneous rendering and input across the tethered Moto phone and the A3 glasses. I built a system where the phone handled recipe browsing and content placement, then seamlessly transitioned to the AR glasses for hands-free cooking.
- Spatial Anchoring: stored anchors allowed users to skip re-placing content across sessions (see the sketch below).
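A hedged sketch of that persistence flow: the real app used Snapdragon Spaces' spatial anchor APIs, so the SDK calls below are stubbed behind a hypothetical `ISpatialAnchorStore` interface to show the session flow without claiming the actual API surface.

```csharp
using System.Threading.Tasks;
using UnityEngine;

// Hypothetical wrapper around the SDK's anchor save/load calls; the real
// Snapdragon Spaces API differs, this only illustrates the session flow.
public interface ISpatialAnchorStore
{
    Task<string> SaveAnchorAsync(Pose pose);        // returns a persistent anchor id
    Task<Pose?> LoadAnchorAsync(string anchorId);   // resolves a previously saved anchor
}

public class PanelAnchorPersistence : MonoBehaviour
{
    const string AnchorKey = "cooking_panel_anchor";   // illustrative storage key
    ISpatialAnchorStore store;                          // injected SDK wrapper

    public async Task<bool> TryRestoreAsync(Transform panels)
    {
        // If an anchor from a previous session resolves, skip placement entirely.
        string id = PlayerPrefs.GetString(AnchorKey, "");
        if (string.IsNullOrEmpty(id)) return false;

        Pose? pose = await store.LoadAnchorAsync(id);
        if (pose == null) return false;

        panels.SetPositionAndRotation(pose.Value.position, pose.Value.rotation);
        return true;
    }

    public async Task SaveAsync(Transform panels)
    {
        // Persist the placement so the next demo session starts already set up.
        string id = await store.SaveAnchorAsync(new Pose(panels.position, panels.rotation));
        PlayerPrefs.SetString(AnchorKey, id);
        PlayerPrefs.Save();
    }
}
```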
I also worked closely with our UX and creative team to prototype the UI. We wanted content that could move smoothly around the kitchen with the user without being stuck to their face, so I created a set of generic components for axis-aligned panels, orbitally aligned panels, and draggable/dockable content, which let us evaluate what felt best. We have since reused these components across multiple other projects.
Example UI/UX Implementations

Axis-bound UI slides along a line in space while staying aligned with the camera
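A rough Unity-style sketch of that behavior (the `AxisBoundPanel` component and its fields are assumptions): the panel's position is the camera position projected onto a fixed line segment, and its rotation billboards toward the camera.

```csharp
using UnityEngine;

// Illustrative axis-bound panel: slides along a fixed line segment in the room
// to follow the user, while always turning to face the camera.
public class AxisBoundPanel : MonoBehaviour
{
    [SerializeField] Transform lineStart;   // endpoints of the line the panel may slide along
    [SerializeField] Transform lineEnd;
    [SerializeField] float followSpeed = 4f;

    void LateUpdate()
    {
        Vector3 camPos = Camera.main.transform.position;

        // Project the camera position onto the line segment and clamp to its ends.
        Vector3 axis = lineEnd.position - lineStart.position;
        float t = Vector3.Dot(camPos - lineStart.position, axis) / axis.sqrMagnitude;
        Vector3 target = lineStart.position + axis * Mathf.Clamp01(t);

        // Ease toward the target so the panel glides rather than snaps.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);

        // Stay aligned with (facing) the camera.
        transform.rotation = Quaternion.LookRotation(transform.position - camPos, Vector3.up);
    }
}
```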