





A promotional AR app for the release of Spider-Man: No Way Home. Users can interact with Spider-Man in AR, explore his suits, and browse “Peter Parker’s Phone”, a simulated smartphone filled with bespoke film-related content.
Overview
The app was designed as an interactive marketing experience to drive engagement around the film’s release.
- Landing Page: Entry point with options for the AR photo op/suit viewer, Peter Parker’s Phone, ticket sales, and a Snapchat social lens linkout.
- AR Photo Op: Users selected a Spider-Man suit, placed him in their environment, and captured photos or videos.
- Peter Parker’s Phone: A simulated phone UI where fans could browse through voicemails, read texts from core characters, and scroll through Peter’s photo gallery. Content was fully localized and designed for post-launch updates.
Peter Parker's Phone
My Contributions
As a developer on this project, I worked on many different facets, including backend and live-ops, front-end implementation, and visuals.
- Built the content ingest system that parsed JSON from a CMS and generated Unity prefabs for messages, voicemails, and photos in Peter Parker's Phone.
- Implemented localization across UI elements, dynamically handling layout and font rendering.
- Assisted with AR suit selection and interaction logic for the photo op feature.
- Created loading screen VFX, including animated background effects.
- Implemented analytics using Firebase.
Technical Highlights & Challenges
I designed and implemented the system that parsed CMS-delivered JSON into prefabs. This let us support many localizations and ship post-launch updates to complex objects such as the home screen cards and the contents of Peter Parker's Phone: text messages, voicemails, and the photo gallery. Instead of just solving the immediate need, I deliberately structured the ingestion system as a reusable tool. It was one of my first chances to build developer infrastructure that improved team velocity, and it was reused on later projects like Dinotracker AR.
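In rough strokes, the ingest looked something like this. This is a simplified sketch, not the production code: the type and field names are illustrative, and `VoicemailView` stands in for a view component that binds the parsed data onto the prefab.

```csharp
using UnityEngine;

// Illustrative CMS payload shapes (the real schema differed).
[System.Serializable]
public class VoicemailEntry
{
    public string id;
    public string transcriptKey; // localization key for the transcript
    public string audioUrl;
}

[System.Serializable]
public class PhoneContent
{
    public VoicemailEntry[] voicemails;
}

public static class ContentIngest
{
    // Parse CMS-delivered JSON and spawn one configured prefab per entry.
    public static void BuildVoicemails(string json, GameObject prefab, Transform parent)
    {
        PhoneContent content = JsonUtility.FromJson<PhoneContent>(json);
        foreach (VoicemailEntry entry in content.voicemails)
        {
            GameObject instance = Object.Instantiate(prefab, parent);
            instance.name = entry.id;
            // A view component on the prefab applies localized text, audio, etc.
            instance.GetComponent<VoicemailView>().Bind(entry);
        }
    }
}
```

The same pattern (schema class, parser, prefab binder) repeated for texts and photos, which is what made it straightforward to lift onto later projects.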
Since Unity's Canvas system does not adapt automatically to varying content, I reworked layouts to handle multiple screen sizes and dynamic text lengths. This included building loading states, scaling rules, and UI that stayed legible across different devices and localizations.
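As one concrete example of the wiring involved, a message bubble can be made to size itself from its localized string using Unity's built-in UI components. A sketch, not the shipped code:

```csharp
using UnityEngine;
using UnityEngine.UI;

public static class AdaptiveBubble
{
    // Make a bubble derive its height from its text's preferred size,
    // so long localized strings grow the bubble instead of overflowing it.
    public static void Configure(GameObject bubble)
    {
        var fitter = bubble.GetComponent<ContentSizeFitter>();
        if (fitter == null) fitter = bubble.AddComponent<ContentSizeFitter>();
        fitter.verticalFit = ContentSizeFitter.FitMode.PreferredSize;

        // The parent list then stacks bubbles of varying heights.
        var list = bubble.transform.parent.GetComponent<VerticalLayoutGroup>();
        if (list != null)
        {
            list.childControlHeight = true;
            list.childForceExpandHeight = false;
        }
    }
}
```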
At one point, we planned to have Spider-Man swing out of a Doctor Strange-style portal for the AR reveal. I prototyped the VFX for the portal itself and the shaders needed to render Spider-Man passing through it. The shaders used a stencil-buffer approach: one copy of Spider-Man was rendered only inside the portal, masked by the stencil operation, while a second copy was rendered with an alpha clip driven by the portal's position and forward vector, discarding the parts of the mesh behind the portal plane. Combined, the two copies created the full inside-to-outside transition. I also spent a significant amount of time on the portal's particles, layering spark particles with rotational noise and gravity to mimic sparks flying off the rim. Ultimately, we scrapped the portal reveal: the client needed it to match the film 1-to-1, which would have required a much higher particle budget, realistic collisions, bloom/post-processing, and some kind of simulated aerodynamic flow in the particles. Given more time (the portal was prototyped late in the project), I probably could have gotten it there, but higher-priority tasks took over as we approached the app's release.
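The masking setup can be sketched roughly like this (simplified ShaderLab/HLSL; property names are hypothetical, not the actual prototype shader):

```hlsl
// Two-copy portal masking sketch.
//
// 1) The portal quad writes a stencil value wherever it is visible:
//      Stencil { Ref 1  Comp Always  Pass Replace }
// 2) The "inside" Spider-Man copy draws only where that value was written:
//      Stencil { Ref 1  Comp Equal }
// 3) The "outside" copy clips away everything behind the portal plane:

float3 _PortalPos;      // portal position, set from C# each frame
float3 _PortalForward;  // portal forward vector, set from C# each frame

float4 frag(Varyings input) : SV_Target
{
    // Signed distance from the portal plane; negative = behind the portal.
    float side = dot(input.positionWS - _PortalPos, _PortalForward);
    clip(side); // discard fragments behind the plane

    return ShadeSpiderMan(input); // placeholder for the normal lit pass
}
```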
Early prototype testing masking
Final iteration of portal VFX
I also developed VFX for the app's loading screen, mimicking the design of the inside-out suit from the movie. Using the background provided by our creative team, I made a flow map texture and a custom shader to create a "signals on a circuit board" effect.
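The shader followed the standard two-phase flow map technique. A simplified fragment-shader sketch (HLSL, with hypothetical property names):

```hlsl
// Two-phase flow map scroll: distort UVs along a painted direction field,
// crossfading two offset samples so the distortion periodically resets.
float2 flowDir = tex2D(_FlowMap, uv).rg * 2.0 - 1.0; // decode [0,1] -> [-1,1]

float phase0 = frac(_Time.y * _FlowSpeed);
float phase1 = frac(_Time.y * _FlowSpeed + 0.5);

float4 sampleA = tex2D(_MainTex, uv - flowDir * _FlowStrength * phase0);
float4 sampleB = tex2D(_MainTex, uv - flowDir * _FlowStrength * phase1);

// Blend weight peaks when the other phase is mid-distortion,
// hiding each phase's reset.
float blend = abs(1.0 - 2.0 * phase0);
float4 circuitColor = lerp(sampleA, sampleB, blend);
```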


Base texture is combined with the flow map to create the flowing circuit effect
Results
- Over 1.1M total downloads (327,000+ iOS, 850,000+ Android).
- Named a 2023 Webby Awards Honoree for Best Use of Augmented Reality.
- Strong engagement with the AR photo op, contributing to buzz around the film release.
Reflection
This was one of my first professional large-scale projects. It was where I learned how to collaborate closely with other developers and work within a multi-disciplinary team on a production timeline. I’m proud of my contributions, especially the CMS ingest system that we continued to reuse on future projects.
Related Projects





