Back in 2018, Artefact was hired by Facebook Reality Labs to help define a new product, code-named Nazare. A few of us at Artefact created a product strategy recommendation, including the hardware architecture, device ecosystem, top scenarios, and an ethical framework for the product's outcomes.
The project helped increase interest in the AR glasses product within Facebook Reality Labs, and we were asked to follow up with a second project creating prototypes and concept videos of the top scenarios. I led a team of designers, game developers, and motion artists to produce the concept videos and prototypes. They were built in Unity and included live remote avatars, shared 2D and 3D media, hand input, and OS concepts.
Rather than create all of these in After Effects and C4D, I proposed that we use the prototype itself to stage live performances and film the vision videos through ZED Mini cameras attached to our Rifts. We achieved all of this in four months and demoed our live remote presence at an internal symposium, garnering more interest in the product.
The project was green-lit, and I was hired by Meta full-time as a product designer to ship it. The project was renamed Nazare; four years later the product still hasn't shipped, but it's hopefully on its way.
Here's a brief video that I directed for the reveal of Nazare at Meta Connect 2022.
This video shows a few of the features I worked on at Meta.
Augmented Messaging
Allows people to connect over any of Meta's communication channels, through standard 2D messaging but also through new forms of spatial communication. The video here shows a game being spawned from a message while retaining the link to the people who sent it. I led the design for this feature initially, then handed it off when I was asked to work on calling.
Augmented Calling
Allows people to see and hear each other remotely as if they were in the same room, all while interacting with digital content together. We explored and prototyped many ways of laying out content and people in physical space to maximize the feeling of presence and togetherness. In this video, you can see the concept that content is the glue remote people gather around: moving the content moves the people along with it. We made recommendations and worked with the hardware team to include sensors that would enable these scenarios. After a year or two of working on this feature and building up a team of designers, I moved to games.
AR Games
use their environment to meaningfully impact gameplay. This video shows a placeholder for what an AR game might be. More recently, on the AR games team, I was able to flesh out what an AR game actually will be, leading the design for an AR puzzle game and driving its level design, mechanics, story, and aesthetics. I had just finished my 33rd level when I was let go. I'm really hoping it will continue and ship in the near future.
Experience Narrative
I also worked on something we called the Experience Narrative for the Nazare team. I proposed that while the rest of the experience team went deep on individual applications, one designer would work broadly across all the experiences and create vision videos showing a day in the life of our future product. Another designer picked up this work and took it all the way through shooting before he left Meta. I picked it up in post-production and saw it through to completion. It was a valuable artifact that helped the team think across silos, paint a realistic vision for the product, and give engineering a target to aim for. All this to say: I'm very comfortable directing and producing concept videos.