Immersive Experience Design: Expert Techniques
Self-guided. Co-created. Multisensory. Nonlinear. These are just some of the qualities attributed to immersive experience design, a wide-ranging discipline that encompasses augmented reality (AR), virtual reality (VR), interior architecture, sound design, visual storytelling, and more. British creative director Alex McDowell coined the phrase “immersive design” in 2007 to describe his approach to creating VR experiences, but the current understanding of the term can apply to any designed experience that seeks to draw people into a world—whether that world is virtual, digital, physical, or some combination of the three.
What may have seemed esoteric in 2007 has now gone mainstream: People routinely blur the lines between various “worlds” with massively multiplayer online games, AR filters on social media, and spatial audio in AirPods. For brands and products aiming to engage new users, focusing on immersive design feels less like science fiction and more like table stakes.
In this Q&A discussion, two Toptal experts share examples of immersive experiences and their ideas on how to make immersive design come alive in practice. Dwayne Grech has more than 15 years of experience helping to create immersive designs such as AR/VR interfaces, touch walls, and projection mapping for brands like Nike, Bank of America, and the city of Las Vegas. During his 25-year design career, Clarke Noone has collaborated on omnichannel experiences for clients like Google, Manchester United, and Formula 1.
This conversation has been edited for clarity and length.
To start things off, how do you define “immersive experience design” in the context of your own work?
Dwayne Grech: This is the ongoing challenge, and it evolves so much. I see the term as an umbrella over multiple disciplines that would’ve normally existed in isolation over the course of a project. Let’s say you have a global campaign coming from an advertiser: We’d typically have to bring in architects, storytellers, production companies, set designers, and developers, and they’d all have their own separate timelines, objectives, and styles. We were hard-pressed, behind the scenes, to try to stitch them together, and you never felt an organic process from A to Z.
Now, under the umbrella of immersive design, we have architects, futurists, storytellers, and animators working together. It’s about thinking holistically: How do we benefit from all the different disciplines at once? If this were a complicated Venn diagram, we’re at the intersection of communications, architecture, psychology, storytelling, and animation. This is exciting because it allows a designer like me to play in a diverse sandbox and have a lot of tools, as opposed to being pigeonholed.
Clarke Noone: You’re not just a UI or UX designer anymore. You’re considering the multisensory experience and bringing in elements of haptics and spatial audio. That being said, the fundamental design principles that we have spent decades perfecting are still applicable to the immersive space—you still need intuitive UX flows and cohesive UI patterns built from well-maintained component libraries.
I’m excited by all of this because I get to engage my main passions: music, design, and gaming. To have an arena where I can bring all these interconnected elements into my ideation process is really fulfilling.
Grech: Designers often talk about bringing the audience as a collaborative actor into storytelling, as opposed to the audience being a passive observer. There are aspects of audience/user involvement in VR, but whenever the audience becomes part of the storytelling, that’s the immersion. You’re not going to see a movie. You are in the movie.
Noone: Dwayne hit the nail on the head there. VR is immersive. But interactive projection mapping can also be immersive. The magic of wrapping the audience in a multisensory experience that transports them to another destination—real or imagined—and creating these new worlds using nothing more than a modern laptop is truly thrilling.
Can you discuss the first project in which you successfully used immersive design principles to create a compelling experience?
Noone: I did a project for Arthrolense, bringing knee surgery into augmented reality. Two world-renowned knee surgeons came up with the concept of using the HoloLens kit to transform knee surgery, which is a very difficult procedure to perform. If you get it wrong, the patient potentially doesn’t walk again. As much as it’s been refined with modern technology, when you’re drilling into the knee and making incisions, there’s always an element of guesswork. They wanted to remove the guesswork by being able to see the entire operation through the HoloLens and using augmented reality to calculate incisions with submillimeter precision.
As much as the final product was about taking a scan of the knee and being able to see it in 3D from all angles, the principles for 2D UI and UX design also applied. In terms of the actual on-screen UI, the surgeon would begin the operation with an iPad. That would set up all the measurements to feed into the HoloLens, which would then create a mesh of the knee using those numbers. It was the first immersive design project that I’d worked on, and it was interesting to see how much of my previous work as “just” a UI/UX designer was still relevant.
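To make that handoff concrete, here is a hypothetical sketch of the kind of payload an iPad planning step might pass to the headset. The field names and structure are illustrative assumptions, not Arthrolense’s actual data model.

```typescript
// Hypothetical shape of the iPad-to-HoloLens handoff described above.
// Field names are illustrative; the real data model is not public.
interface KneeScanMeasurements {
  patientId: string;
  capturedAt: string;                   // ISO 8601 timestamp from the iPad session
  landmarks: Landmark[];                // anatomical reference points from the scan
  plannedIncisions: PlannedCut[];
}

interface Landmark {
  name: string;                         // e.g., "femoral_condyle_medial" (illustrative)
  position: [number, number, number];   // millimeters in the scan's coordinate space
}

interface PlannedCut {
  entryPoint: [number, number, number];
  direction: [number, number, number];  // unit vector
  depthMm: number;
  toleranceMm: number;                  // the "submillimeter precision" budget
}

// The headset side would consume this payload and build a mesh aligned to it.
function buildKneeMesh(measurements: KneeScanMeasurements): void {
  // Placeholder: a real implementation would register the landmarks against
  // live tracking data and render the mesh plus incision guides.
  console.log(`Building mesh from ${measurements.landmarks.length} landmarks`);
}
```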
Grech: My first real dive into proper immersive experience design was in 2017 when I was working for the Las Vegas tourism board. They came to me and my team saying, “We need to flip the script on what the young generation thinks about Las Vegas. It’s no longer about gambling. It’s about experiences, it’s about entertainment, it’s about art and culture.” These things were not really synonymous with Vegas, so they needed an engaging approach.
We said, “Let’s create five distinctive Vegas experiences in VR. We’ll find internationally known artists, bring them to Las Vegas, and have them give their take on Las Vegas in their style, and we’ll bring it to life in 3D.”
The immersive design came into play in this gallery space that we took over. The graphic artist Signalnoise created VEGAS RISING, one of five VR experiences in the gallery. Signalnoise wanted to capture the fun and excitement of Vegas as if it were an adult amusement park by recreating the physical elements of Vegas as these exaggerated structures that grow out of the Nevada desert. The audience could traverse the physical structures in a thrilling race reminiscent of ’80s action movies.
Signalnoise’s style is best described as “vibrant retro Tron,” which we brought to life in Cinema 4D. The boldness of the art direction made the experience visually striking and an instant audience favorite. To complete the production, we hired Whitenoise Lab to compose a unique ’80s soundtrack that complemented the art style, pace, and excitement of VEGAS RISING.
It became a road show once it was all done: We partnered with Samsung and took it to seven cities around the globe, bringing this digital and physical experience of Vegas to all these different places.
This was sort of my first “A-ha, I think we’ve found a new medium” moment. Because as I was saying before, all the different aspects of the project could be considered individual executions, but when you bring them together into one larger holistic execution, you immerse people through storytelling and the emotional connection they form with the brand. And maybe they come out of it and say, “You know what? I do want to go check out Vegas.”
How can immersive experience design be applied to physical spaces, such as retail stores?
Grech: This is near and dear to me because it’s what I’m devoted to right now: rethinking the retail experience. How do you sell the environment or the destination first, and then have the product act as the narrative device that gets you there?
My design team examines all of our projects using four principles. We look at it from an emotional connection: How are we creating the narrative device for the brand, product, or service? We also want to create a sense of transformation, whether from a psychological, physical, or consumer perspective. There needs to be a sense of utility and purpose behind all of it. And finally, of course, community. We want to create shared experiences where people are able to engage other people and their ideas.
I didn’t work on the project, but Canada Goose’s Elements Room is a good use case for immersive retail experiences. Thanks to the jackets’ cultural influence and popularity, we all know they keep you warm. But if you don’t own one, you don’t quite get it. The project’s main retail showroom space created an atmospheric experience using an array of video projections showing nature footage from cold environments—and real snow. You could feel the cold, which made the company’s super-warm—if expensive—coats a more compelling purchase.
Noone: Designers should look for ways to incorporate the physical space into the experience. For example, I was thinking about silent discos and the reason why they never really took off as an “immersive” experience. And it’s because people want social connection when they’re in a physical space. And in silent discos, you’re cut off from the people around you: You can see them, but you can’t hear them. So if you’ve got multiple people in a physical space or with VR headsets on, you have to connect them somehow.
A good example I tried out recently was a VR version of laser tag where you go into this big warehouse in teams of 10 people, each with a VR headset on. The reason why it works well is that the internal architecture of the warehouse—every wall, floor, ceiling, and obstacle—is mapped and matched perfectly to an extraterrestrial VR environment. Obviously, your brain knows you’re not in reality, but the VR world is quite immersive. If I see a wall, I can reach out and touch it.
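Here is a minimal sketch of that one-to-one mapping idea, with illustrative types rather than any specific engine’s API: the virtual geometry simply copies the measured physical geometry and reskins it.

```typescript
// A sketch of the co-location idea Noone describes: virtual walls are placed at
// the same coordinates as the physical ones, so reaching out to touch a rendered
// wall lands on a real surface. Types and themes are illustrative.
interface WallSegment {
  start: [number, number]; // meters, measured on the warehouse floor plan
  end: [number, number];
  height: number;
}

interface VirtualWall extends WallSegment {
  theme: 'alien-hull' | 'rock-face'; // purely cosmetic reskin of the real wall
}

function buildVirtualWalls(measured: WallSegment[]): VirtualWall[] {
  // The geometry is copied 1:1; only the surface treatment changes.
  return measured.map((wall, i) => ({
    ...wall,
    theme: i % 2 === 0 ? 'alien-hull' : 'rock-face',
  }));
}
```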
How does immersive design differ when applied to purely digital experiences, such as websites and apps?
Noone: Just because an experience is boxed in on a small screen doesn’t mean it’s going to be any less immersive, or any less captivating. Pokémon GO blew everyone’s minds when it first came out, and people are still playing it.
When creating an immersive digital experience, you bring along the same UI/UX principles that you generally think about when you’re designing for mobile. Catching a Pokémon needs to be intuitive. It’s just another app, at the end of the day, and you have to strike a fine balance between teaching users new methods of interaction like hand tracking and relying on interactions they’ve been trained on for the last few decades, like point and click.
Grech: I agree with everything Clarke said 100%. The beauty of AR is that it’s so widely available. All of our smartphones are AR-ready. Whether it’s an immersive concert or Pokémon GO, it’s easy to provide people with access to an immersive experience, and the UI methodology is the same.
Noone: I’m currently in discussions with a music platform for DJs about creating an immersive version of its website. So if someone were to visit its 2D website on a Meta Quest headset, there would be a button in the bottom right-hand corner that would say, “Do you want to view this website in XR [extended reality]?” And it turns that 2D website into 3D. But that doesn’t necessarily mean they’re just going to be looking at a 2D screen inside a virtual sphere—we break what is essentially a 2D screen out into 3D components. For example, if it were the Canada Goose website, what was previously just 2D images of the jackets would then pop out into 3D versions of the jackets that you could spin around and view from all angles.
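For readers curious how a 2D site can offer that kind of upgrade path, here is a sketch using the WebXR Device API in the browser. It assumes WebXR type definitions (e.g., @types/webxr) are available, and the element ID is an illustrative placeholder, not the platform’s actual implementation.

```typescript
// Show a "View in XR" button only on devices that can run an immersive session,
// such as the browser on a Meta Quest headset.
async function offerXrUpgrade(): Promise<void> {
  const xr = (navigator as Navigator & { xr?: XRSystem }).xr;
  const button = document.getElementById('view-in-xr'); // hypothetical element ID
  if (!xr || !button) return;

  const supported = await xr.isSessionSupported('immersive-vr');
  if (!supported) return;

  button.hidden = false;
  button.addEventListener('click', async () => {
    // Must be called from a user gesture, which a click handler satisfies.
    const session = await xr.requestSession('immersive-vr');
    button.hidden = true;

    // From here, the 2D page content would be rebuilt as 3D components
    // (e.g., flat product images swapped for rotatable models).
    session.addEventListener('end', () => {
      button.hidden = false;
    });
  });
}

offerXrUpgrade();
```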
What are your thoughts on engaging multiple sensory elements in immersive designs?
Grech: Immersive (or spatial) audio is becoming such an important factor. This is not something you can just grab clips of and drop in. It’s something that has to be delicately curated to fit a specific environment. That’s as important as creating the visual content.
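As one illustration of what curating audio for a specific environment can mean in practice, here is a minimal Web Audio sketch that positions a looping sound at a fixed point around the listener. The asset path is hypothetical, and this is a browser-side sketch rather than anything from Grech’s toolchain.

```typescript
// Place a looping ambient sound a few meters to the listener's right using the
// Web Audio API's PannerNode with HRTF panning.
async function playPositionedSound(ctx: AudioContext): Promise<void> {
  const response = await fetch('/audio/desert-wind.mp3'); // hypothetical asset
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  // HRTF panning gives a convincing sense of direction and distance on headphones.
  const panner = new PannerNode(ctx, {
    panningModel: 'HRTF',
    distanceModel: 'inverse',
    positionX: 4,   // meters to the listener's right
    positionY: 0,
    positionZ: -2,  // slightly in front (listener faces -Z by default)
  });

  source.connect(panner).connect(ctx.destination);
  source.start();
}
```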
There’s also the question of incorporating scents. No one’s quite cracked that yet. Aromatherapy machines can have a variety of different smells that can be digitally activated. It’s great at the beginning, but then they all blend into each other, and it smells like garbage. It’s an interesting challenge to try to replicate smells for different environments, but there isn’t a proper solution for it yet.
Noone: The DJ project is obviously very audio-focused. We’re creating a mixed-reality experience where DJs can come in and have two turntables, a mixer, and a library that they can select their music from, and essentially perform with. We’re working with the Meta SDK, which is full of all kinds of magic: hand tracking, face tracking, full-body tracking, gestures, you name it. You start thinking, “I’ve got all these things to choose from, should we just do them all?” Obviously you can’t, and you definitely shouldn’t.
For example, we were really impressed by the SDK’s voice control capabilities. For any other app, this would be a nice feature. But as you can imagine, the last thing you want to hear while music is playing is the DJ’s voice dictating commands. So we had to back off of that, and we ended up using hand gestures instead: air swipes to move through the music library, volume up, volume down, play, all this kind of stuff. You essentially don’t need to touch any kind of UI, and it creates this dance while the DJ is performing, so it works well.
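A rough, language-agnostic sketch of that gesture-to-action mapping is below; the real build uses Unity and the Meta SDK, so these gesture names and deck controls are illustrative stand-ins.

```typescript
// Map recognized air gestures to deck actions so the DJ never has to reach
// for a rendered UI panel mid-performance. All names are illustrative.
type Gesture = 'swipe-left' | 'swipe-right' | 'palm-up' | 'palm-down' | 'pinch';

interface DeckControls {
  nextTrack(): void;
  previousTrack(): void;
  volumeUp(): void;
  volumeDown(): void;
  togglePlay(): void;
}

function handleGesture(gesture: Gesture, deck: DeckControls): void {
  switch (gesture) {
    case 'swipe-right': deck.nextTrack(); break;
    case 'swipe-left':  deck.previousTrack(); break;
    case 'palm-up':     deck.volumeUp(); break;
    case 'palm-down':   deck.volumeDown(); break;
    case 'pinch':       deck.togglePlay(); break;
  }
}
```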
How are AI tools changing the way you design and build immersive experiences?
Noone: I’ve created my own custom GPT language model, which is a persona of a Unity developer. I use it for 90% of the Unity scripting for my projects now. On the DJ app that we’re currently building, I prompted it to build me an audio-reactive lighting rig in Unity. Imagine a DJ booth with all the different lighting and laser fixtures that all go in time to the music—I obviously know what I want it to look like, but for me to try and even start thinking about how to code it would be just a no-go. But in a matter of two days, the AI tool built this whole lighting rig that’s triggered by the left and right channels of the master audio, and it’s incredible.
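Noone’s rig lives in Unity, but the underlying idea can be sketched in a browser with the Web Audio API: split the master signal into left and right channels and let each one drive a bank of lights. This is a sketch of the concept, not his implementation, and setLightIntensity stands in for whatever actually controls the fixtures.

```typescript
// Drive two banks of lights from the left and right channels of a master signal.
declare function setLightIntensity(bank: 'left' | 'right', level: number): void;

function attachAudioReactiveLights(ctx: AudioContext, master: AudioNode): void {
  const splitter = ctx.createChannelSplitter(2);
  const left = ctx.createAnalyser();
  const right = ctx.createAnalyser();
  left.fftSize = 256;
  right.fftSize = 256;

  master.connect(splitter);
  splitter.connect(left, 0);   // channel 0 -> left analyser
  splitter.connect(right, 1);  // channel 1 -> right analyser

  const bins = new Uint8Array(left.frequencyBinCount);

  // Average bin magnitude, normalized to 0..1.
  const level = (analyser: AnalyserNode): number => {
    analyser.getByteFrequencyData(bins);
    return bins.reduce((sum, v) => sum + v, 0) / (bins.length * 255);
  };

  const tick = () => {
    setLightIntensity('left', level(left));
    setLightIntensity('right', level(right));
    requestAnimationFrame(tick);
  };
  tick();
}
```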
AI in design is a bit of a double-edged sword. I’ve been working with a Unity developer for seven years now, and he’s amazing—one of those guys you can throw anything at and he’ll figure out how to do it. And I hate to say it, but I’m giving him a lot less work now. Using AI has shortened my development time by weeks, if not months.
Grech: I can say the same thing. We mostly use Unreal Engine, which has its own ChatGPT support, and it’s increased our developers’ productivity. We’re also exploring AI to help make 3D modeling and environmental design tasks more accessible through voice prompts. I can have an articulated conversation and see it being developed in a virtual 3D environment that I can explore. So designing 3D environments is no longer solely a task for 3D environmental artists.
How do you ensure that your immersive designs are inclusive and accessible to a diverse range of users?
Noone: We’ve definitely been thinking about this in the DJ app. As a DJ, you are typically in a standing position with your equipment around you. We’ve designed it as a 360-degree environment, so everything is within arm’s reach. Now, it would be very easy to fix that virtual equipment into position and expect that everyone who puts the headset on is going to be the same height, or that they’ll be standing at all. Obviously that’s not the case, so we’ve built handles into the mixing desk and turntables so you can move them up and down. For example, if you’re in a wheelchair and you’re in a sitting position, you can move those pieces of equipment right down to your lap. You can spin them around, arc them, bring them closer, whichever way works for you. And because hand tracking won’t work for people with only one hand or other mobility limitations, we’ve also ensured that every interaction you need to enjoy the experience can be done with physical controllers.
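A simplified sketch of how that kind of repositioning and input fallback might be modeled is below; all names are illustrative rather than taken from the actual app.

```typescript
// Each piece of virtual equipment carries a user-controlled offset set via its
// grab handles, and input falls back to controllers when hand tracking isn't usable.
interface Vector3 { x: number; y: number; z: number; }

interface EquipmentPlacement {
  defaultPosition: Vector3; // standing-height layout
  userOffset: Vector3;      // adjusted by the user with the grab handles
}

function resolvePosition(p: EquipmentPlacement): Vector3 {
  return {
    x: p.defaultPosition.x + p.userOffset.x,
    y: Math.max(0.3, p.defaultPosition.y + p.userOffset.y), // keep above the floor
    z: p.defaultPosition.z + p.userOffset.z,
  };
}

type InputMode = 'hand-tracking' | 'controllers';

function chooseInputMode(handTrackingAvailable: boolean, prefersControllers: boolean): InputMode {
  // Every interaction has a controller equivalent, so hand tracking stays optional.
  return handTrackingAvailable && !prefersControllers ? 'hand-tracking' : 'controllers';
}
```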
Grech: This is also where A/B testing is important, to know that you’re on the right track. You look at it from the traditional UX exploration approach, where you’re in that feedback loop until you’ve solved all the necessary problems. It’s important not to rush that. It’s not just about asking, “What color is the call-to-action button?” or “Is it responsive for mobile?” When we’re talking about three-dimensional space, whether it’s virtual or physical or both, all these other factors come into play that could exclude people or potentially become dangerous if not done right. This is another area where having people from multiple disciplines involved is essential. As a designer, I’m not an expert in architecture in terms of safety protocols, but when I work in that space, I need to consult somebody who is. In this way, designing immersive experiences is more challenging, but at the end of the day, it becomes this beautiful experience that the web can never come close to.