What Is Extended Reality?
What Is XR?
The difficulty with the definitions comes from the “X.” Depending on who you ask, it might not stand for anything. Some people use it as a placeholder, like a variable in a math problem; some even pronounce “XR” as “X Reality.” Others use XR not to mean “any reality” but “all reality,” for example to discuss immersive technology generally rather than one form at a time. People in this camp are more likely to say “XR” as “Extended Reality.” People have their preferences between the two “XR” uses, but both can be handy in different situations depending on what you’re talking about.

A lot of companies getting into immersive activations want to do it because they’re flashy. They might know that they want to do something with immersive technology but not whether they want to use AR, VR, or MR. Here the first use, “X Reality,” can be fitting because they’ll use only one form of immersive technology but don’t yet know which one. A lot of academics, journalists, and technologists use “XR” as “Extended Reality” because they’re not talking about just AR or MR or VR – they’re talking about all of these technologies at once. This use is particularly helpful when discussing solutions like Varjo Reality Cloud, which operates more like AR for an on-site user and more like VR for remote users.

So, what are the differences between the other “R” terms? And why might it or might it not be important to specify how they are being grouped?

The “-R” Abbreviations in Immersive Technology
VR, AR, MR – in all of those familiar abbreviations the “R” stands for “reality,” and that’s true for “XR” as well. But with XR being an umbrella term, it’s easier to understand if you also have a firm grasp on the other Rs.

AR – Augmented Reality
Augmented reality places virtual elements into a user’s view of their physical surroundings, using either a transparent lens or a live view of a camera feed, often on a mobile phone. Most modern virtual reality headsets have a similar function called “passthrough,” but this particular technology is still largely experimental except on professional-grade devices. The virtual elements in augmented reality activations aren’t usually responsive – they add value to the user’s surroundings, or the user’s surroundings add impact to the virtual elements. For example, in the AR lookbook that ROSE developed with KHAITE, users could see models walking in their actual surroundings or view virtual representations of items in their own homes.

MR – Mixed Reality
Mixed reality is similar to augmented reality in that it all starts with the user’s environment. However, the virtual elements in an MR experience are much more intelligent and interactive. They may interact believably with one another or with the environment. They may also collect and display information on the environment from connected devices or onboard sensors. Mixed reality requires a lot more computing power, both to drive the interactive virtual elements and to display them in a meaningful way. As a result, most mixed reality experiences are available exclusively on dedicated mixed reality devices like Magic Leap or Microsoft’s HoloLens. GigXR’s Insight series with ANIMA RES uses a HoloLens headset to display detailed and interactive anatomy models in a healthcare and education solution. If more than one person has a HoloLens, they can join the same session, or one presenter with a headset can stream or record a session for remote users without access to headsets.

VR – Virtual Reality
Virtual Reality is entirely virtual. The user’s natural field of view is entirely replaced by computer-rendered settings and elements, potentially including other users represented as avatars. That doesn’t mean that everything in a VR experience has to be built from the ground up. For example, products like Microsoft Mesh can place a live volumetric capture of an individual within a virtually constructed environment. Similarly, some VR experiences take place within 3D images or videos. VR is popular in gaming and social applications but is also used for remote collaboration, design, and training. In fact, 3lb XR and 3lb Games design enterprise training simulations and other solutions as well as games, with the two practices cross-pollinating to produce intuitive, immersive enterprise solutions as well as fun and challenging entertainment experiences.

The Acronym of Possibilities
Whether XR means one unspecified form of immersive technology or all forms of immersive technology together depends on who uses it and in what context. It’s also one of those terms that end users of the technology don’t really use at all – it’s primarily used at a relatively higher level of discourse. With this knowledge, you’ll probably be able to tease out what someone means when they say “XR,” and if you can’t, it’s okay to ask them to clarify. This is an emerging technology with an emerging dictionary of terms, and everyone being on the same page is more important than appearing to understand nuanced specialist terminology.

Why is AR So Appealing to Marketers?
Augmented Experiences – Not Augmented Ads
Augmented Reality (AR) technology superimposes digital elements over a user’s live camera feed. Because most modern smartphones can run most AR experiences, just about everyone has access to AR content. That’s a powerful tool for companies looking to grow their brands. “Currently, for brands using AR to sell goods, it is quite common to use the technology to digitally place real-world items in the user’s environment. While this is a good application of the technology, brands would be mistaken to stop just there. With a bit more imagination AR can be used to create an experience that has a much more emotional impact on the user.” Because AR relies on the view of the user’s physical surroundings, including the objects, people, and settings that are meaningful to them, AR experiences are inherently personal in a way that no other medium is. That matters because a strong brand isn’t just about “stuff,” either. Bridging physical and digital experiences can help to convey values that aren’t just material. Often, the most successful branded activations aren’t about selling things at all. Rather, experiential AR is about communicating with people on a personal level by letting them explore the world around them through the window of augmented reality.

Experience Something New
Because AR is an emerging technology, we can limit ourselves by thinking of it strictly as a way to experience futuristic applications. AR can also allow users to put themselves in the past or experience another place as it is today. For brands that have long histories or a far reach, this can be a surprisingly impactful way to engage your community.

Step Into a Memory
Martin was an iconic sitcom that ran from 1992 to 1997. The show opened with cast members posing and dancing in front of solid-color backgrounds and their names in the show’s memorable font. This unique and memorable piece of television history was begging for an AR twist. In 2022, the surviving cast members reunited on BET+ to celebrate the 30th anniversary of the show’s premiere. At the event, visitors had access to a screen where they could dance and pose while the magic of AR placed them into the show’s familiar opening. They could then keep and share the clips, or edit them together to make their own show openings. Fans of the series were already excited to be at the reunion, which was its own piece of Martin history. The AR experience, however, allowed them to be more than viewers of an event: they were able to participate in the show’s history in a unique and memorable way.

Visit Miami Without Leaving Home
“Priceless” is a promotional initiative for MasterCard holders, giving them access to membership perks including online experiences. These experiences are increasingly taking place in Augmented Reality. MasterCard recently worked with ROSE and 8th Wall to create an AR tour of artwork in Miami’s Design District. In this window-to-another-world experience, the user’s phone became their ticket to a guided tour of the renowned art installations. Touchscreen navigation even allowed Priceless members to move around the artworks to see them from any angle in their 360-degree virtual view, just as they would if touring the Design District in person. A plane ticket to the same experience in person would have been a hefty gift from MasterCard and a hefty commitment from card holders, but the AR experience was achievable for both.

See Yourself Differently
In the Martin example, only the background was augmented and the people stayed the same. However, AR filters and lenses – the joy of modern social media – can help viewers see themselves in new ways as well. Using AR for social media marketing is also a good business strategy. People use social platforms to share their lives with their friends as well as to share in the lives of their friends. A well-designed AR experience can bring viewers into your brand, and those viewers are then more likely to share their experiences with their own followings.

Enter the Maddenverse
Clothing company Steve Madden already has a strong conventional social media strategy, which encourages customers to tag the company in social media posts that feature themselves wearing Steve Madden apparel. The company can then feature this user-generated content on its own social media platforms and website, both of which provide purchase options. In 2021, the company decided to get more immersive in its social media campaigns and launched “the Maddenverse.” For one activation, the company worked with ROSE to produce an AR filter for Instagram that turned user selfies into avatars of Steve Madden models. Users were again encouraged to share the images and tag the company’s profile. Like the Martin experience, this Maddenverse activation didn’t cost users any money or make the company any money. That wasn’t the point. Rather, the experience allowed fans to express their brand support in a new and fun way, growing their loyalty to the brand while also encouraging them to put the brand in front of their own social media followings. In just one week, almost 18,000 people used the filter to create personalized AR images of themselves in the Maddenverse, and the users sharing those images resulted in a total of 675,000 impressions. This illustrates the kind of scale that AR social media marketing can achieve when users are encouraged to share their creations with others.

Give Your Audience Whatever They Want
Customers aren’t just customers anymore. They can be your audience, but they can also be creators working in a sort of partnership as casual ambassadors for your brand. This has huge potential, but it will only work if you cultivate a meaningful relationship with them. Experiential AR and using AR on social media can help remind your audience why they’re passionate about your brand, and it can allow them to express that passion to others. But it may mean rethinking what you want to give your community and what your community wants from your brand, beyond a purchasable product.

Giving the Gift of AR this Holiday Season
Isn’t VR cooler?
There’s a lot of hype around virtual reality right now – and with good reason. However, virtual reality (in addition to requiring more robust hardware) means that everything is digital, so everything has to be created: items, landscapes, maybe even representations of other users. That takes a lot of time, effort, and money. Augmented reality primarily uses a person’s physical surroundings, with a couple of changes brought to you by creative technologists. That means that a single item, character, or special effect can create a completely unique experience without needing to reinvent the wheel – and everything else – on a computer.

AR: The Gift That Keeps on Giving
On top of all of that, AR draws on the viewer’s connection to their physical environment. It uses computer magic to bring a little something extra to the way they experience the places, items, and even people that they already love. That brings us back to using AR to solve tricky problems on your holiday gift list.

Give Something Truly Unique
Everything experienced in AR is completely unique to the viewer because what’s in the camera feed is different every time. No matter how special the experience is, the physical setting where the user chooses to launch it makes it even more personal and meaningful. ROSE created a virtual model of the real-life Edmund Pettus Bridge for an educational AR experience that viewers could visit from anywhere in the world. Some chose to go through the experience wherever it was convenient or practical for them, but users could also choose to place the experience in an area with emotional significance to them. A complete experience may be difficult to give as a gift, but it is possible to create a one-of-a-kind AR item. That could be an object or character that only exists in the digital world. It could also be a 3D model of a physical object with a special significance to the friend or loved one to whom you present it. The great thing about digital objects is that they don’t have to exist in one format. While you might choose a special experience for the initial gifting, consider giving the file of the object itself as part of the gift. That way, the receiver can take their digital object or character with them into other virtual worlds and digital experiences.

Give Something Physical – but Augment it
Some augmented reality experiences originate in the digital world and project out into the environment, like the digital objects we were just talking about. Other augmented reality experiences start with a physical object that computer magic only enhances. In this way, you can give a “normal” gift that stands out a lot more. Patrón’s digital wrapping project took a bottle and some care to create a magical holiday gift. Gifters created a personalized virtual wrapping for a Patrón bottle, including photographs, text messages, and other AR customizations. As a result, the end gift wasn’t “just a bottle of liquor” – it was a meaningful, personal, one-of-a-kind experience, through the magic of AR.

Get Really Creative
Some AR gifts combine everything that we’ve talked about: a digitally-enabled personal experience, a virtual object, and a physical object with augmented value. The adidas DEERUPT sneaker launch involved a physical box that appeared empty. Inside that box was a grid that served as a target for a social media-friendly AR version of the shoe. This allowed fans to enjoy a product “unboxing” before the shoe was physically available. A gift like this offers a special early look at a product, naturally followed by the object itself. It’s not every day that a company does something like a virtual unboxing, but you can apply this idea to your own gifts. Give someone a marker that launches an AR experience, even a simple one, while the “real” gift is something much bigger. That could be an item that hasn’t arrived yet, a trip someplace special, anything that you can think of. You can also use AR to let your friend or family member choose their own gift. Fashion brand KHAITE partnered with ROSE to bring models and fashions into a user’s home using augmented reality. Users got to see a personalized fashion show in their own chosen environment – and then had the option to buy the fashions that they viewed.

Think Outside the Box
This article has provided a few ideas and a few examples, but no article could capture all of the possibilities that AR presents for gift giving. In part, that’s because AR allows us to think outside of the box – or any other physical constraints. So, let your imagination run wild. Freely available AR object and experience building platforms are proliferating but still require a certain amount of skill, so this article has included links to sites that you can use to have an expert help you create a digital item or experience of your own. You can also keep an eye out for ready-made experiences from brands that are increasingly using AR in creative ways.

Off with their heads: Body and clothing segmentation in Spark AR
What is Segmentation?
Segmentation is the process of identifying a body part or real object within a camera feed and isolating it, creating a “cutout” that can be treated as an individual object for purposes like transformation, occlusion, localization of additional effects, and so on.

Types of Segmentation:
Hair Segmentation: Changing a user’s hairstyle requires precise segmentation of the user’s real hair so that it can be recolored, resized, or even removed from the rendered scene and replaced entirely without affecting other parts of the scene, such as the user’s face.

Body Segmentation: Allows the user’s background to be replaced without tools like a green screen, throwing the user into deep space, lush jungles, the Oval Office, or anywhere else you would like to superimpose a body outline against.

Skin Segmentation: Identifies the user’s skin. This could power an experience in which a user wears virtual tattoos that stop at the boundaries of their clothes and move along with their tracked body parts – almost perfectly lifelike.

Object Segmentation: Gives us the ability to perform occlusion, so that AR objects can be partially hidden behind or beneath real ones as they would logically be in reality, or even to “cut and paste” real objects into virtual space.

Achieving Segmentation
How do we achieve segmentation? Approximating shapes from a database would never be even close to realistic. Identifying boundaries by color contrast is a no-go for people with hair or clothes that are close to their skin tone. Establishing a body position at experience start (“Strike a pose as per this outline:”) and then tracking changes over time is clunky and unreliable. We need something near-instantaneous that can recalibrate on the fly and tolerate a wide margin of approximation. We need something smarter! Of course, then, the answer is artificial intelligence. These days, “AI” is more often than not a buzzword thrown around to mean everything and yet nothing at all, but in this case we have a practical application for a specific form of AI: neural networks. These are machine learning algorithms that can be trained to recognize shapes or perform operations on data. By taking huge sets of data (for example, thousands and thousands of photos with and without people in them) and comparing them, neural networks have been trained to recognize hands, feet, faces, hair, horses, cars, and various other animate and inanimate entities…perfect for our use case.

Training a neural network to identify objects and remove backgrounds. Credit to Cyril Diagne, 2020.
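In practice, the network’s output is a per-pixel “person / not person” mask, and effects like background replacement are then just per-pixel compositing against that mask. Here is a minimal NumPy sketch of the compositing step; the hand-made `mask` array is a stand-in for what a trained network would predict, and all names are illustrative rather than Spark AR API calls.

```python
import numpy as np

def replace_background(frame, mask, background, threshold=0.5):
    """Composite `frame` over `background` wherever the per-pixel
    segmentation `mask` is confident it sees a person.

    frame, background: (H, W, 3) uint8 images
    mask: (H, W) floats in [0, 1]; in production this comes from
          a neural network, here it is just an array we made up.
    """
    person = (mask > threshold)[..., None]       # (H, W, 1) bool, broadcast over RGB
    return np.where(person, frame, background)

# Toy 4x4 "camera feed": the person occupies the left half.
frame = np.full((4, 4, 3), 200, dtype=np.uint8)
background = np.zeros((4, 4, 3), dtype=np.uint8)  # stand-in jungle/space scene
mask = np.zeros((4, 4))
mask[:, :2] = 0.9                                 # network is confident here

out = replace_background(frame, mask, background)
print(out[0, 0], out[0, 3])  # person pixel kept, background pixel swapped
```

The same mask-then-composite pattern powers the other segmentation types too: a hair mask gates a recolor, an object mask gates occlusion.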
Render streaming: taking AR to the next level
What’s the deal with AR, anyway?
XR technology is widely touted as having infinite potential to create new worlds. You can design scenes with towering skyscrapers, alien spacecraft, magical effects, undersea expanses, futuristic machinery – really, anything your heart desires. Within those spaces, you can fly, throw, slash, burn, freeze, enchant, record, create, draw, and paint – any verb you can come up with. The only limit is your imagination!

Painting in VR with Mozilla’s A-Painter XR project. Credit: Mozilla 2018.
Sounds cool. What’s the problem?
Well, all of that is true — to a point. Despite all of our optimism about this AR and VR potential, we find that we are still bound by the practical limitations of the hardware. One of the biggest obstacles to creating immersive, interactive, action-packed, high-fidelity XR experiences is that the machines used to run them just don’t have the juice to render them well. Or, if they do, they’re either high-end devices with a steep monetary barrier to entry, making them inaccessible, or too large to be portable and therefore not conducive to the free movement you would expect from an immersive experience. That’s not to say that we can’t do cool things with our modern XR technology. We’re able to summon fashion shows in our living rooms, share cooperative creature-catching gaming experiences, alter our faces, clothing, and other aspects of our appearance, and much, much more. But it’s easy to imagine what we could do past our hardware limitations. Think of the depth, detail, and artistry boasted by popular open-world games on the market: The Elder Scrolls V: Skyrim, The Legend of Zelda: Breath of the Wild, No Man’s Sky, and Red Dead Redemption 2, just to name a few. Now imagine superimposing those kinds of experiences against the real world, augmenting our reality with endless new content: fantastic flora and fauna wandering our streets, digital store facades overlaying real ones, and information and quests to discover at landmarks and local institutions.

Promotional screenshot from The Legend of Zelda: Breath of the Wild. Credit: Nintendo 2020.
There are many possibilities outside of the gaming and entertainment sphere, too. Imagine taking a walking tour through the Roman Colosseum or Machu Picchu or the Great Wall of China in your own home, with every stone in as fine detail as you might see if you were really there. Or imagine browsing through a car dealership or furniture retailer’s inventory with the option of seeing each item in precise, true-to-life proportion and detail in whatever space you choose. We want to get to that level, obviously, but commercially available AR devices (i.e., typical smartphones) simply cannot support such experiences. High-fidelity 3D models can be huge files with millions of faces and vertices. Large open worlds may have thousands of objects that require individual shadows, lighting, pathing, behavior, and other rendering considerations. User actions and interactions within a scene may require serious computational power. Without addressing these challenges and more, AR cannot live up to the wild potential of our imaginations.

So what can we do about it?
Enter render streaming. Realistically, modern AR devices can’t take care of all these issues…but desktop machines have more than enough horsepower. The proof is in the pudding: the open-world video games mentioned above show that we can very much create whole worlds from scratch and render them fluidly at high frame rates. So let’s outsource the work! The process of render streaming starts with an XR application running on a machine with a much stronger GPU than a smartphone (at scale, a server, physical or cloud-based). Each processed, rendered frame of the experience, generated in real time, is sent to the display device (your smartphone). Any inputs from the display device, such as the camera feed and touch, gyroscope, and motion sensors, are transmitted back to the server to be processed in the XR application, and then the next updated frame is sent to the display device. It’s like on-demand video streaming, with an extra layer of input from the viewing device. This frees the viewing device from having to handle the computational load. Its only responsibility now is to stream the graphics and audio, which modern devices are more than capable of doing efficiently. Even better, this streaming solution is browser-compatible through the WebRTC protocol, meaning that developers don’t need to worry about cross-platform compatibility and users don’t need to download additional applications.

Diagram of render streaming process using Unreal Engine. Credit: Unreal Engine 2020.
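The loop described above can be sketched in a few lines. This is a toy simulation rather than real WebRTC plumbing: the class names are illustrative and the “frame” is just a string, but the division of labor (sensor input up, rendered pixels down) matches the real pipeline.

```python
class RenderServer:
    """Stand-in for the GPU server running the full XR application."""
    def render_frame(self, inputs):
        # A real server would run the engine and encode video; here
        # we just report which pose the frame was rendered for.
        return f"frame(pose={inputs['pose']})"

class ThinClient:
    """Stand-in for the phone: sends sensor input up, displays frames."""
    def __init__(self, server):
        self.server = server
        self.pose = 0

    def tick(self):
        self.pose += 1                           # gyroscope / touch update
        inputs = {"pose": self.pose}             # uplink: sensor data
        return self.server.render_frame(inputs)  # downlink: rendered frame

client = ThinClient(RenderServer())
frames = [client.tick() for _ in range(3)]
print(frames[-1])  # frame(pose=3)

# At 60 fps, the entire round trip (uplink, render, downlink) has to
# fit within one frame budget -- which is why latency matters so much:
frame_budget_ms = 1000 / 60  # roughly 16.7 ms
```

Each `tick` is one round trip; stretch it past the frame budget and the displayed image starts lagging behind the user’s motion.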
There is just one problem: it takes time for input signals to move from the streaming device to the server, be processed, and have the results transmitted back. This isn’t a new challenge; we have long struggled with the same latency issue in modern multiplayer video games and other network applications. For render streaming to become an attractive, widespread option, 5G network connectivity and speeds will be necessary to reduce latency to tolerable levels. Regardless, it would be wise for developers to get familiar with the technology. All the components are already at hand: 5G availability is increasing, Unity and Unreal Engine have released native support for render streaming, and cloud services catering to clients who want render streaming at scale are beginning to crop up. The future is already here — we just need to grab onto our screens and watch as the cloud renders the ride.

How AR Brought KHAITE’s Latest Fashion Line Directly To Consumers
What We Did
As the fashion world had to adapt and move to a purely digital landscape — fashion shows pushed to video, new clothing lines shipped to prospective buyers — brands had to move quickly to break through the noise. ROSE and Chandelier Creative helped KHAITE bring its newest collection to life. With emerging technology, ROSE brought KHAITE’s footwear designs into the homes of customers, buyers, and the market, giving customers a deep visual experience unlike anything other fashion brands have been able to accomplish. As the world continues to grapple with these unprecedented times, this technology will become a cornerstone of how fashion powerhouses market their designs to their customers.

ROSE decided to build a WebAR application for accessibility and to take the burden off consumers. The WebAR experience is widely supported, deeply interactive, and highlights the unique details of KHAITE’s footwear designs in a way that offers endless creative freedom for the user. KHAITE shipped lookbooks, made by Chandelier Creative, with embedded QR codes that, when scanned, take users to the AR experience, where they can see the shoes to scale in their own homes. Consumers can tap whichever shoes they’d like to get a closer look at and place them in their homes, getting a feel for the items without seeing them in person. This allowed KHAITE to create a visual experience that would otherwise exist only inside one of its showrooms. In the second iteration of the experience, for KHAITE’s pre-fall 2021 collection, ROSE expanded the experience to include models rendered in augmented reality, allowing users to see the clothing the way it was meant to be seen.
While still using WebAR, this second experience utilized green screen video to build a full runway show, with models wearing the new line as they walk up and down whatever environment the user chooses.

Challenges
Understanding the mathematics of 3D space is a learning curve in itself, but creating an experience accessible in a browser, as opposed to a native mobile application, makes things even more difficult, with issues like sensor permissions and browser compatibility. Adding light sources to a scene requires a careful balance between the existing, real-life lighting observed by the camera and computed lighting that best accentuates the highlights and shadows of the models in the AR scene. This challenge was multiplied tenfold as we created specific lighting setups to complement each unique shoe model. The material of each model was a major consideration; a shoe with a soft, quilted insole and white leather straps needed soft, glowing illumination, whereas a black patent leather boot needed bright point lights that played off the glossy reflectivity of the material. The end result was an experience tailored to each model, allowing users to see each one in its best light. When we started on the second KHAITE experience, we ran up against challenges that came with showcasing an entire clothing line. KHAITE is a premium brand that places a lot of emphasis on the quality and texture of the materials in its garments and accessories. WebAR is a resource-constrained medium, meaning lower file sizes and compression are required, and capturing 4K, high-framerate, high-quality assets for delivery via the web is a challenge. Involving models and movement increases that difficulty further. Thankfully, we were able to get incredibly high-quality green screen footage, enabling the quality of the looks to shine through.

Impact
As the fashion world grapples with how to convert sales and stay afloat amid the pandemic, finding ways to integrate experiences with seamless shopping capabilities is now the only viable option. The sales from this experience were proof enough that this execution works for high-fashion labels. Fashion is a tactile and textured experience, and amid social distancing, brands have hurdles to jump to create moving experiences for consumers. Companies are integrating new technology to bring fashion shows to people’s phones, computers, and homes. For the first experience ROSE built for KHAITE, sales increased significantly in just a few short weeks. Evan Rose, CEO and founder of ROSE, said, “We are proud to have partnered with KHAITE and Chandelier Creative to create an experience that changes how consumers engage with physical products in an increasingly digital world. We’re excited to be a part of driving how the retail and fashion industries engage with consumers.” As this climate continues and consumer confidence remains low, focusing on the clothes and the experiences that can be had without in-person shopping is more important than ever. Using augmented reality to elevate fashion in this time of social distancing allows for a rich, interactive experience for all users and customers, letting the color, texture, and life of the garments shine through.

ROSE And Patrón Partner To Build The Spirit Industry’s First User-Generated AR Experience
Amid a global pandemic the solutions to some of our most basic problems need some creativity. With COVID’s continued presence in our lives, social distancing may have to continue into a time that is usually filled with parties, family gatherings and holiday festivities. People will be looking for ways to make new traditions, and to connect with their loved ones from afar.
Patrón needed a way to help customers connect despite holiday plans shifting across the country, while also maintaining their brand narrative. We worked with Patrón to create a first-of-its-kind digital wrapping as a special gift this holiday season, and beyond, to solve this specific problem. This experience provides a sentimental and original take on gifting alcohol as well as gives customers first-hand experience not just using augmented reality, but harnessing it to make something themselves.
How Does It Work?
Gifters of Patrón can use a microsite developed by ROSE to create a custom wrapping including a photo, text, and stickers that will transform into a 360-degree augmented reality (AR) gift wrapping around their Patrón bottle. This gives customers a chance to use this emerging technology in a new way that hasn’t been available in retail before.
“With COVID-19 impacting most celebrations this holiday season, we wanted to give customers a way to continue to celebrate with each other while social distancing,” said Nicole Riemer, the art director on the project. “By creating a custom wrapping, customers can take the act of gifting alcohol from an easy gesture to a thoughtful one. During a time when you might not be able to gift in person, creating a custom wrapping with photos, stickers, and text provides that personal touch that is missing from not being able to gift it in person.”
Using WebGL in both 2D and 3D allows users to see their content change between dimensions in real time. Gifters can then use built-in recording and sharing technology to share the gift with the recipient as well as on social media.
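Under the hood, moving the design “between dimensions” amounts to treating the finished 2D canvas as a texture and wrapping it around a cylinder that approximates the bottle. A small sketch of that mapping (the function name and dimensions are illustrative, not the production code):

```python
import math

def wrap_to_cylinder(u, v, radius=1.0, height=3.0):
    """Map a point (u, v) in the unit square of the flat 2D label to a
    3D point on a cylinder, the same idea a WebGL UV layout uses for a
    bottle wrap: u travels around the bottle, v travels up it.
    """
    angle = u * 2 * math.pi       # horizontal axis wraps all the way around
    x = radius * math.cos(angle)
    z = radius * math.sin(angle)
    y = v * height                # vertical axis maps to bottle height
    return (x, y, z)

# The left edge (u=0) and right edge (u=1) of the flat design meet at
# the seam, so the wrapping closes cleanly around the bottle:
left = wrap_to_cylinder(0.0, 0.5)
right = wrap_to_cylinder(1.0, 0.5)
print(left, right)  # both land at the same point, up to float noise
```

The real experience adds perspective, lighting, and the user’s photo and stickers as texture layers, but the 2D-to-3D correspondence the user sees in real time comes down to this mapping.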
By providing customers the ability to customize their gift of Patrón for different occasions and recipients, we are showing them that Patrón isn’t the “mass brand” they think it is. This virtual gift ensures distance isn’t a barrier to creating something thoughtful that nurtures customers’ need to grow and maintain their relationships.
“Creating these designs digitally allows the process to be instantaneous and affordable, rather than waiting for something to get engraved or physically customized, without losing the ability to share that someone is thinking of you on social media,” Riemer said.
Why Use Augmented Reality?
Using augmented reality for this experience had several advantages. The most obvious is that the experience provides a sentimental gift without having to enter a store or be in the same physical space as the recipient — helping maintain social distancing amid the pandemic. Additionally, augmented reality provides a way for users to generate their own content while maintaining the Patrón brand.
“The challenge with AR has always been figuring out how we can take new dimensions and connect them to the ones we’re familiar with in creative, expressive, and helpful ways,” Eric Liang, front-end/AR engineer on the project said. “The AR experiences that ROSE has previously created have each addressed that challenge by taking something important to us — something unseen or out of the ordinary that we wanted to showcase — and constructing it in the user’s world. This time, we’re handing the reins to the user. In this new collaboration, we’re letting users create and realize something that’s uniquely their own.”
Harnessing the power of AR will bring all the holiday cheer customers could be missing into the palm of their hand and inside their home — connecting people who want to be together this holiday. Additionally, Patrón has a history of creating limited-run packaging and bottles, and this experience offers customers peak exclusivity: the ability to customize every individual bottle they purchase. The virtual expansion of exclusive boxes was a natural progression for the brand.
Design Considerations
In designing this web application, we identified two different types of users. Because Patrón’s target demographic for this experience is 21–35, we were less concerned with the technological literacy of the user. Additionally, since this started as a concept that would be pushed mainly through social media, we were bound to attract younger users already at least somewhat familiar with augmented reality from exposure to Snapchat and Instagram. After determining this demographic information for our target user, the next question was what a user would want to create with this tool. This led us to determine the following use cases:
Creator 1: The user who wants to create a thoughtful collage that shows the recipient the time they spent on it. They expect that their gift will be shown to others and potentially shared on social media, much like a birthday post.
Creator 2: The user who is looking to create a quick gift that still wows the intended recipient. They want to expend minimal effort but get the same praise and reaction as someone who spends a lot of time on their creation.
In order to satisfy the need for a quick gift, we created quick “themes” that someone can choose from at the start of the experience, allowing them to upload a single photo and have a designed bottle in five clicks (including previewing their design). For those who want to spend more time on their creation, we provide the ability to start from scratch and choose the content that goes on every side of the bottle.
In choosing the predetermined content that users can apply to their digital bottles, we focused on a few things. The first was to choose assets that were non-denominational and could be used for multiple occasions and holidays. The second was to underscore the socially distant benefit of this gift and to continue encouraging people to drink responsibly even when gatherings are discouraged. The third was to make sure that the assets could be used in many combinations and still create a wrapping that looks high end.
Once we determined the user experience and the content types that could be placed on the wrappings, we had to find a way to map that content to a 3D bottle in real time, show the user their creation on this model before sending an augmented reality link to their recipient, and ultimately render each individual experience in augmented reality.
How We Built This
The technical inspiration for this experience began with an understanding of how WebXR, the implementation of augmented reality in a web browser, operates. WebXR maintains the conceptual model of everything that exists in an extended reality scene: where each virtual object is, where light is coming from, where the “camera” stands and observes, how the user interacts with and changes all of these things, and so on. Imagine closing your eyes and understanding where everything around you in the room is: your desk, the floor, a lamp, rays of sunlight coming through a window, even your own hands. Now open your eyes and actually observe those things. That’s what WebGL does. WebGL is the graphics engine that takes the model tracked by WebXR and paints it on a screen, rendering the virtual existence of matter and light into visibility.
While we wanted to capture the same magic of seeing something you create exist in 3D space, it was important that it would be accessible to everyone, both in terms of the technology and creativity. We wanted it to be usable from an everyday mobile device, without the need for expensive VR technology. We also didn’t want to require the user to be a painter, have an empty warehouse to dance around with VR goggles on, or have an intricate understanding of 3D sculpture or set design to maximize the reward of the experience.
There were a lot of moving parts that needed to be addressed. There needed to be a simple, intuitive interface for the user to customize their design, and we needed to apply the design to a 3D model composed of a number of different materials and textures, from soft cork to clear pebbled glass to shiny metallic gift wrap. The experience needed to show that customized bottle back to the user in an interactive, attention-grabbing 3D experience. And finally, we needed to be able to scale the experience for a mass marketing campaign, which meant preparing for a large number of concurrent users with different devices and intents. We settled on technologies to address each of these challenges: a React/HTML Canvas microsite to design the wrapping, an 8th Wall/A-Frame experience to view it, and a serverless API backend with cloud storage to support scale.
The next step was to source a 3D model of the bottle. We worked with a 3D artist and modeller, iterating over the model until each detail was as accurate as possible, and then continued to optimize our renders. This involved adjusting lighting through trial and error until we found the best setup to illuminate the bottle and make the glass and its reflectiveness as lifelike as possible, as well as customizing the physical material shaders for each node of the finalized model: the cork, the ribbon, the glass, the liquid, and the wrapping.
3D model renderings of the Silver Patrón bottle.
Later on, we also realized that we needed a dynamic approach to the wrapping’s transparency. If the user chose to lay their graphics directly over the glass without using a background, those stickers, photos, and text would need to be opaque while leaving the glass transparent. The answer was to take the texture maps generated from each user-created design and filter them into black and white, letting them serve double duty as alpha maps to control transparency.
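A minimal sketch of that filtering step, assuming the user’s design is available as raw RGBA pixel data (the function name and threshold are illustrative, not ROSE’s production code): every pixel that carries user content becomes white in the alpha map, and everything else stays black, so the glass underneath remains transparent.

```javascript
// Hypothetical sketch: derive an alpha map from a user-designed RGBA texture.
// Pixels where the user placed content (non-transparent source pixels) become
// white (opaque wrapping); empty pixels stay black (glass shows through).
function toAlphaMap(rgba) {
  const alpha = new Uint8ClampedArray(rgba.length);
  for (let i = 0; i < rgba.length; i += 4) {
    // Decide opacity from the source pixel's own alpha channel.
    const value = rgba[i + 3] > 0 ? 255 : 0;
    alpha[i] = alpha[i + 1] = alpha[i + 2] = value; // grayscale value
    alpha[i + 3] = 255;                             // the map itself is opaque
  }
  return alpha;
}
```

The resulting black-and-white pixels can then be uploaded as an alpha map texture alongside the color texture.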
Example of an alpha map.
While the experience would be accessible to everyone, we wanted those who had a Patrón bottle handy to be able to integrate it into the experience. It’s not yet feasible to use a real-life bottle of Patrón to anchor the experience, so we looked outside of the box — and settled on the actual box that each bottle of Patrón comes in. This gave us the opportunity to leverage 8th Wall’s image target feature, using the Patrón bottle image on the side of each box to trigger the dramatic emergence of the virtual bottle from the physical box.
Built to share on social, this augmented reality experience allows for recording within the WebAR experience.
Those without a box can watch the bottle appear on a plane they place in the experience. Adding typical controls like pinch-to-zoom and one-finger rotation made it easy for the user to examine the bottle and the details of the design, and we added 8th Wall’s Media Recorder capability to further boost the shareability of the experience.
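The pinch gesture boils down to simple math: the bottle’s scale is multiplied by the ratio of the current distance between the two touch points to their distance when the gesture began. A hypothetical helper, not taken from the production code:

```javascript
// Distance between two touch points (each {x, y} in screen pixels).
function touchDistance(t1, t2) {
  return Math.hypot(t2.x - t1.x, t2.y - t1.y);
}

// Scale factor for pinch-to-zoom: the scale the object had when the gesture
// started, multiplied by how far apart the fingers have moved since then.
function pinchScale(startTouches, currentTouches, startScale) {
  const ratio = touchDistance(currentTouches[0], currentTouches[1]) /
                touchDistance(startTouches[0], startTouches[1]);
  return startScale * ratio;
}
```

In a real handler this would run on each `touchmove` event with two active touches, feeding the result into the model’s scale attribute.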
Conclusion
As companies look ahead to a greener, more sustainable future, the concept of virtual wrapping and packaging is likely to expand. As augmented reality moves from an emerging technology to an adopted one, user-generated AR content will take center stage, and experiences like this one will enable everyday users to create with AR technology. As all industries grapple with how to stay competitive and stay afloat, innovation is the answer to moving forward. This is the tip of the iceberg when it comes to what augmented reality can accomplish.
We are excited to continue innovating and bringing projects like these to life. We believe anyone can innovate and that process is vital amid the current economic landscape. Our passion for emerging technologies and augmented reality is immense and our work will only continue to reflect that. We’re looking forward to sharing more soon.
Credits:
Ashley Nelson: Concept and Strategy, UX Copywriter
Eric Liang: Front-end/AR engineer
Eugene Park: Experience Design
Leonardo Malave: Back-end/AR engineer
Marie Liao: QA Engineer
Nicole Riemer: Concept and Strategy, Art Direction, and Experience Design
Yolan Baker: Project Manager
Pay To Play: Visualizing Presidential Campaign Spending Using Augmented Reality
Believe it or not, a few short months ago the main event dominating the news cycle wasn’t the coronavirus but the presidential election. The Democratic primaries were different from years past, and not just because the number of candidates running could fill a small football field. One thing that stood out to our team was the record spending that occurred this election cycle. Discussions about campaign finance began to swirl when Michael Bloomberg entered the race, funding his entire campaign with his personal fortune and raising questions about what money should and shouldn’t buy while running for office. We began thinking about a way to contextualize the immensity of campaign spending through the language we speak best: technology. Those conversations, and the desire to use technology to answer that question, were the origin of Pay to Play. With primaries postponed and the race narrowed to a single candidate from each party, we considered not releasing this experience.
However, with the new economic pressures on American families due to coronavirus and the current volatile international economy, we believed the relationship between money and politics was worth exploring. This project considers the disconnect between the monetary impact of the political process and the needs of everyday Americans.
The staggering amount of money spent by Democratic candidates in the 2020 election left us wondering how that money could have been spent on infrastructure and on funding the platforms those candidates campaigned on. We designed Pay to Play as a way to look back on the record amount of money spent by Democratic candidates who have ended their bids. We also included how much several Republican contenders in the 2016 presidential election spent on their campaigns as another point of comparison.
We designed this experience to visualize our internal discussions and the conversations happening in the U.S. during this tumultuous time, and in doing so we wanted to answer the question: “What else could we have done with that money?”
How Does It Work
Pay to Play was developed using 8th Wall as the hosting platform. A web-specific AR toolkit, 8th Wall allows anyone with a mobile device and an internet connection to access the comparative experience. Users can compare campaign spending amounts from the top seven Democratic candidates who spent the most on their presidential run, as well as the top seven Republican candidates from the 2016 presidential election. The experience offers different “common good” filters, and each filter is paired with a representative 3D object that fills the space in an appropriately scaled quantity. With each selection, the data simultaneously updates in the upper left corner.
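The scaling behind each filter can be sketched with entirely made-up numbers (none of these figures or names come from the project): convert a spending total into the real-world items it could buy, batch those items so each rendered model stands in for many, and cap the count so low-end phones can keep up.

```javascript
// Hypothetical sketch of scaling a spending figure into an on-screen count.
// spending:       dollars spent by the candidate
// unitCost:       dollar cost of one real-world item (e.g. one apple)
// itemsPerObject: how many real items each rendered 3D model represents
// maxObjects:     hard cap to protect rendering performance
function objectCount(spending, unitCost, itemsPerObject, maxObjects = 2000) {
  const realItems = Math.floor(spending / unitCost);      // items the money buys
  const rendered = Math.ceil(realItems / itemsPerObject); // one model per batch
  return Math.min(rendered, maxObjects);                  // clamp for performance
}
```

The displayed data in the corner would still report the true, unscaled figures; only the rendered pile is batched.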
Why Use Augmented Reality
Using augmented reality for data visualization invites emotional reactions from the user. This experience showcases the immensity of campaign spending with cascading scaled objects that fill the user’s view, as though they could overflow from the screen at any moment. Because the experience was created with 8th Wall and runs in the browser, decreasing file size and the number of rendered objects was important for optimizing load time. To speed up load time and allow for easier comparison, the number of objects was scaled down. While AR can make data more manageable for users, it can also create emotional connections through hands-on participation with the product.
The Build, 3D Modeling, and Optimization
We found that the best way to offer an immersive extended reality experience, while still offering relevant information and options to a user, is to combine the XR portion with a heads-up display that lies on top without obstructing the view. As such, this project could immediately be divided into two parts: building the HUD and coding the 3D model portion. We used A-Frame, a 3D framework built on Three.js and HTML, to bridge the gap. By representing our 3D assets and behaviors as HTML, we were free to create our HUD in pure HTML and have it communicate and interact seamlessly with the A-Frame components.
We found that much of the challenge of this project was using AR in a way that was accessible to as many people as possible while still maintaining the core identity of the project — using numerical scale as a way to evoke a reaction from the user. Rendering any 3D model in a web browser can be an expensive operation. Rendering thousands of them would tax a smartphone’s hardware to the point of unusability. We ended up approaching this by leaning into the idea of scale: we didn’t need exacting detail if the idea was to overwhelm the user with a huge pile of items; we just needed enough to make it clear what each item was. So we selected simple models with fewer polygons, decimated their numbers of faces as low as we could, and reduced the resolution on their textures to minimize file size. The end result worked out — we had piles of apples that were clearly recognizable and deeply satisfying to watch cascade down from the sky.
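A back-of-the-envelope way to think about the decimation target described above, with illustrative numbers rather than the project’s actual budget: given a total triangle count a phone can comfortably render per frame and the number of objects on screen, the per-model face budget falls out by division.

```javascript
// Hypothetical budgeting helper (not the team's actual pipeline, which
// decimated models by hand): how low must each model's face count go so
// that thousands of copies stay within a total triangle budget?
function decimationTarget(originalFaces, triangleBudget, objectCount) {
  const perModel = Math.floor(triangleBudget / objectCount);
  // Never "decimate" upward; a model already under budget stays as-is.
  return Math.min(originalFaces, perModel);
}
```

This also explains why texture resolution was reduced alongside geometry: the budget applies to total GPU work, not triangles alone.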
Additional challenges came from the technologies we used to build the experience itself. Web AR platforms advance every day, but there are still severe limitations to their capabilities. For example, 8th Wall, the platform on which this experience runs, offers surface occlusion capabilities only for its Unity integration into native apps. For browser-based experiences that don’t yet have access to that plane detection technology, we have to emulate a floor by placing a vast invisible sheet at a defined distance below the camera. The distance to the “floor” is not dynamic and doesn’t change whether the user is sitting or standing, resulting in an imperfect representation of reality. This process only makes us more excited to see the next steps web AR will take, as the technology continues to improve and provide us with new and even more compelling ways to augment our reality.
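The emulated floor can be sketched in A-Frame markup roughly like this (positions and sizes are illustrative, not the values used in the experience):

```html
<!-- Illustrative sketch, not the production scene. A large, fully
     transparent plane a fixed distance below the camera stands in for
     real plane detection; falling objects land and pile up on it. The
     1.6 m camera height assumes a standing user holding a phone. -->
<a-scene>
  <a-camera position="0 1.6 0"></a-camera>
  <a-plane position="0 0 0" rotation="-90 0 0"
           width="100" height="100"
           material="transparent: true; opacity: 0"></a-plane>
</a-scene>
```

Because the offset is fixed, a seated user sees objects settle slightly below their real floor, which is exactly the imperfection described above.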
Conclusion
The political process is often a complicated and convoluted one, and accessing data on campaign finance can be overwhelming. Conceptualizing how much candidates spend on their campaigns shows the immensity of American politics. By using AR, it becomes easier to visualize the power that the people funding these campaigns have, and raises real questions about the possibility of sweeping change if these funds were made available.
Credits:
Jordan Long: Concept and Strategy
Nicole Riemer: Art Direction and Experience Design
Eric Liang: Experience Design and Development