The Role of AR in Retail

Augmented reality has a growing role in the fashion industry, from clothing design to completely virtual clothing. But for the average shopper buying a physical garment, what good is AR? AR (displaying virtual elements in a user’s physical environment through smart devices like phones) lets a shopper get a fair understanding of a garment without having direct access to it. Throughout the rest of this article, we’ll talk about things like fashion shows viewable from anywhere, or trying on a clothing item before it’s even in the store. AR means viewers don’t have to travel to a fashion show to see the latest looks – they can do it from home. They can see what a clothing item would look like on them without going to the store, checking if it’s in stock, and trying it on. These factors, and many others, reduce costs for manufacturers and result in more satisfied customers who are less likely to return the items they buy. It’s true, augmented reality might not let you see exactly how different light would dance off of a reflective bauble, or let you feel the material on your skin. At least, not yet. Still, it’s never been easier to get to know a garment without having it in your hands.
Virtual Fashion Shows

Fashion shows are one of the industry’s standard methods of introducing the world to new products. They are the first opportunity for people outside of the designer’s studio to see what a garment looks like on a person – how it flows and moves. However, conventional fashion shows are typically restricted to people in the fashion industry. Even if the average person could find the time and money to travel to one, they probably wouldn’t be allowed in the door. AR helps to bring fashion shows to the people. By replacing the catwalk with a capture studio, experienced designers can make high-quality virtual versions of fashion models. Apply a little techno-wizardry, and these models do their walks wherever a viewer points their mobile device. If the viewer is only interested in a few looks in the collection, they can view those fashions without sitting through the whole show.
See it in Action

Bloomingdale’s created an AR fashion show for its 150th anniversary. ROSE built it using designs exclusively available for the celebration. The fashion show was visible in Bloomingdale’s stores, or in the homes of over 400,000 shoppers who received an AR-activated catalog. You can still view the experience by scanning a QR code on Bloomingdale’s website. Bloomingdale’s reported a 38 percent increase in shopper engagement and a 22 percent increase in conversions. Many of those conversions were thanks to a click-to-buy feature that lets shoppers purchase looks from within the experience just by tapping their favorite fashions. A similar activation by ROSE and KHAITE led to a 400 percent increase in sales. Further, users browsed an average of 16 looks, spending over four minutes in the experience. An experience for Selfridges that was only available in-store drew an average dwell time of 51 seconds. Compare that to the amount of time shoppers spend looking at mannequins.
Becoming the Model

Watching a fashion show can be fun. But what about being the model? ROSE worked with Steve Madden on an initiative in “the Maddenverse.” This time, it was an Instagram effect that turned users into a stylized Steve Madden virtual avatar. In this case, the idea wasn’t to realistically represent clothing that the viewer could actually buy, but rather to give them a fun opportunity to engage with the brand itself. But that’s a topic for another day.
Virtual Try-Ons and Samples

Seeing an outfit on a model is nice, but when people buy an item, they care more about how that item looks on them. This can be trickier with augmented reality, but it’s possible. Creating a virtual version of a garment allows a prospective buyer to see how it works on them. They can see how it matches other items in their collection, their skin color and makeup choices, their hairstyles, and even the places where they anticipate wearing the garment, like their home or their favorite restaurant. AR try-on in eCommerce may present another “hurdle” – it’s an extra screen tap – but many shoppers find it fun. Further, they’re statistically less likely to return items that they purchase after engaging with them in AR. In fact, ROSE worked with Adidas to create a virtual model of one of their shoes that buyers could “unbox” on social media before the physical shoe shipped. Creating a virtual clothing item can be a lot of work. But, increasingly, that work is already done. Designers and manufacturers frequently start with a digital model because it’s easier to see potential changes than with physical prototypes. In some cases, these design models can be adapted for virtual try-on.
The Future of Virtual Try-On

We’ve discussed some things AR clothing doesn’t do too well: faithfully replicating reflective surfaces, the way a fabric moves, or even how a garment will actually fit. Fortunately, all of these are aspects of the technology that are improving year over year. While you’ll (probably) never be able to feel an AR garment before you have the physical version, effects like how a piece of clothing reflects light in your environment are getting better. Virtual try-on was largely pioneered for jewelry in particular, but we’ll likely see the approach extended to other materials as well. If you’ve tried virtual try-on before, you have probably had one or two unsatisfactory experiences. Either the jacket that you’re trying on doesn’t move at all, or the earrings that you’re trying on make it look like there’s an earthquake. These technologies are improving too, but there are two big hang-ups. First, these effects are powered by physics engines, and different platforms can use different physics engines, so it can be hard to get a quality experience without optimizing for each app and website that you want to publish on. Second, the more advanced the models and effects are, the heavier the experience is. To reach customers where they are, you’ve got to publish on the apps and devices they use. Sometimes that can mean making compromises. Fortunately, this is also being solved by the move toward cloud and edge computing, which takes some of that burden off of a user’s device. After all, movie-quality virtual clothing couldn’t be rendered in real time within Snapchat. And even that tech is tricky to pull off realistically – that’s one of the reasons Hollywood loves costumed and masked characters so much. That kind of work is closer to VR than AR.
The Final Frontier of “Virtual Clothing”?

Right now, we’re talking about virtual clothing as a way to drive purchases of physical clothing. However, in recent years, there has been growth in the idea of virtual clothing that stays virtual. That can mean clothing for avatars, or virtual clothing that appears in photographs and videos on social media. Whether or not you might ever be interested in buying and selling digital clothing, breakthroughs in this field will likely help to improve the technology as it appears in other use cases as well.
Isn’t VR cooler?

There’s a lot of hype around virtual reality right now – and with good reason. However, virtual reality (in addition to requiring more robust hardware) means that everything is digital. That means that everything has to be created. Items, landscapes, maybe even representations of other users. That takes a lot of time, effort, and money. Augmented reality primarily uses a person’s physical surroundings, with a couple of changes brought to you by creative technologists. That means that a single item, character, or special effect can create a completely unique experience without needing to reinvent the wheel – and everything else – on a computer.
AR: The Gift That Keeps on Giving

More than all of that, AR draws on the viewer’s connection to their physical environment. It uses computer magic to bring a little something extra to the way that they experience the places, items, and even people that they already love. That brings us back to using AR to solve tricky problems on your holiday gift list.
Give Something Truly Unique

Everything experienced in AR is completely unique to the viewer, because what is going on in the camera feed is different every time. No matter how special the experience is, the physical setting where the user chooses to launch it makes it even more personal and meaningful. ROSE created a virtual model of the real-life Edmund Pettus Bridge for an educational AR experience that viewers could visit from anywhere in the world. Some chose to go through the experience wherever it was convenient or practical for them. But users can also choose to place the experience in an area that has emotional significance to them. A complete experience may be difficult to give as a gift, but it is possible to create a one-of-a-kind AR item. That could be an object or character that only exists in the digital world. It could also be a 3D model of a physical object with special significance to the friend or loved one to whom you present it. The great thing about digital objects is that they don’t have to exist in one format. While you might choose a special experience for the initial gifting, consider giving the file of the object itself as part of the gift. That way, the receiver can take their digital object or character with them into other virtual worlds and digital experiences.
Give Something Physical – but Augment it

Some augmented reality experiences originate in the digital world and project out into the environment, like the digital objects we were just talking about. Other augmented reality experiences start with a physical object that computer magic only enhances. In this way, you can give a “normal” gift that stands out a lot more. Patrón’s digital wrapping project took a bottle and some care to create a magical holiday gift. Gifters created a personalized virtual wrapping for a Patrón bottle, including photographs, text messages, and other AR customizations. As a result, the end gift wasn’t “just a bottle of liquor” – it was a meaningful and personal one-of-a-kind experience, through the magic of AR.
Get Really Creative

Some AR gifts combine everything we’ve talked about: a digitally-enabled personal experience, a virtual object, and a physical object with augmented value. The adidas DEERUPT sneaker launch involved a physical box that appeared empty. Inside that box was a grid that served as a target for a social media-friendly AR version of the shoe. This allowed fans to enjoy a product “unboxing” before the shoe was physically available. A gift like this offers a special early unveiling of a product, naturally followed by the object itself. It’s not every day that a company does something like a virtual unboxing. However, you can apply this idea to your own gifts. Give someone a marker that launches an AR experience, even a simple one, while the “real” gift is something much bigger. That could be an item that hasn’t arrived yet, a trip someplace special – anything that you can think of. You can also use AR to let your friend or family member choose their own gift. Fashion brand KHAITE partnered with ROSE to bring models and fashions into a user’s home using augmented reality. Users got to see a personalized fashion show in their own chosen environment – and then had the option to buy the fashions that they viewed.
Think Outside the Box

This article has provided a few ideas and a few examples. But no article could capture all of the possibilities that AR presents for gift giving. In part, that’s because AR allows us to think outside of the box – or any other physical constraints. So, let your imagination run wild. Freely available AR object and experience building platforms are proliferating but still require a certain amount of skill. So, this article has included links to sites that you can use to have an expert help you create a digital item or experience of your own. You can also keep an eye out for ready-made experiences from brands who are increasingly using AR in creative ways.
What is Segmentation?

Segmentation is the process of identifying a body part or real object within a camera feed and isolating it, creating a “cutout” that can be treated as an individual object for purposes like transformation, occlusion, localization of additional effects, and so on.
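To make "cutout" concrete, here is a toy sketch of the operation in NumPy, with a tiny hand-made mask standing in for a real segmentation result:

```python
import numpy as np

# A tiny 4x4 "camera frame": 3 color channels, values 0-255.
frame = np.full((4, 4, 3), 200, dtype=np.uint8)

# A binary segmentation mask: 1 where the subject (say, a hand) is.
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1

# "Cutout": keep subject pixels, zero out everything else.
cutout = frame * mask[:, :, None]

print(cutout[2, 2], cutout[0, 0])  # subject pixel kept, background zeroed
```

Once isolated like this, the cutout can be recolored, moved, or composited independently of the rest of the frame.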
Types of Segmentation:

Hair Segmentation: Changing a user’s hairstyle requires precise segmentation of that user’s real hair so that it can be recolored, resized, or even removed from the rendered scene and replaced entirely without affecting other parts of the scene, such as the user’s face.

Body Segmentation: Allows for the user’s background to be replaced without tools like a green screen, throwing the user into deep space, lush jungles, the Oval Office, or anywhere else you would like to superimpose your body outline against.

Skin Segmentation: Skin segmentation identifies the user’s skin. This could power an experience in which a user wears virtual tattoos that stop at the boundaries of their clothes and move along with their tracked body parts — almost perfectly lifelike.

Object Segmentation: Gives us the ability to perform occlusion so that AR objects might be partially hidden under or behind real ones, as they would logically be in reality, or even to “cut and paste” those real objects into virtual space.
Achieving Segmentation

How do we achieve segmentation? Approximating shapes from a database would never be even close to realistic. Identifying boundaries by color contrast is a no-go for people with hair or clothes that are close to their skin tone. Establishing a body position at experience start (“Strike a pose as per this outline”) and then tracking changes over time is clunky and unreliable. We need something near-instantaneous that can recalibrate on the fly and leave a wide margin of approximation for adjustment. We need something smarter! Of course, then, the answer is artificial intelligence. These days, “AI” is more often than not a buzzword thrown around to mean everything and yet nothing at all, but in this case we have a practical application for a specific form of AI: neural networks. These are machine learning algorithms that can be trained to recognize shapes or perform operations on data. By taking huge sets of data (for example, thousands and thousands of photos with and without people in them) and comparing them, neural networks have been trained to recognize hands, feet, faces, hair, horses, cars, and various other animate and inanimate entities…perfect for our use case.
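In practice, such a network outputs a per-pixel probability that each pixel belongs to the target class (person, hair, skin, and so on); thresholding that map yields the binary mask used for effects like background replacement. A minimal sketch, where the probability values are hand-written stand-ins for real network output:

```python
import numpy as np

# Hand-written "person" probabilities standing in for a network's output.
probs = np.array([[0.1, 0.9],
                  [0.2, 0.8]])

frame      = np.array([[10, 20], [30, 40]])   # grayscale camera frame
background = np.array([[99, 99], [99, 99]])   # virtual replacement backdrop

mask = probs > 0.5                       # threshold into a binary mask
composite = np.where(mask, frame, background)
print(composite)                         # person pixels kept, rest replaced
```

The threshold (0.5 here) is a tunable trade-off: lower values keep more borderline pixels, higher values produce a tighter cutout.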
Training a neural network to identify objects and remove backgrounds. Credit to Cyril Diagne, 2020.
What’s the deal with AR, anyway?

XR technology is widely touted as having infinite potential to create new worlds. You can design scenes with towering skyscrapers, alien spacecraft, magical effects, undersea expanses, futuristic machinery, really anything your heart desires. Within those spaces, you can fly, throw, slash, burn, freeze, enchant, record, create, draw and paint — any verb you can come up with. The only limit is your imagination!
Painting in VR with Mozilla’s A-Painter XR project. Credit: Mozilla 2018.
Sounds cool. What’s the problem?

Well, all of that is true — to a point. Despite all of our optimism about this AR and VR potential, we find that we are still bound by the practical limitations of the hardware. One of the biggest obstacles to creating immersive, interactive, action-packed, high-fidelity XR experiences is that the machines used to run them just don’t have the juice to render them well. Or, if they do, they’re either high-end devices with a steep monetary barrier to entry, making them inaccessible, or too large to be portable and therefore not conducive to the free movement you would expect from an immersive experience. That’s not to say that we can’t do cool things with our modern XR technology. We’re able to summon fashion shows in our living rooms, share cooperative creature-catching gaming experiences, alter our faces, clothing, and other aspects of our appearance, and much, much more. But it’s easy to imagine what we could do past our hardware limitations. Think of the depth, detail, and artistry boasted by popular open-world games on the market: The Elder Scrolls: Skyrim, The Legend of Zelda: Breath of the Wild, No Man’s Sky, and Red Dead Redemption 2, just to name a few. Now imagine superimposing those kinds of experiences against the real world, augmenting our reality with endless new content: fantastic flora and fauna wandering our streets, digital store facades that overlay real ones, and information and quests available at landmarks and local institutions.
Promotional screenshot from The Legend of Zelda: Breath of the Wild. Credit: Nintendo 2020.

There are many possibilities outside of the gaming and entertainment sphere, too. Imagine taking a walking tour through the Roman Colosseum, Machu Picchu, or the Great Wall of China in your own home, with every stone in as fine detail as you might see if you were really there. Or imagine browsing a car dealership or furniture retailer’s inventory with the option of seeing each item in precise, true-to-life proportion and detail in whatever space you choose. We want to get to that level, obviously, but commercially available AR devices (i.e. typical smartphones) simply cannot support experiences like these. High-fidelity 3D models can be huge files with millions of faces and vertices. Large open worlds may have thousands of objects that require individual shadows, lighting, pathing, behavior, and other rendering considerations. User actions and interactions within a scene may require serious computational power. Without addressing these challenges and more, AR cannot live up to the wild potential of our imaginations.
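A back-of-the-envelope estimate shows why such models strain a phone. The figures below are illustrative assumptions, not measurements from any particular game or asset:

```python
# Rough memory estimate for one high-fidelity mesh (illustrative figures).
vertices = 2_000_000
floats_per_vertex = 8        # position (3) + normal (3) + texture UV (2)
bytes_per_float = 4

vertex_bytes = vertices * floats_per_vertex * bytes_per_float
print(vertex_bytes / 1e6, "MB")   # tens of megabytes of vertex data alone
```

And that is one asset before textures, animation data, or the thousands of other objects an open world carries.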
So what can we do about it?

Enter render streaming. Realistically, modern AR devices can’t take care of all these issues…but desktop machines have more than enough horsepower. The proof is in the pudding: the open-world video games mentioned above show that we can very much create whole worlds from scratch and render them fluidly at high frame rates. So let’s outsource the work! The process of render streaming starts with an XR application running on a machine with a much stronger GPU than a smartphone (at scale, a server, physical or cloud-based). Then, each processed, rendered frame of the experience, generated in real time, is sent to the display device (your smartphone). Any inputs from the display device, such as the camera feed and touch, gyroscope, and motion sensors, are transmitted back to the server to be processed in the XR application, and then the next updated frame is sent to the display device. It’s like on-demand video streaming, with an extra layer of input from the viewing device. This frees the viewing device from having to handle the computational load. Its only responsibility now is to stream the graphics and audio, which modern devices are more than capable of doing efficiently. Even better, this streaming solution is browser-compatible through the WebRTC protocol, meaning that developers don’t need to worry about cross-platform compatibility, and users don’t need to download additional applications.
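The round trip described above can be sketched as a simple loop. This is purely conceptual, with placeholder functions; a real pipeline would involve WebRTC transport, video encoding, and a GPU renderer:

```python
# Conceptual render-streaming loop (placeholder logic, not a real pipeline).

def process_input(scene_state, device_input):
    # Server side: apply touch/gyroscope input from the phone to the scene.
    return scene_state + device_input

def render_frame(scene_state):
    # Server side: the heavy GPU rendering work happens here.
    return f"frame(state={scene_state})"

scene_state = 0
for device_input in [1, 2, 3]:          # inputs streamed up from the phone
    scene_state = process_input(scene_state, device_input)
    frame = render_frame(scene_state)   # rendered frame streamed back down
    print(frame)                        # the phone only decodes and displays
```

The key point is the division of labor: all the state and rendering live on the server, while the device only sends inputs up and draws frames that come back.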
Diagram of render streaming process using Unreal Engine. Credit: Unreal Engine 2020.

There is just one problem: it takes time for input signals to move from the streaming device to the server, be processed, and have the results transmitted back. This is not a new challenge; we have long struggled with the same latency issue in modern multiplayer video games and other network applications. For render streaming to become an attractive, widespread option, 5G network connectivity and speeds will be necessary to reduce latency to tolerable levels. Regardless, it would be wise for developers to get familiar with the technology. All the components are already at hand: not only is 5G availability increasing, but Unity and Unreal Engine have also released native support for render streaming, and cloud services catering to clients who want render streaming at scale are beginning to crop up. The future is already here — we just need to grab onto our screens and watch as the cloud renders the ride.
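A rough motion-to-photon budget makes the latency concern concrete. All of the numbers below are illustrative assumptions rather than benchmarks:

```python
# Illustrative round-trip latency budget for streamed AR frames.
network_rtt_ms   = 20   # assumed 5G round trip, device <-> server
server_render_ms = 8    # one frame rendered server-side at ~120 FPS
decode_ms        = 5    # video decode and display on the phone

motion_to_photon_ms = network_rtt_ms + server_render_ms + decode_ms
print(motion_to_photon_ms, "ms")   # roughly two frames at 60 FPS
```

On a slower network, the RTT term dominates everything else, which is why the discussion above centers on 5G rather than on faster servers.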
What We Did

As the fashion world had to adapt and move to a purely digital landscape — fashion shows pushed to video, new clothing lines shipped to prospective buyers — brands had to move quickly to break through the noise. ROSE and Chandelier Creative helped KHAITE bring their newest collection to life. With emerging technology, ROSE brought KHAITE’s footwear designs to the homes of their customers, buyers, and the market, giving customers a deep visual experience unlike any other fashion brand has been able to offer. As the world continues to grapple with these unprecedented times, this technology will become a cornerstone of how fashion powerhouses market their designs to their customers. ROSE decided to build a WebAR application for accessibility and to take the burden off consumers. The WebAR experience is widely supported, deeply interactive, and highlights the unique details of KHAITE’s footwear designs in a way that offers endless creative freedom for the user. KHAITE shipped lookbooks, made by Chandelier Creative, with embedded QR codes that, when scanned, take users to the AR experience, where they can see the shoes to scale in their own homes. Consumers can tap whichever shoes they’d like to get a closer look at and place them in their homes, getting a feel for the items without seeing them in person. This experience allowed KHAITE to create a visual experience that otherwise would only exist inside one of their showrooms. In the second iteration of the experience, for KHAITE’s pre-fall 2021 collection, ROSE expanded the experience to include models rendered in augmented reality, allowing users to see the clothing the way it was meant to be seen.
While still using WebAR, this second experience utilized green screen video to build a full runway show with models wearing the new line as they walk up and down whatever environment the user chooses.
Challenges

Understanding the mathematics of 3D space is a learning curve in itself, but creating an experience accessible in a browser, as opposed to a native mobile application, makes things even more difficult, with issues like sensor permissions and browser compatibility. Adding light sources to a scene requires a careful balance between the existing, real-life lighting observed by the camera and computed lighting that best accentuates the highlights and shadows of the models in the AR scene. This challenge was multiplied tenfold as we created specific lighting setups to complement each unique shoe model. The material of each model was a major consideration: a shoe with a soft, quilted insole and white leather straps needed soft, glowing illumination, whereas a black patent leather boot needed bright point lights that played off the glossy reflectivity of the material. The end result was an experience tailored to each model, allowing users to see each one in its best light. When we started on the second KHAITE experience, we ran up against challenges that came with showcasing an entire clothing line. KHAITE is a premium brand that places a lot of emphasis on the quality and texture of the materials in its garments and accessories. WebAR is a resource-constrained medium, meaning lower file sizes and compression are required. Capturing 4K, high-framerate, high-quality assets for delivery via the web is a challenge, and involving models and movement increases the difficulty. Thankfully, we were able to get incredibly high-quality green screen footage, enabling the quality of the looks to shine through.
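The matte-versus-glossy distinction above maps onto a standard shading idea: in the classic Phong model, a material's specular exponent controls how tight its highlights are. A minimal sketch with illustrative values (not the actual lighting setup used in the experience):

```python
import math

def specular(cos_angle, shininess):
    # Phong specular term: higher exponents give tighter, glossier highlights.
    return max(cos_angle, 0.0) ** shininess

cos_angle = math.cos(math.radians(20))      # same viewing geometry for both
print(round(specular(cos_angle, 4), 3))     # soft quilted leather: broad glow
print(round(specular(cos_angle, 64), 3))    # black patent leather: sharp glint
```

At the same 20-degree offset, the low-exponent material still reflects strongly while the high-exponent one has almost fallen to black, which is exactly the "bright point lights on glossy patent leather" effect described above.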
Impact

As the fashion world grapples with how to convert sales and stay afloat amid the pandemic, finding ways to integrate experiences with seamless shopping capabilities is now the only viable option. Within this experience, the sales were proof enough that this execution works for high-fashion labels. Fashion is a tactile and textured experience, and amid social distancing, brands have hurdles to jump to create moving experiences for consumers. Companies are integrating new technology to bring fashion shows to people’s phones, computers, and homes. For the first experience ROSE built for KHAITE, sales increased significantly in just a few short weeks. Evan Rose, CEO and founder of ROSE, said, “We are proud to have partnered with KHAITE and Chandelier Creative to create an experience that changes how consumers engage with physical products in an increasingly digital world. We’re excited to be a part of driving how the retail and fashion industries engage with consumers.” As this climate continues and shoppers continue to have decreased consumer confidence, focusing on the clothes and the experience that can be had without in-person shopping is more important than ever. Using augmented reality to elevate fashion in this time of social distancing allows for a rich, interactive experience for all users and customers, letting the color, texture, and life of garments come through.
Amid a global pandemic the solutions to some of our most basic problems need some creativity. With COVID’s continued presence in our lives, social distancing may have to continue into a time that is usually filled with parties, family gatherings and holiday festivities. People will be looking for ways to make new traditions, and to connect with their loved ones from afar.
Patrón needed a way to help customers connect despite holiday plans shifting across the country, while also maintaining their brand narrative. We worked with Patrón to create a first-of-its-kind digital wrapping as a special gift this holiday season, and beyond, to solve this specific problem. This experience provides a sentimental and original take on gifting alcohol as well as gives customers first-hand experience not just using augmented reality, but harnessing it to make something themselves.
How Does It Work?
Gifters of Patrón can use a microsite developed by ROSE to create a custom wrapping including a photo, text, and stickers that will transform into a 360-degree augmented reality (AR) gift wrapping around their Patrón bottle. This gives customers a chance to use this emerging technology in a new way that hasn’t been available in retail before.
“With COVID-19 impacting most celebrations this holiday season, we wanted to give customers a way to continue to celebrate with each other while social distancing,” Nicole Riemer, the art director on the project, said. “By creating a custom wrapping, customers can take the act of gifting alcohol from an easy one to a thoughtful one. During a time when you might not be able to gift in person, creating a custom wrapping with photos, stickers, and text provides that personal touch that is otherwise missing.”
Using WebGL in both 2D and 3D allows users to see their content change between dimensions in real time. Gifters can then use built-in recording and sharing technology to share the gift with the recipient as well as on social media.
By providing customers the ability to customize their gift of Patrón for both different occasions and gift recipients, we are showing them that Patrón isn’t the “mass brand” they think it is. This virtual gift allows distance to not be a barrier in creating something thoughtful that nurtures customers’ need for growing and maintaining their relationships.
“Creating these designs digitally allows for the process to be instantaneous and affordable, rather than waiting for something to get engraved or physically customized, without losing the ability to share that someone is thinking of you on social media,” Riemer said.
Why Use Augmented Reality?
Using augmented reality for this experience had several advantages. The most obvious is that it provides a sentimental gift without requiring anyone to enter a store or be in the same physical space as the recipient — helping maintain social distancing amid the pandemic. Additionally, augmented reality provides a way for users to generate their own content while maintaining the PATRÓN brand.
“The challenge with AR has always been figuring out how we can take new dimensions and connect them to the ones we’re familiar with in creative, expressive, and helpful ways,” Eric Liang, front-end/AR engineer on the project said. “The AR experiences that ROSE has previously created have each addressed that challenge by taking something important to us — something unseen or out of the ordinary that we wanted to showcase — and constructing it in the user’s world. This time, we’re handing the reins to the user. In this new collaboration, we’re letting users create and realize something that’s uniquely their own.”
Harnessing the power of AR will bring all the holiday cheer customers could be missing into the palm of their hand and inside their home — connecting people who want to be together this holiday. Additionally, PATRÓN has a history of creating limited-run packaging and bottles and this experience offers customers peak exclusivity with the ability to customize every individual bottle they purchase, so the virtual expansion of exclusive boxes was a natural progression for the brand.
In designing this web application, we identified two different types of users. As Patrón’s target demographic for this experience is 21–35, we were less concerned with the technological literacy of the user. Additionally, since this started as a concept that would be mainly pushed through social media, we were bound to attract younger users who would already be at least slightly familiar with augmented reality from exposure through Snapchat and Instagram. After determining this demographic information for our target user, the next question was what a user would want to create when using this tool. This led us to determine the following use cases:
Creator 1: The user who wants to create a really thoughtful collage that shows the recipient they spent time on it. They expect that their gift will be shown to others and potentially shared on social media, in a similar fashion to birthday posts.
Creator 2: The user who wants to create a quick gift that still wows the intended recipient. They want to expend minimal effort but get the same praise and reaction as someone who spends a lot of time on their creation.
To satisfy the need for a quick gift, we created “themes” that someone can choose from at the start of the experience, allowing them to upload a single photo and have a designed bottle in five clicks (including previewing their design). For those who want to spend more time on their creation, we provide the ability to start from scratch and choose the content that goes on every side of the bottle.
In choosing the predetermined content that users can apply to their digital bottles, we focused on a few things. The first was to choose assets that could be used for multiple occasions and holidays and were non-denominational. The second was to underscore the socially distant benefit of this gift and to encourage people to drink responsibly even when gatherings are not encouraged. The third was to make sure that the assets could be used in many combinations and still create a wrapping that looks high-end.
Once we determined the user experience and the content types that could be placed on the wrappings, we had to find a way to map the user’s content to a 3D bottle in real time, show the user their creation on this model before sending the augmented reality link to their recipient, and then ultimately render each individual experience in augmented reality.
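One common way to map a flat, user-designed image onto a bottle-shaped mesh is a cylindrical projection, where the horizontal texture coordinate wraps around the circumference and the vertical one runs up the bottle. A sketch of the idea (an illustration of the general technique, not ROSE's actual implementation):

```python
import math

def cylindrical_uv(x, y, z, height):
    # u wraps around the bottle's circumference; v runs from base to top.
    u = (math.atan2(z, x) + math.pi) / (2 * math.pi)
    v = y / height
    return u, v

# A point halfway up a 10-unit bottle, on the surface facing the viewer.
u, v = cylindrical_uv(0.0, 5.0, -1.0, height=10.0)
print(u, v)   # a quarter of the way around, halfway up
```

Each pixel of the user's 2D design then lands at a predictable spot on the bottle's surface, which is what lets the preview and the final AR render agree.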
How We Built This
The technical inspiration for this experience began with an understanding of how WebXR, the implementation of augmented reality in a web browser, operates. WebXR is the conceptual model of everything that exists in an extended reality scene: where each virtual object is, where light is coming from, where the “camera” stands and observes, how the user interacts and changes all of these things, and so on. Imagine closing your eyes and understanding where everything around you in the room is: your desk, the floor, a lamp, rays of sunlight coming through a window, even your own hands. Now open your eyes and actually observe those things. That’s what WebGL does. WebGL is the graphics engine that takes the theoretical model processed by WebXR and paints it on a screen, rendering the virtual existence of matter and light into visibility.
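As a toy illustration of that split — not the real WebXR or WebGL APIs — the “closed eyes” model just tracks where things are relative to the camera, and the renderer turns those 3D positions into 2D screen coordinates. The scene values and pinhole projection below are invented for illustration:

```javascript
// Toy model/renderer split. The "model" knows where objects sit;
// the "renderer" projects a 3D position into 2D screen space.
const scene = {
  camera: { x: 0, y: 1.6, z: 0 },                       // ~eye height, meters
  objects: [{ name: 'bottle', x: 0.5, y: 1.1, z: -2 }], // 2 m in front
};

// Simple pinhole projection: screen offset scales with focal length / depth.
function project(obj, camera, focalLength = 800) {
  const dx = obj.x - camera.x;
  const dy = obj.y - camera.y;
  const dz = obj.z - camera.z; // negative z = in front of the camera
  return {
    sx: (focalLength * dx) / -dz,
    sy: (focalLength * dy) / -dz,
  };
}
```

In the real stack, WebXR supplies the camera pose and surfaces, and WebGL performs this projection (and far more) on the GPU.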
While we wanted to capture the same magic of seeing something you create exist in 3D space, it was important that it would be accessible to everyone, both in terms of the technology and creativity. We wanted it to be usable from an everyday mobile device, without the need for expensive VR technology. We also didn’t want to require the user to be a painter, have an empty warehouse to dance around with VR goggles on, or have an intricate understanding of 3D sculpture or set design to maximize the reward of the experience.
There were a lot of moving parts that needed to be addressed. There needed to be a simple, intuitive interface for the user to customize their design and we needed to apply the design to a 3D model composed of a number of different materials and textures, from soft cork to clear pebbled glass to shiny metallic gift wrap. The experience needed to show that customized bottle back to the user in an interactive, attention-grabbing 3D experience. And finally, we needed to be able to scale the experience for a mass marketing campaign, which meant preparing for a large number of concurrent users with different devices and intents. We settled on technologies to address each of these challenges: a React/HTML Canvas microsite to design the wrapping, an 8th Wall/A-FRAME experience to view it, and a serverless API backend with cloud storage to support scale.
The next step was to source a 3D model of the bottle. We worked with a 3D artist and modeler, iterating over the model until each detail was as accurate as possible, and then continued to optimize our renders. This involved adjusting lighting through trial and error until we found the best setup to illuminate the bottle and make the glass and its reflectiveness as lifelike as possible, as well as customizing the physical material shaders for each node of the finalized model: the cork, the ribbon, the glass, the liquid, and the wrapping.
Later on, we realized that we needed a dynamic approach to the wrapping’s transparency. If the user chose to lay their graphics directly over the glass without using a background, those stickers, photos, and text would need to be opaque while leaving the glass transparent. The answer was taking the texture maps we generated with each user-created design and filtering them into black and white, so they effortlessly served double duty as alpha maps to control transparency.
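A minimal sketch of that double-duty trick, assuming RGBA pixel data (4 bytes per pixel, as read from a canvas) — the thresholding rule here is an illustrative simplification, not the project’s exact filter:

```javascript
// Derive a black-and-white alpha map from a user-generated texture.
// Pixels the user covered (non-transparent) become white -> opaque;
// untouched pixels become black -> the glass shows through.
function toAlphaMap(rgba) {
  const out = new Uint8ClampedArray(rgba.length);
  for (let i = 0; i < rgba.length; i += 4) {
    const v = rgba[i + 3] > 0 ? 255 : 0;  // threshold on source alpha
    out[i] = out[i + 1] = out[i + 2] = v; // grayscale mask pixel
    out[i + 3] = 255;                     // the map itself is fully opaque
  }
  return out;
}
```

The same design texture then drives both the color channel and, once filtered, the transparency channel of the wrapping material.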
While the experience would be accessible to everyone, we wanted those who had a Patrón bottle handy to be able to integrate it into the experience. It’s not yet feasible to use a real-life bottle of Patrón to anchor the experience, so we looked outside of the box — and settled on the actual box that each bottle of Patrón comes in. This gave us the opportunity to leverage 8th Wall’s image target feature, using the Patrón bottle image on the side of each box to trigger the dramatic emergence of the virtual bottle from the physical box.
Those without a box can watch the bottle appear on the plane they have placed it on in the experience. Adding some typical controls like pinch to zoom and finger rotation made it easy for the user to examine the bottle and the details of the design, and we added in 8th Wall’s Media Recorder capability to further boost the shareability of the experience.
As companies look ahead to a greener and more sustainable future, the concept of virtual wrapping and virtual packaging is likely to expand. As augmented reality moves from an emerging technology to an adopted one, user-generated AR content will take center stage, and experiences like this one will enable everyday users to create using AR technology. As all industries grapple with how to stay competitive and stay afloat, innovation is the answer to moving forward. This is the tip of the iceberg when it comes to what augmented reality can accomplish.
We are excited to continue innovating and bringing projects like these to life. We believe anyone can innovate and that process is vital amid the current economic landscape. Our passion for emerging technologies and augmented reality is immense and our work will only continue to reflect that. We’re looking forward to sharing more soon.
Ashley Nelson: Concept and Strategy, UX Copywriter
Eric Liang: Front-end/AR engineer
Eugene Park: Experience Design
Leonardo Malave: Back-end/AR engineer
Marie Liao: QA Engineer
Nicole Riemer: Concept and Strategy, Art Direction, and Experience Design
Yolan Baker: Project Manager
As a Black-owned business, the current state of the world has changed ROSE’s daily motions as a company. We view ROSE as a vehicle for improving the world, both in how we support each other internally, and the impact of the products we bring to life. We also acknowledge that as a Black-owned tech firm, we have an innate privilege with our platform. We have the ability to create change through technology, and with that privilege comes a deeply-rooted responsibility. Bail Out Network was our way of assisting in the fight for Black equality, without distracting from the injustices currently happening.
After the death of George Floyd sparked protests across the world, we found ourselves in conversations with the entire staff about how we could make a positive impact. We quickly saw the systematic arrest of protesters around the country, and the flood of donations to community bail funds that have been integral to fighting police overreach for decades.
We wanted to make locating these funds, and their donation portals, as easy as possible as police violence became visible on a daily basis. So we scoured the internet looking for every bail fund we could find, and any lawyers or law firms that vocalized their willingness to represent those arrested for protesting free of charge. The website launched in under 24 hours, on June 2, as a rapid response to what was happening across the country.
This project started as a way for us to collect bail funds in one place, for people looking to lend a hand in their communities as police began systematically targeting protesters. As more time passed, we wanted to expand how Bail Out Network could be utilized as a resource in the fight against police brutality and for equity for Black and brown bodies.
The site now has a collection of resources dedicated to helping the most marginalized communities — focusing on the Black Trans community, Black LGBTQ+ community, Black youth, Black incarcerated people and other groups that lack protection from this country’s institutions. The site remains open for submissions, and anyone who comes to the site with new resources can add them at any time.
We firmly believe, as should all people, that Black Lives Matter and the people protesting should not face punishment for doing what is right. As people continue to protest and fight for the destruction of systemic racism, demanding a complete overhaul of the United States policing system, they’re going to need more money, more resources, and more bodies to achieve these goals. We will continue to promote organizations that are protecting those who have been disenfranchised by the current political, economic, and social structures that exist within the fabric of the United States. This database will be updated as more resources become available.
Launched in June 2020, the idea for this project built upon Rose Digital’s history of using technology for good in times of public crisis (see also, Help or Get Help). Ashley Nelson, copywriter, originated the idea and identified the need within the current environment to connect those protesting with legal aid and easy access to resources. Nicole Riemer, art director, created the visual direction and UX of the site with Ashley writing the copy.
Bail Out Network will continue to consolidate resources that anyone can use to support the Black Lives Matter movement. For more information on ROSE, please visit builtbyrose.co.
Believe it or not, a few short months ago the main event dominating the news cycle wasn’t coronavirus, but the Presidential election. The Democratic primaries were different from years past, and not just because the number of candidates running could fill a small football field. One thing that stood out to our team was the record spending that occurred this election cycle. Discussions began to swirl around campaign finance specifically when Michael Bloomberg entered the race, funding his entire campaign with his personal fortune, and raising questions about what money should and shouldn’t buy while running for office. We began thinking about a way to contextualize the immensity of campaign spending through the language we speak best — technology. Those conversations and the desire to use technology to answer that question were the origin of Pay to Play. With primaries postponed and the race narrowed down to a single candidate from each party, we considered not releasing this experience.
However, with the new economic pressures on American families due to coronavirus and the current volatile international economy, we believed the relationship between money and politics was worth exploring. This project considers the disconnect between the monetary impact of the political process and the needs of everyday Americans.
The staggering amount of money spent by Democratic candidates in the 2020 election left us wondering how that money could have been spent on infrastructure and funding the platforms that those candidates had as part of their campaigns. We designed Pay to Play as a way to look back on the record amount of money spent by Democratic candidates that have ended their bids. We also included how much several Republican contenders in the 2016 presidential election spent on their campaigns as another comparison.
We designed this experience to visualize our internal discussions and the conversations happening in the U.S. during this tumultuous time, and in doing so we wanted to answer the question: “What else could we have done with that money?”
How Does It Work
Pay to Play was developed using 8th Wall as the hosting platform. As a web-native AR toolkit, 8th Wall allows anyone with a mobile device and an internet connection to access the comparative experience. Users can compare campaign spending amounts from the top seven Democratic candidates who spent the most on their presidential run, as well as the top seven Republican candidates from the 2016 presidential election. The experience has different “common good” filters, and each filter has been paired with a representative 3D object that fills the space with an appropriately scaled number of objects. With each selection, the data simultaneously updates in the upper left corner.
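The core of each “common good” filter is a simple conversion from dollars spent to renderable objects. The figures and scale ratio below are invented for illustration — the project’s real data and ratios may differ:

```javascript
// Convert campaign spending into a renderable object count.
// scaleFactor keeps the count within what a phone can draw
// (e.g. 100,000 real items per 1 virtual item on screen).
function objectCount(dollarsSpent, unitCost, scaleFactor = 100000) {
  return Math.round(dollarsSpent / unitCost / scaleFactor);
}

// Hypothetical: $1B spent, $2 per item, scaled 100,000:1
// yields a pile of a few thousand virtual objects.
const apples = objectCount(1000000000, 2);
```

The same function drives the stat readout: the unscaled quotient is what appears in the corner, while the scaled count is what actually falls from the sky.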
Why Use Augmented Reality
Using augmented reality for data visualization allows for emotional reactions from the user. This experience showcases the immensity of campaign spending by using cascading scaled objects that fill the user’s view, as though they could overflow from the screen at any moment. Because the experience was created using 8th Wall, decreasing file size and the number of objects rendered was important for optimizing load time. To speed up load time and allow for easier comparison, the number of objects was scaled. While AR can make data more manageable for users, it can also create emotional connections through hands-on participation with the product.
The Build, 3D Modeling, and Optimization
We found that the best way to offer an immersive extended reality experience, while still offering relevant information and options to a user, is to combine the XR portion with a heads-up display that lies on top without obstructing the view. As such, this project could immediately be divided into two parts: building the HUD and coding the 3D model portion. We used A-Frame, a 3D framework built on Three.js and HTML, to bridge the gap. By representing our 3D assets and behaviors as HTML, we were free to create our HUD in pure HTML and have it communicate and interact seamlessly with the A-Frame components.
We found that much of the challenge of this project was using AR in a way that was accessible to as many people as possible while still maintaining the core identity of the project — using numerical scale as a way to evoke a reaction from the user. Rendering any 3D model in a web browser can be an expensive operation. Rendering thousands of them would tax a smartphone’s hardware to the point of unusability. We ended up approaching this by leaning into the idea of scale: we didn’t need exacting detail if the idea was to overwhelm the user with a huge pile of items; we just needed enough to make it clear what each item was. So we selected simple models with fewer polygons, decimated their numbers of faces as low as we could, and reduced the resolution on their textures to minimize file size. The end result worked out — we had piles of apples that were clearly recognizable and deeply satisfying to watch cascade down from the sky.
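The rough memory math behind those optimizations is worth making concrete. The numbers below are illustrative, not measurements from the project: an uncompressed RGBA texture costs width × height × 4 bytes, so halving each dimension cuts texture memory to a quarter, and decimation shrinks geometry roughly in proportion to the faces kept:

```javascript
// Uncompressed RGBA texture memory: 4 bytes per pixel.
function textureBytes(width, height) {
  return width * height * 4;
}

// Decimation: keep only a fraction of the original faces.
function decimatedFaces(faces, keepRatio) {
  return Math.floor(faces * keepRatio);
}

// Halving each texture dimension -> 1/4 the memory per copy,
// which compounds fast when thousands of instances share it.
const before = textureBytes(2048, 2048); // 16 MB class
const after  = textureBytes(1024, 1024); // 4 MB class
```

Since every apple in the pile shares one texture and one geometry buffer, shrinking those assets once pays off across all the instances at once.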
Additional challenges came from the technologies we used to build the experience itself. Web AR platforms advance every day, but there are still severe limitations to their capabilities. For example, 8th Wall, the platform on which this experience runs, offers surface occlusion capabilities only for its Unity integration into native apps. For browser-based experiences that don’t yet have access to that plane detection technology, we have to emulate a floor by placing a vast invisible sheet at a defined distance below the camera. The distance to the “floor” is not dynamic and doesn’t change whether the user is sitting or standing, resulting in an imperfect representation of reality. This process only makes us more excited to see the next steps web AR will take, as the technology continues to improve and provide us with new and even more compelling ways to augment our reality.
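The emulated-floor workaround described above amounts to one assumption baked into a constant. The 1.6 m eye-height default here is a guess for illustration — and that guess is exactly why the result is imperfect when the user sits down:

```javascript
// Without plane detection, place a vast invisible plane at a fixed,
// assumed distance below the camera to act as the "floor."
// assumedEyeHeight is a static guess at standing eye height; it does
// not adapt if the user sits, kneels, or holds the phone low.
function floorY(cameraY, assumedEyeHeight = 1.6) {
  return cameraY - assumedEyeHeight;
}
```

True plane detection would replace the constant with a measured surface height, which is why its arrival in browser-based AR matters so much.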
The political process is often a complicated and convoluted one, and accessing data on campaign finance can be overwhelming. Conceptualizing how much candidates spend on their campaigns shows the immensity of American politics. By using AR, it becomes easier to visualize the power that the people funding these campaigns have, and raises real questions about the possibility of sweeping change if these funds were made available.
Jordan Long: Concept and Strategy
Nicole Riemer: Art Direction and Experience Design
Eric Liang: Experience Design and Development
A little over seven years ago, I started a business doing what I love — building apps. It started small. $50 WordPress project on Craigslist small. I didn’t know then that this dream of mine, building things, would turn into my life’s work.
Over the years I’ve spent building this company, I’ve learned by watching, reading and talking — but mainly by doing. By doing I mean burning my hand on all of the entrepreneurial hot stoves until I got things (mostly) right. I want to leave you with some lessons I’ve taken away in my seven years of blood, sweat, and tears at Rose. I hope something here inspires the next entrepreneur on the brink of something great.
Love the Game
You can’t survive the long desert of toiling for years unless you love what you’re doing. If you’re going to dedicate years of your life to something, you had better be sure you love it. You also have to be honest with yourself. If you fall out of love with what you’re doing, there is no shame in cutting losses and doing what you need to do to feel happy and be fulfilled. There have been times I wondered what I was doing and whether it was worth it, but I always returned to the tech and why I got started in the first place.
You Can’t Master All Trades
During the growth curve of Rose, there isn’t a job I haven’t done or a hat I haven’t worn. That wasn’t always for the best. As the firm grew, so did my “job description.” I became manager, HR Supervisor, Salesperson, Lead Engineer, Design Director, Chief Snack Selector, Janitor, and a million other roles in between. While that ethic of being flexible about what your “job” is on any given day is important, what I realized quickly was that by being a jack of all trades, I was a master of none. As a founder it can be hard to delegate and even harder to admit that you don’t know what you don’t know. Focusing the scope of your role as a founder and executive is key to the success of any endeavor.
Hire Awesome People
This one is simple. You have to hire the best people you can find. They will make your life easier and your company better. The trick is knowing how to find and evaluate people, and there is no easy way to do that because the target keeps moving at fast-growing companies. The right person for a job at 5 people might not be the right person for the job at 15. I’ve found that sharp, “roll up your sleeves” type folks can roll with the punches and adapt to fast-changing times.
Believe in Yourself
The first few years of my business I always felt like I was the kid in a room full of adults. This pushed me, but didn’t make certain conversations any easier. I had to learn not to hesitate to send invoices or to have hard client conversations, because my company was worthy of the work we were winning. Once you’re in the room, nothing else matters. You are in the room for a reason and you have to own that and take any opportunity that comes your way.
Becoming happens somewhere in the process of doing. The doubts about whether success will come, or whether you’re good enough — it’s in those moments that you find yourself and your business. There were many late nights, hard decisions, failures, and successes in the last seven years, but I would not choose to change any of it.