The age of the internet has led to a major shift in the world of retail. More and more, consumers treat physical stores as showrooms, browsing and testing products in person, then returning to the web to make their final buying decisions. While online warehouse giants absorb an increasing percentage of the market share, brick-and-mortar stores are developing creative new strategies to offset operational costs and complement omni-channel strategies.
Because brick-and-mortar stores now serve equally as marketing extensions and points of sale, it’s increasingly important for the physical store to provide satisfying, polished, or unique customer experiences that increase brand awareness, customer loyalty, and return on investment. Presentation matters.
Now, there is a new player in town that will shift the way products are presented and consumed. Augmented reality will not only change digital marketing and brand experience as we know it today, but also the way businesses build and deliver their products and services.
We decided to explore some common retail use cases to get the juices flowing and to familiarize ourselves with some of the industry’s toughest challenges.
To maximize sales, retailers utilize 2D planogram software to design data-driven product displays and store layouts. But while generic designs can be generated for corporate-level sales initiatives, they don’t account for local buying behaviors. Localized planogram designs require time and personnel to maintain and implement effectively. Likewise, manual inspections of merchandising criteria and planogram validations are an ongoing resourcing challenge.
At a store level, retailers face changing consumers (less loyal and more selective), changing demographics, increased out-of-stocks, intensified competition, blurred channels, eroding margins, and reduced merchandising effectiveness. Better-informed, local planograms mean upfront investment and increased resourcing requirements.
With xR visualization tools, retail planogrammers can reference historical and forecast sales data to inform product displays and store configurations as they design. Creating and editing planograms in real 3D space removes the abstraction from the customer’s buying journey, saving time, money, and unnecessary confusion.
Visual merchandisers can reference these plans in the context of local, in-store fixtures, annotating and resolving discrepancies. Real-time collaboration helps retailers and merchandisers balance local inventory demands and fixture specifications, cutting down delays in communication and corrective action.
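To make the data-driven side of this concrete, here is a purely illustrative sketch (none of these names come from the actual 8ninths toolchain) of how local forecast data might drive a planogram decision: allocating a shelf’s fixed number of facings across SKUs in proportion to each store’s forecast demand.

```python
from dataclasses import dataclass

@dataclass
class Sku:
    name: str
    forecast_units: int  # forecast weekly sales for this store

def allocate_facings(skus, total_facings):
    """Split a fixed number of shelf facings across SKUs in
    proportion to local forecast demand (largest-remainder method)."""
    total_demand = sum(s.forecast_units for s in skus)
    shares = [(s, s.forecast_units * total_facings / total_demand) for s in skus]
    facings = {s.name: int(share) for s, share in shares}
    # Hand leftover facings to the SKUs with the largest fractional remainders.
    leftovers = total_facings - sum(facings.values())
    for s, share in sorted(shares, key=lambda p: p[1] - int(p[1]), reverse=True)[:leftovers]:
        facings[s.name] += 1
    return facings

shelf = allocate_facings(
    [Sku("runner", 120), Sku("trail", 60), Sku("court", 20)],
    total_facings=10,
)
print(shelf)  # {'runner': 6, 'trail': 3, 'court': 1}
```

Two stores with different local demand would get different facing counts from the same corporate assortment, which is exactly the localization that manual planogramming struggles to keep up with.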
At 8ninths, our process begins with discovery—investigation into the industry and competitive landscape, user groups and pain points for our target market. The 8ninths Retail Experience is intentionally aimed at a wide user pool to demonstrate a variety of use cases that are interrelated, and often, interdependent. Retailers work closely with visual merchandisers who may be consulted by manufacturers to more effectively design products and packaging to compete with others on the shelf. Our goal was to nod to direct benefits for each of these three groups.
User Research & Narrative Flow
In research and interviews with retailers and data insights experts, we mapped out priorities for application features. Examining a variety of user scenarios for retailers, merchandisers and manufacturers, we identified common ground in a single user flow that could highlight direct benefits for each user group. The retail planogrammer’s user flow provided an excellent format to express data integration, collaboration, annotation, and editing in 3D space. Narrowing to this structure helped to inform and focus the design as it moved forward into the next phase of interaction exploration and rapid prototyping.
Following true agile methodology, our team split to conquer incremental tasks in carefully planned sprints. Storyboards evolved quickly, from whiteboards to paper to whiteboards again, overlaid with annotations of gaze, voice, and gesture mechanics. With fundamental interaction rules locked, we proceeded into development, and rough UI moved into production for rapid prototyping in 3D world space. All the while, our 3D artists began modeling our miniature store, athletic shoes, display wall, and shelves, optimizing for the best fidelity and lightest weight possible for our HoloLens build.
Voice Over Narration & Guidance
With execution of the environment, objects, and user controls underway, design turned back to the scripted storyboards to layer in a plan for voice-over narrative. Our first release was slated for the HoloLens platform. Knowing that the majority of our target audience would be new to the technology, we were careful to incorporate guidance for inputs (gaze, air tap, and air tap and hold) as they related to required user tasks, plus backup cues for troubleshooting.
The narrative sets the scene of the problem space, and invites the viewer to take control of the problem solving. As viewers explore, they are empowered—by data insights and the flexibility of working in 3D with hyper-realistic holograms—to build a beautiful, optimized store display.
UI Design Patterns
One of the ongoing questions in the xR space is the role and presentation of UI. In our time demoing this technology, we’ve seen users new to VR and AR struggle with even the earliest tasks of basic interaction and navigation. Distinguishing 2D from 3D elements gave us a clear classification for holograms as either standard control mechanisms or discoverable objects for interaction.
I tested a variety of UI styles in the lens, and the more complex the UI—the more skewed with 3D nuances—the more we confused our users. So instead of a 3D box, the inventory feature, for example, sits cleanly in a frame with a leaderboard of data representing each object. We specifically chose to mix the metaphor here, serving up 3D shoes next to the 2D data to cue the user to its direct physical relationship to the scene. The inventory from the “back of store” is literally available at your fingertips with augmented and virtual reality.
Narrative Guidance & Usability
Voice-over narration tested well with users, familiarizing them with the problem space and the unique benefits of xR capabilities. But once users had learned the basic interactions, their attention drifted from subsequent storytelling and interaction cues.
To address this in our next iteration, we’ve designed a system that prioritizes visual over auditory cues for required user inputs. Dynamic gaze indicators can point the user toward any required focal points, allowing more downtime between narrated clips for the user to take in information and play. To further empower the user, and allow for a customized, organic experience, we’re developing a non-linear format for feature exploration. Users will access new information at their own pace, and hopefully come away with a better understanding of the feature set and its implications.
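The gaze-indicator logic above can be sketched in a few lines. This is an illustrative approximation in Python (a real HoloLens build would implement it in Unity/C#): suppress the cue when the target is already near the center of the user’s view, otherwise report which way to turn.

```python
import math

def gaze_cue(head_pos, head_forward, target_pos, fov_deg=35.0):
    """Return None if the target is near the center of view,
    otherwise the signed yaw angle (degrees) to turn toward it."""
    to_target = tuple(t - h for t, h in zip(target_pos, head_pos))
    # Work in the horizontal (x/z) plane for a simple left/right cue.
    fx, fz = head_forward[0], head_forward[2]
    tx, tz = to_target[0], to_target[2]
    yaw = math.degrees(math.atan2(tx, tz) - math.atan2(fx, fz))
    yaw = (yaw + 180) % 360 - 180  # wrap to [-180, 180]
    return None if abs(yaw) <= fov_deg / 2 else yaw

# Target straight ahead: no indicator needed.
print(gaze_cue((0, 0, 0), (0, 0, 1), (0, 0, 5)))  # None
# Target off to the right: positive yaw (~90 degrees) means "turn right".
print(gaze_cue((0, 0, 0), (0, 0, 1), (5, 0, 0)))
```

In practice the returned yaw would drive an arrow or highlight at the edge of the display, fading out as the user turns toward the focal point.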