Case Study: Media Carousel
At Beats, I am the lead UX designer on the Product team within the cross-functional Digital Operations group, which consists of four closely aligned teams: Creative, Research + Analytics, Engineering and Product. Together, we build and maintain the Beats website – the company’s main hub for product marketing and support.
My job as lead UX designer is to work cross-functionally within the Digital Operations team to guide the design of new products and components and facilitate website builds and enhancements throughout the development lifecycle.
Beats by Dr. Dre
Analyzing user behavior, we compared the performance of BBD.com product pages with that of product pages on the Beats Amazon store. The study was part of a campaign to pivot away from storytelling toward an eCommerce experience. We found that users who interacted with the Amazon product carousel converted more often, and that those users spent most of their carousel time viewing product demos, unboxing videos and fit tutorials.
We knew we wanted to host more video on our product pages, but the existing site offered little support – the video player was feature-poor and unable to host multiple assets. It also lacked responsiveness on mobile, which accounted for 70% of BBD.com traffic. We needed to develop a brand-new component and deploy it within a single release cycle – just in time to show off a new earphone product for the holiday season.
As lead UX designer for the project, my job was to guide the design process, communicate with stakeholders and collaborate with the creative and engineering teams throughout the entire development lifecycle.
I owned the product, led design and research and created prototypes while working closely with frontend developers and engineers.
I started by working with my immediate team to perform competitive analysis. We gathered insights and inspiration from multiple sources and compiled our findings in a shared document. After coming up with a plan and presenting it to leadership, the next step was to develop the user story.
When creating user stories, I embrace the concept of vertical slicing. I break each story down into self-contained, actionable sections that describe the user interaction and expected behavior. I often include visual aids like wireframes and storyboards, highlight design specifications down to the pixel and sometimes include static or interactive prototypes using Adobe XD or even just plain HTML code.
Each user story is a living document meant to be iterated upon during the development cycle. The user story is both a guide and collaborative document where team members can comment and contribute.
Since the component is highly interactive, I created storyboards to describe the user flow. The storyboard is a supplement to a detailed description in the user story.
The storyboard shown below (fig. 1) illustrates the user interaction flow for both media types, video and image.
After the user story is initiated and the user flow is mapped out, I begin working on basic, low-fi wireframes that describe the design in more detail. I mark up the wires with design specifications and provide links to working prototypes that include measurements and basic CSS styles.
The wireframes shown below (fig. 2) illustrate the basic design specifications for desktop and mobile in the initial state of the component.
Fig. 1 - Storyboard Examples
Fig. 2 - Wireframe Examples
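To make the storyboarded interaction concrete, here is a minimal sketch of the carousel’s state model in plain JavaScript. All names and behaviors here are hypothetical illustrations of the flow described above, not the production implementation built by the engineering team:

```javascript
// Minimal carousel state model: tracks the active slide and pauses a
// playing video whenever the user navigates away from it.
// All names are illustrative, not the production component.
function createCarousel(slides) {
  let index = 0;
  return {
    current() {
      return slides[index];
    },
    next() {
      if (slides[index].type === 'video') slides[index].playing = false; // pause on navigate
      index = (index + 1) % slides.length; // wrap around to the first slide
      return slides[index];
    },
    prev() {
      if (slides[index].type === 'video') slides[index].playing = false;
      index = (index - 1 + slides.length) % slides.length;
      return slides[index];
    },
    play() {
      if (slides[index].type === 'video') slides[index].playing = true;
      return slides[index];
    },
  };
}

// Example: the two media types from the storyboard, video and image.
const carousel = createCarousel([
  { type: 'video', src: 'demo.mp4', playing: false },
  { type: 'image', src: 'fit-guide.jpg' },
]);
carousel.play();  // start the demo video
carousel.next();  // navigating to the next slide pauses the video
```

Keeping the state model this small is what lets the same flow drive both desktop and mobile layouts; only the presentation layer changes between breakpoints.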
At this point we begin the discovery phase, a one- to two-week process where we meet with the engineering team to assess feasibility and provide clarity where needed. The main purpose of the discovery phase is to align with the engineering team before handing off the project for coding and deployment.
Coding + Deployment
After the discovery phase is complete and the user story is handed off to the engineering team, coding and deployment begins. During this time, I stay in close contact with the engineering team to clarify any questions and iterate on the user story, wires and overall design.
Once coding is finished, the component is deployed to the testing environment, where it’s thoroughly tested by our internal product team. The testing phase lasts up to three weeks, during which the component is updated and refined.
Implementation + Analytics
Finally, the component is deployed to the production environment and implemented on the live site. In this case, the media carousel launched with a newly designed product page for our Beats Fit Pro earbuds and, shortly after, was added to other product pages.
Working with the Research + Analytics team, we built analytics functionality into the component itself, which lets us track user events: which assets were viewed and in what order, how much time was spent with the component, and the full user journey from component interaction to conversion. We found that most users interacted with the new component when viewing the page and converted at a higher rate. Lastly, we A/B tested the new product page against the old one and found substantially more traffic on the new page along with a higher likelihood of conversion.
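The event tracking described above can be sketched roughly as follows. The event names, payload shape and helper functions are assumptions for illustration only, not the production analytics schema:

```javascript
// Hypothetical carousel analytics: records which assets were viewed,
// in what order, and the elapsed time from first interaction to conversion.
// Event names and payload shapes are illustrative only.
function createTracker(now = Date.now) {
  const events = [];
  return {
    assetViewed(assetId, position) {
      events.push({ event: 'carousel_asset_viewed', assetId, position, ts: now() });
    },
    conversion(productId) {
      events.push({ event: 'conversion', productId, ts: now() });
    },
    // Time from the first carousel interaction to conversion, in ms,
    // or null if either event is missing.
    timeToConversion() {
      const first = events.find(e => e.event === 'carousel_asset_viewed');
      const conv = events.find(e => e.event === 'conversion');
      return first && conv ? conv.ts - first.ts : null;
    },
    log() {
      return events;
    },
  };
}

// Example session: a user views two assets, then converts.
const tracker = createTracker();
tracker.assetViewed('unboxing-video', 1);
tracker.assetViewed('fit-tutorial', 2);
tracker.conversion('beats-fit-pro');
```

In practice these events would be forwarded to the analytics platform rather than held in memory, but the ordering and timestamps captured here are what make the viewed-assets sequence and time-to-conversion analysis possible.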
The media carousel was developed in response to a need for more product-focused videos. Research had discovered that users were gaining insight into our products by watching product demos, unboxing videos and fit tutorials. Designing a robust component that could host video along other media types ensures that we are able to meet that need now and in the future, as we add more and more engaging content to the site.
Another important aspect of the media carousel was localization. Our team manages 27 sites translated into 16 languages. One feature we lacked was the ability to render live subtitles for videos. Instead, we relied on the production team to deliver assets with baked-in subtitles, which did not meet our user accessibility standards and often led to production delays.
To address this, I proposed an enhancement to support VTT files, which render live subtitles, and the feature was launched concurrently in the same release cycle. It gave us more control over translated video content, enabled us to meet accessibility standards for deaf and hard-of-hearing users and added much-needed screen-reader support for video content across BBD.com.
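For illustration, a WebVTT file is just a plain-text list of timed cues; the cue text and timings below are invented, but the format is what the enhancement supports:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Press and hold the button to pair your earbuds.

00:00:05.000 --> 00:00:08.500
Twist each earbud until the wingtip locks in place.
```

Because the cues live in a separate text file attached to the video (in HTML, via a track element with kind="subtitles"), each locale only needs a translated VTT file rather than a re-rendered video asset, which is what removes the production bottleneck described above.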