Best Picks Template Re-Design Case Study
& Choosing a Methodology
Best Picks pages are the highest revenue-generating pages across all of Purch's decision-intent sites, but their performance was not meeting expectations. Many of the modules designed to drive traffic to other parts of the site or to retailer pages were not converting. Because purchase decisions are nuanced, we sought to understand what kinds of information users look for at different points in their journey, how they engage with our content, and how we can instill trust in new visitors (87% of traffic) so they can confidently make purchase decisions based on the information we provide.
Moderated Usability Test
Establishing Project Goals
Aligning on objectives with stakeholders
At the outset of the project, I met with stakeholders across our product, engineering, editorial, SEO, and sales teams to align on what they wanted the new Best Picks template to achieve and to gain insight into the limitations of the current template from their perspective. In a working session, every stakeholder wrote the project goals most important to them on post-it notes, and then each person had three votes to select the most important goals. The goals with the most votes were grouped by theme.
Trust & Credibility
Establish editorial authority and trust with users by demonstrating that in-house experts actually test the products being reviewed and provide unbiased reviews
Create a memorable review experience that increases the likelihood that users would arrive on our page through direct traffic or a branded search (i.e. Tom's Guide Dell XPS review)
Revenue Generation + Conversion
Create an experience that provides users with the information they need to make a confident and informed purchase decision through our site
Since this template needs to support over 400 product categories, we need to provide users flexibility in how they engage with our content
Support Related Content
Differentiate our discovery experience by anticipating user reactions to certain aspects of a product and ensuring that we offer an alternative that may better suit their needs, enhancing trust and keeping users on our sites
Moderated Usability Test
Measure the user experience of the partially implemented Best Picks template in terms of our experience quality metrics (user confidence, trustworthiness, decision enablement, etc.)
Understand how users interact with the various elements on the page including, but not limited to, the comparison table, best picks summaries, table of contents, and buy buttons.
Assess how we might enhance the page in order to better align with how users evaluate a purchase decision
Define which aspects of the redesigned template will make the most impact on the user experience and prioritize implementation accordingly.
Why a Moderated Usability Test
Given that our goals were to understand what underpins user behavior, sentiment, and their overall decision-making process, it was important that I had an opportunity to not only understand what users were doing, but also why they were doing it and the emotional dimension of their experience. A moderated usability test would allow me to observe users interacting with our pages in real time and probe into the reasoning behind their reactions.
As decision-making journeys vary greatly across categories, I chose two drastically different categories to gauge how well the template supports different product types.
16 Total: 11 Female, 5 Male
Participants have purchased or considered purchasing the item being reviewed (a laptop or pressure cooker) in the last year
Familiarity with popular review sites within each product category and a history of leveraging review sites to inform a purchase decision
"Imagine that you are in the market for a laptop/pressure cooker. You go on Google to search for "Best laptops/pressure cookers 2018" and land on this page. Spend a few minutes exploring the page to find the product that best suits your needs and stop when you have exhausted the contents of the page or feel you would go somewhere else to continue your research."
Study Findings & Design Solutions
Pages Lacked Key Decision Enablement Info
The page lacked key information types that are considered essential to making an informed purchase decision, such as alternative retailer prices, customer reviews, and photography that mapped back to features, highlights, or drawbacks of the product.
Comparison Table Interactivity
Although participants found the comparison table helpful, it was cumbersome to find the attributes that mattered most to them. The ability to interact with the table in order to organize the data in a way that reflects their needs would make the data more actionable for them.
Content Reinforcing Trust Was Inaccessible
Trustworthiness and user confidence were negatively impacted by the inaccessibility of the information in the “why trust us” section, which most participants weren't exposed to until they were prompted to look at it.
Orienting Users to What Is on the Page
Navigation to content beneath the comparison table was hindered by a lack of indication of what was contained on the page as well as by the placement of the comparison table.
Mental Models & User Journeys
After completing the moderated usability study, the design team and I began to construct the different mental models of users that the design solution needed to support. We intentionally abstained from creating personas out of study participants because we had not yet had a chance to complement our findings with larger-scale surveys to solidify certain inferences we had made about our users. Instead, we ascribed each finding to its corresponding mental model to paint a more granular picture of how users might interact with each of the current and future modules on the page.
This allowed us to address detailed questions in our design solution such as:
Buy Button Optimizations: How can we help users to understand why we featured a particular retailer/price in order to enhance trust and increase the likelihood that they will feel confident in purchasing through our sites?
Depicting Total Cost of Ownership in Comparison Tables: How do we communicate to users the true cost of owning a product for product categories such as GPS trackers, exercise bikes (Peloton), or other categories where the base product isn't reflective of the total cost of owning a product?
Comparison Table Attributes: How can we communicate specs in a way that is easy to digest and actionable?
Social Proof: How can we aggregate customer reviews in a way that is trustworthy and actionable? What are the different ways we can visualize and organize them?
Crazy 8s (Design Studio) Sketching Session w/ Design Team and Stakeholders
After I shared the usability test results, I facilitated a session during which each stakeholder had an opportunity to sketch eight different design solutions and then vote on the designs that resonated with them the most. This provided an opportunity for stakeholders to participate in the design process and voice their ideas, concerns, and priorities.
Once my fellow designers and I aligned on research findings and stakeholder sentiment, we moved into the visual design phase. At this point, I collaborated very closely with our UI designers to ideate on the different approaches we could take to the various design and technical challenges we faced to create the solutions that would enhance the Best Picks user experience based on our usability test findings and analytics data.
Some of the agreed upon enhancements included the following:
Surface customer reviews
Include alternative retailer prices alongside our traditional buy buttons that highlight recommended retailers
Include videos and photos within product blocks so that users have access to more interactive and engaging content without having to go to the full review.
Allow users to sort the table based on a particular attribute such as price, screen size, etc. The table could feature arrows or some other UI elements that users can click/tap on in order to sort the table based on a particular attribute.
Make the use case labels visible to users as they are analyzing the contents of the comparison table.
Filtering and sorting by individual sub-ratings (design, battery life, etc.) or specs to allow users to find products that meet specific criteria or are best suited for a particular use case
Strategically placing the "Why Trust Us?" section in a place that increases the likelihood that users will encounter it while not obstructing review content they came to the site to consume.
Study participants were less likely to trust pricing we provide if we only featured one retailer due to trust concerns around affiliate sites. In order to increase transparency and enhance trust, we included pricing from multiple retailers.
Study participants occasionally came across jargon they didn't understand, which inevitably frustrated them and limited the utility of the comparison table. As users will have varying degrees of familiarity with specs as well as our testing metrics, we included a tooltip to educate users on unfamiliar terms and jargon.
Although users and study participants valued our rankings, they wanted to leverage the comparison table as a tool to rank products based on the criteria that mattered most to them, so we included quick filters specific to user needs/use cases for each product category.
Participants often struggled to understand certain specs, especially for categories such as kitchen appliances or gas grills. In order to improve comprehension and reduce cognitive load, we started exploring ways we could re-frame measurements in instances where the unit is not common knowledge.
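One way to picture this re-framing is a small helper that translates a raw spec into an everyday equivalent. The pressure-cooker example and the one-serving-per-quart heuristic below are assumptions for illustration, not the team's actual conversion rules.

```typescript
// Hypothetical unit re-framing: pair the raw spec with a plain-language
// equivalent. The ~1 serving per quart heuristic is illustrative only.
function reframeCapacity(quarts: number): string {
  const servings = Math.floor(quarts); // rough rule of thumb, not a product claim
  return `${quarts} qt (about ${servings} servings)`;
}

const label = reframeCapacity(6);
```

Keeping the original unit alongside the translation means expert users still get the exact spec while less familiar users get something they can reason about.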
During usability tests, most users pointed out the publishing date and cited it as an important factor influencing trust. The updates timeline provides users with a way to easily see how the review has evolved and builds trust.
In addition to including pros/cons, customer reviews were included in order to encompass a critical part of the decision-making journey into an expert review.
Seeking out social proof was undoubtedly one of the most important parts of the decision-making journey for most participants (across multiple studies). Including customer reviews on our pages reduces the need to leave our page to find this critical information.
Study participants often didn't get past the comparison table to the product blocks detailing our top ranked products. In order to combat this, we modified the table to start out in a collapsed state, but provided affordances to users that it could be expanded. Since most users were only going to consider the top three to four products, we felt comfortable choosing not to display this information upfront.
Results and Retrospective
We took a phased approach to the re-design based on a number of technical dependencies. Based on usability test results and analytics data, we prioritized the highest impact enhancements that did not have any technical dependencies with the product team. Overall, I have been very pleased with the outcomes of the Best Picks project. My team and I were able to successfully incorporate departments with very different KPIs into the user-centered design process rather seamlessly and had fun while doing it too.
What Has Been the Impact Thus Far? (3 months)
Increased conversions (buy button clicks) on Best Picks pages by 250%
Average time on page has gone from 45 seconds to 1 minute and 53 seconds.
Related product click-throughs have increased by nearly 180%