Product Review Template Re-Design Case Study
Purch (now part of FUTURE) reviews products across more than 400 categories. As part of the site's overhaul, we sought to re-imagine the product review experience to accommodate consumers at different phases of the decision-making process. To understand how our current template performed against competitors, I ran a moderated usability test to establish a baseline for our templates and to build a nuanced understanding of how users actually want to interact with review content, so the new template would align with how they evaluate purchase decisions.
Moderated Usability Test & Choosing a Methodology
Establishing Project Goals
Aligning on objectives with stakeholders
At the outset of the project, I met with stakeholders across our product, engineering, editorial, SEO, and sales teams to align on what they wanted the new Product Review template to achieve and to gain insight into the limitations of the current template from their perspectives. In a working session, each stakeholder wrote the project goals most important to them on Post-it notes, and then everyone cast three votes for the goals they considered most important. The top-voted goals were grouped by theme.
Ensure the template includes robust modules for nuanced product categories, but can be scaled back for simpler product categories without forcing editors to fill out unnecessary fields or create workarounds in the CMS
Ensure that there are opportunities to improve the template over time in order to increase sales of sponsored templates
Provide contextually relevant recommendations based on where the user is on the page and how the reviewed product performed in that subrating.
Robust, Meaningful Data
Provide users with all the data they normally seek out before making a decision in a digestible yet comprehensive way, including customer reviews, retailer prices, and spec comparisons.
Moderated Usability Test
Compare how users react to our product review experiences versus competitors across a multitude of categories to understand the varying needs the template must meet across categories
Identify the key drivers of delivering a pleasant and trustworthy product review experience, which includes the information types that resonate with users, their placement on the page, and how to deliver that information in a way that aligns with users' decision-making paradigm
Gain insight into how users interact with and value the various elements on the page, including, but not limited to, the spec charts, scorecard containers, table of contents, and buy buttons
Why a Moderated Usability Test?
Given that our goals were to understand what underpins user behavior, sentiment, and their overall decision-making process, it was important that I had an opportunity to not only understand what users were doing, but also why they were doing it and the emotional dimension of their experience. A moderated usability test would allow me to observe users interacting with our pages in real time and probe into the reasoning behind their reactions.
As decision-making journeys vary greatly across categories, I chose three drastically different categories in order to gauge how well the template supports different product types
23 Participants Total: 13 Female, 10 Male
Participants have purchased or considered purchasing the item being reviewed (laptop, vacuum cleaner, or auto insurance) in the last year
Familiarity with popular review sites within each product category and a history of leveraging review sites to inform a purchase decision
Imagine that you are in the market for a laptop/vacuum cleaner/auto insurance. You start your search on Google and read a few articles highlighting the most popular products in that category. After noticing that the Razer Blade 15/Dyson Animal 2/Nationwide auto insurance appeared in multiple reviews, you do a Google search for "Razer Blade 15/Dyson Animal 2/Nationwide auto insurance review" and click on this result.
Spend 2-3 minutes exploring the page as if you were evaluating this product based on your own criteria. Feel free to click anything on the page as you normally would, but try to find what you are looking for within the page first. When you have exhausted the content or feel like you would go elsewhere, please stop and let me know.
Study Findings & Design Solutions
High Quality Review Summaries Were Critical to Further Engagement With the Content
Participants sought out and expected summary level data such as pros, cons, or verdicts immediately to assess whether or not they should invest any additional time evaluating the product reviewed. In particular, they were looking for “deal breakers.”
Evaluation Tools to Complement Content
Although participants found the editor’s insights and hands-on experience with the product useful, they wanted to be able to assess the product on their own terms through digestible spec sheets, comparison tools, aggregated customer reviews, and graphs showing how the product performs compared to other well-known or high-performing products in its category.
Contextually Educating Users Throughout the Review
The participants had varying degrees of familiarity with the product category and, as a result, sometimes came across terms they did not understand. Providing an accessible way to clarify jargon helped alleviate confusion and made participants more willing to engage with the rest of the content.
Pricing Options Enhance Trust in Buy Buttons
Participants were often skeptical of buy buttons that were not accompanied by additional prices reinforcing that the recommended option was the best and therefore trustworthy. Nearly all participants mentioned that they regularly Google the product name to find unbiased pricing information. Mimicking this experience on-site could potentially deter users from seeking that information out on Google.
Mental Models & User Journeys
After completing the moderated usability study, the design team and I began to construct the different mental models of users that the design solution needed to support or accommodate in some way. We intentionally abstained from creating personas out of study participants because we had not yet had a chance to complement our findings with larger-scale surveys to solidify certain inferences we had made about our users. Instead, we ascribed each finding to its corresponding mental model to paint a more granular picture of how users might interact with each of the current and future modules on the page.
This allowed us to address detailed questions in our design solution such as:
Visualizing Price:Quality Ratio: How can we go beyond simply providing specifications and ratings and instead provide meaningful context around whether the reviewed product performs well enough for its price point? How can we empower users to easily identify viable options to evaluate further on our site?
Social Proof: As we established that users will reference multiple sites to identify trends in ratings, how can we thoughtfully surface peer expert reviews in order to decrease the need to reference other reviews? Additionally, how can we surface customer reviews in a way that is more actionable than what is found on Amazon?
Contextual Suggestions: Users typically only engage with the subsections of the review that matter to them (battery life, for example). In the event the reviewed product performed poorly on a given metric, how can we serve up a recommendation that does meet that need without deterring users from considering the product at all?
Price Comparison: How can we display pricing data in a way that users trust and adds value beyond simply showing them the cheapest price available?
Crazy 8s (Design Studio) Sketching Session w/ Design Team and Stakeholders
After I shared the usability test results, I facilitated a session during which each stakeholder had an opportunity to sketch eight different design solutions and then vote on the designs that resonated with them the most. This gave each department an opportunity to participate in the design process and voice their ideas, concerns, and priorities.
Once my fellow designers and I aligned on the research findings and on stakeholder sentiment about how we might approach certain challenges and opportunities, we moved into the wireframing phase. At this point, I collaborated closely with our UI designers to ideate on the different approaches we could take to the design and technical challenges we faced, creating solutions that would enhance the Best Picks user experience based on our usability test findings and analytics data.
Some of the agreed upon enhancements included the following:
A more robust summary section that highlights pros/cons and a verdict, as well as, in some cases, who the product is most suitable for
Surface customer reviews and peer publication ratings
Include alternative retailer prices alongside our traditional buy buttons that highlight recommended retailers and display how many users chose each one (social proof)
Visualize the price:quality ratio and provide users with an opportunity to easily identify alternatives based on overall performance or on the metrics that matter most to them
Provide contextual recommendations based on the section of the article the user is engaging with
Results and Retrospective
Fortunately, due to the highly collaborative nature of our process, there was almost no pushback during the design review when we presented mockups for the new product review template to product stakeholders. The new template provides editors with robust new tools to showcase the products they review, gives users access to nearly all the data they seek out during their decision-making process, and is flexible enough to accommodate our current product reviews with little to no modification. However, due to FUTURE's acquisition of Purch, most active projects have been put on hold until further notice.