Product Review Template Re-Design Case Study

Project Background

Purch (currently FUTURE) reviews products across over 400 categories. As part of the site's overhaul, we sought to re-imagine the product review experience to accommodate consumers at different phases of the decision-making process. To understand how our current template performed relative to competitors, I ran a moderated usability test to establish a baseline for our templates and to build a nuanced understanding of how users actually wanted to interact with review content, so that the new template would align with how they evaluate purchase decisions.

My Contributions

Moderated Usability Test


Facilitated Ideation 



Establishing Goals & Choosing a Methodology

Establishing Project Goals

Aligning on objectives with stakeholders 

At the outset of the project, I met with stakeholders across our product, engineering, editorial, SEO, and sales teams to align on what they wanted the new Product Review template to achieve and to gain insight into the limitations of the current template from their perspective. We held a working session in which every stakeholder wrote the project goals most important to them on post-it notes, and each person then had three votes to select the most important goals. The highest-voted goals were grouped by theme.

Content Strategy

Ensure the template includes robust modules for nuanced product categories, but can be scaled back for simpler product categories without forcing editors to fill out unnecessary fields or create workarounds in the CMS

Sales Performance

Ensure that there are opportunities to improve the template in ways that further increase sales of sponsored templates


Contextual Recommendations

Provide contextually relevant recommendations based on where the user is on the page and how the product performed in that subrating.

Robust, Meaningful Data

Provide users with all the data they normally seek out before making a decision in a digestible, yet comprehensive way, including customer reviews, retailer prices, and spec comparisons.

Moderated Usability Test 


  • Compare how users react to our product review experiences versus competitors' across a multitude of categories in order to understand the varying needs the template must meet

  • Identify the key drivers of delivering a pleasant and trustworthy product review experience, which includes the information types that resonate with users, their placement on the page, and how to deliver that information in a way that aligns with users' decision-making paradigm

  • Gain insight into how users interact with and value the various elements on the page, including, but not limited to, the spec charts, scorecard containers, table of contents, and buy buttons

Why a Moderated Usability Test?

Given that our goals were to understand what underpins user behavior, sentiment, and the overall decision-making process, it was important to understand not only what users were doing, but also why they were doing it, along with the emotional dimension of their experience. A moderated usability test would allow me to observe users interacting with our pages in real time and probe into the reasoning behind their reactions.

Study Design

Pages Tested

As decision-making journeys vary greatly across categories, I chose three drastically different categories in order to gauge how well the template supports different product types:

  • Laptops

  • Vacuum Cleaners

  • Auto Insurance


Participant Profile

  • 23 total: 13 female, 10 male

  • Participants have purchased or considered purchasing the item being reviewed (laptop, vacuum cleaner, or auto insurance) in the last year

  • Familiarity with popular review sites within each product category and a history of leveraging review sites to inform a purchase decision



Task Scenario

Imagine that you are in the market for a laptop/vacuum cleaner/auto insurance. You start your search on Google and read a few articles highlighting the most popular products in that category. After noticing that the Razer Blade 15/Dyson Animal 2/Nationwide auto insurance appeared in multiple reviews, you do a Google search for "Razer Blade 15/Dyson Animal 2/Nationwide auto insurance review" and click on this result.

Spend 2-3 minutes exploring the page as if you were evaluating this product based on your own criteria. Feel free to click anything on the page as you normally would, but try to find what you are looking for within the page first. When you have exhausted the content or feel like you would go elsewhere, please stop and let me know.


Study Findings & Design Solutions

Key Findings

High Quality Review Summaries Were Critical to Further Engagement With the Content

Participants sought out and expected summary level data such as pros, cons, or verdicts immediately to assess whether or not they should invest any additional time evaluating the product reviewed. In particular, they were looking for “deal breakers.” 

Evaluation Tools to Complement Content

Although participants found the editor’s insights and experience handling the product useful, they wanted to be able to assess the product on their own terms through digestible spec sheets, comparison tools, aggregated customer reviews, and graphs highlighting how the product performs compared to other well-known or high-performing products in that category.

Contextually Educating Users Throughout the Review

The participants had varying degrees of familiarity with the product category and, as a result, sometimes came across terms they did not understand. Providing an accessible way to clarify jargon helped alleviate confusion and made participants more willing to engage with the rest of the content.

Pricing Options Enhance Trust in Buy Buttons

Participants were often skeptical of buy buttons that were not accompanied by additional prices reinforcing that the recommended option was the best, and therefore trustworthy. Nearly all participants mentioned that they regularly Google the product name to find unbiased pricing information. Mimicking this experience on-site could potentially deter users from seeking this information out on Google.

Mental Models & User Journeys

After completing the moderated usability study, the design team and I began to construct the different mental models of users that the design solution needed to support or accommodate in some way. We intentionally abstained from creating personas out of study participants because we had not yet had a chance to complement our findings with larger-scale surveys that would solidify certain inferences we had made about our users. Instead, we ascribed each finding to its corresponding mental model to paint a more granular picture of how users might interact with each of the current and future modules on the page.

This allowed us to address detailed questions in our design solution such as:

  • Visualizing Price:Quality Ratio: How can we go beyond simply providing specifications and ratings and instead provide meaningful context around whether the product reviewed performs well enough for its price point? How can we empower users to easily identify viable options to evaluate further on our site?

  • Social Proof: As we established that users will reference multiple sites to identify trends in ratings, how can we thoughtfully surface peer expert reviews in order to decrease the need to reference other reviews? Additionally, how can we surface customer reviews in a way that is more actionable than what is found on Amazon? 

  • Contextual Suggestions: Users typically only engage with the subsections of the review that matter to them (battery life, for example). In the event the product reviewed performed poorly on a given metric, how can we serve up a recommendation that does meet that need without deterring users from considering the product at all?

  • Price Comparison: How can we display pricing data in a way that users trust and adds value beyond simply showing them the cheapest price available? 
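The price:quality question above could be prototyped with a simple normalized value score, where each product's rating is divided by its price and scaled against the best value in the category. A minimal sketch, assuming a 0-10 editorial rating; the product names, prices, and ratings below (other than the Razer Blade 15's name) are illustrative placeholders, not real data:

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float    # retail price in USD
    quality: float  # overall editorial rating, 0-10

def value_scores(products):
    """Rate each product's quality relative to its price, normalized so the
    best value in the category scores 1.0 and others fall below it."""
    best = max(p.quality / p.price for p in products)
    return {p.name: round((p.quality / p.price) / best, 2) for p in products}

# Hypothetical category data for illustration only
category = [
    Product("Razer Blade 15", 1999.0, 9.1),
    Product("Budget Laptop X", 649.0, 7.8),
    Product("Mid-range Laptop Y", 1099.0, 8.5),
]
scores = value_scores(category)
# The cheaper, decently rated product tops the value scale even though
# the premium product has the higher absolute rating.
```

A score like this could drive the "best budget" / "best splurge" style context mentioned later in the Performance Snapshot, though the real module would weigh more signals than a single ratio.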

Design Solutions  

Crazy 8s (Design Studio) Sketching Session w/ Design Team and Stakeholders

After I shared the usability test results, I facilitated a session during which each stakeholder had an opportunity to sketch eight different design solutions and then vote on the designs that resonated with them the most. This gave each department an opportunity to participate in the design process and voice their ideas, concerns, and priorities.


Wireframing Solutions

Once my fellow designers and I aligned on the research findings and on stakeholder sentiment about how we might approach certain challenges and opportunities, we moved into the wireframing phase. At this point, I collaborated closely with our UI designers to ideate on the different approaches we could take to the design and technical challenges we faced, grounding our solutions in the usability test findings and analytics data to enhance the Best Picks user experience.

Some of the agreed upon enhancements included the following: 

  • Add a more robust summary section that highlights pros/cons, a verdict, and, in some cases, who the product is most suitable for

  • Surface customer reviews and peer publication ratings

  • Include alternative retailer prices alongside our traditional buy buttons that highlight recommended retailers and display how many users chose each one (social proof)

  • Visualize the price:quality ratio and provide users with an opportunity to easily identify alternatives based on overall performance or on the metrics that matter most to them

  • Provide contextual recommendations based on the section of the article you are engaging with

Performance Snapshot

This performance module serves to provide users with digestible summary level data at a glance while offering them the ability to drill down into the granular details they care most about. Additionally, instead of simply providing product specs, we attempted to contextualize that data in reference to other top performers (best budget, best splurge, best for students, etc.) to aid discoverability and hopefully drive traffic to other reviews.

Performance Snapshot v2

This performance module provides social proof, the ability to see how the product performs across a wide variety of sub-metrics, and a price to quality rating. Users should be able to quickly assess whether this product is potentially a good fit and identify the sections of the review they might want to engage with. Additionally, a table of contents provides an easy way to navigate longer reviews and set expectations for what content can be found on the page.

Visualizing Testing Data

Within certain sections of the reviews, visualized testing data modules provide a way for more tech-savvy users to assess the product on their own and discover alternatives that might have performed better.

Social Proof: Peer Publication

A peer reviews module was included in order to meet users' need for social proof. Across multiple studies, we have validated that users consult multiple review sites to ensure a product is highly rated by more than one source. However, we also noted that they read only a small number of in-depth reviews and then validate that the ratings match up with other sources.

Social Proof: Customer Reviews

Featuring customer reviews could enhance trust and streamline the decision-making process. In addition to complementing the expert review, users would be able to discover granular product and usability details, which was cited as one of the main reasons users seek out reviews.

Competition Module

The goal of this module was to take a more thoughtful approach to instances in which the editor is recommending a competitor product.

Section Specific Recommendation

Our studies concluded that users validate most products against the 2-3 metrics most important to them. In instances where the reviewed product did not perform well on one of these metrics, we created this module to allow editors to occasionally highlight a better alternative for users who highly value that attribute.

Section Specific Recommendation v2

An alternative layout exploring the same section-specific recommendation concept.

Price Comparison Module

Displaying multiple prices enhanced trust and increased the likelihood that participants would interact with the module. Additionally, we added a social proof dimension by highlighting how many users clicked or "picked" each retailer.

Results and Retrospective

Fortunately, due to the highly collaborative nature of our process, there was almost no pushback during design review with product stakeholders when we presented mockups for the new product review template. The new template provides editors with robust new tools to showcase the products they review, gives users access to nearly all the data they seek out during their decision-making process, and is flexible enough to transition our current product reviews with little to no modification. However, due to FUTURE's acquisition of Purch, most active projects have been put on hold until further notice.