Case Study: CR Ratings Proof of Concept
Consumer Reports (CR) is a non-profit organization committed to protecting consumers by advocating for a fairer marketplace. CR achieves this through rigorous testing of consumer products and cars, publishing its ratings monthly. Our team was tasked with re-envisioning CR's rating charts to be customizable and more user-friendly across all digital platforms, with an emphasis on page speed and a mobile-first approach. Additionally, we faced the challenge of rearchitecting the backend codebase from the ground up to use more flexible components and maximize page speed.
The problem we faced was modernizing our ratings product and making the users' research journey relatable to their lifestyle, all within an organization that relies heavily on the scientific method, fact-checking, and an overly cautious approach to change. As the lead UX designer on the team, I worked in lockstep with our consumer insights group to synthesize research findings into a more modern, practical, and insightful user experience. Working with our product owners, QA analysts, and development team, we set about creating a proof of concept to run live tests against and measure the impact the new experience had on our core users.
Primary Users
4.5 million subscribers aged 28-65
CR customer service reps
AskCR reps
Prototypes for User Testing
V1 Save products concept: View Desktop Prototype
V2 Compare products: View Desktop Prototype, View Mobile Prototype
Ratings Charts Pre-Redesign
Ratings Charts Post-Redesign
The Challenge
After conducting usability interviews and analyzing user feedback, it became clear that the CR rating charts were confusing for users to scan and comprehend at a glance. An arcane system of symbols, combined with dense tables of data designed for the printed page, left users overwhelmed by CR's core digital product.
Users also found it challenging to understand the filters available to them and were unable to quickly narrow down the product list to fit their needs. Our testing engineers had created the filters using data from their test protocols; this left users confused and frustrated.
Key Tests:
Icon ratings vs. bar ratings
Rating scale variations: 1-5, 1-10, 1-100
Personalized filters
Full view vs. compact view
CR Ratings Visualization
Consumer Reports has relied on a system of icons to communicate its ratings for over 40 years. These icons were crucial to the magazine's ability to deliver a large amount of data within the fixed space of the printed page. As CR's web presence developed, these icons were adopted as-is for the website, without any consideration of how users would interact with them on screen.
We conducted user interviews as well as usability tests on our digital rating charts and found that a majority of respondents had difficulty understanding the ratings at a glance or were confused by the icon system altogether. The rating icons also presented a barrier to acquiring new users, as the system felt uninviting to more modern audiences.
Rating scales
After testing several options for presenting our rating data, we decided to move forward with bar charts in place of our old icon-based system. In addition to the new visual presentation of CR's ratings, we tested several approaches to the scale on which we rate products with both subscribers and non-subscribers.
A majority of respondents preferred the 1-100 scale, as it provided a higher level of granularity for comparing rated products. Ultimately, we had to roll the rating scale back to 1-10, as internal stakeholders raised concerns about applying a 1-100 scale consistently across all 126 product categories.
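To illustrate the trade-off, here is a minimal sketch of how an internal 0-100 test score might map to the displayed 1-10 scale. The function name, score range, and rounding rule are assumptions for illustration, not CR's actual implementation.

```typescript
// Hypothetical conversion from an internal 0-100 test score to the
// displayed 1-10 scale. Rounding the raw score first keeps one decimal
// place (87 -> 8.7), retaining some of the 1-100 scale's granularity.
function toDisplayScale(rawScore: number): number {
  const clamped = Math.min(100, Math.max(0, rawScore)); // guard bad input
  return Math.round(clamped) / 10;
}

console.log(toDisplayScale(87)); // 8.7
```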
Filtering
CR's filters were historically created by our testing engineers and consisted of every testing criterion applied to a product. The problem for users was that many of these criteria were not presented in a way that made sense as digital filters. The filter criteria were a 1:1 reflection of what we tested on any given product. This approach resulted in filters that were confusing and broke from standard filtering practices across the web. Attributes like "versatility" for a TV were confusing and not very relevant to how consumers think about these products. There was also the strange inclusion of Yes, No, and Don't Care as selection options for specific features when a simple on/off checkbox would have sufficed.
After conducting user interviews to pinpoint what users look for when shopping for vacuums, combined with a competitive analysis of filtering across e-commerce sites, we redesigned our filters around a more personalized approach. The new filters let users choose what's important to them in a more intuitive way, for example, how well a vacuum cleans up pet hair or how it handles different types of flooring. In addition to these lifestyle filters, we replaced the Yes, No, Don't Care selections with a more intuitive Yes/No checkbox.
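As a rough sketch of the idea, with hypothetical names and shapes (CR's actual data model is not public): the checkbox semantics collapse the old ternary, since a checked filter means the attribute is required, and unchecked simply means "don't care".

```typescript
// Illustrative filter model; all names and shapes are assumptions.
interface LifestyleFilter {
  id: string;       // e.g. "pet-hair"
  label: string;    // consumer-facing wording, e.g. "Cleans up pet hair"
  checked: boolean; // on/off replaces the old Yes / No / Don't Care ternary
}

interface Product {
  name: string;
  attributes: Record<string, boolean>; // test results keyed by filter id
}

// Keep only products that satisfy every checked filter; unchecked
// filters ("don't care") impose no constraint at all.
function applyFilters(products: Product[], filters: LifestyleFilter[]): Product[] {
  const active = filters.filter((f) => f.checked);
  return products.filter((p) => active.every((f) => p.attributes[f.id]));
}
```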
Views
After our initial round of user testing, we presented the winning concept to our internal stakeholders. In doing so, we discovered new types of users internal to CR: our customer service agents and our AskCR concierge chat agents. Both teams deal directly with our customers. Our customer service agents help users with a range of issues, from login problems to finding ratings. Our AskCR reps are a crucial part of our new membership initiative; they provide a concierge service that helps users find the best product ratings and comparison charts based on user input. After meeting with these reps, we discovered that they rely on the Features & Specs tab that we had removed during the simplification of our ratings.
This simplification posed a new challenge: how could we deliver a more scannable, easy-to-comprehend product to our legacy users who want a high level of information, while still providing the robust level of detail that our AskCR reps need to do their jobs and support the AskCR product?
Our solution was to provide all users with three views: Compact, Full, and Model. The default would be our simplified Compact view, with an option to toggle to the more detailed Full view.
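A minimal sketch of the toggle follows, with illustrative names only (not production code); the Model view is reached by navigating to a product, while Compact and Full are toggled in place.

```typescript
// Illustrative view state; names are assumptions for this sketch.
type RatingsView = "compact" | "full" | "model";

// Default to the simplified Compact view; users who want the dense
// Features & Specs data (e.g. AskCR reps) can switch to Full.
let currentView: RatingsView = "compact";

function toggleView(): void {
  currentView = currentView === "compact" ? "full" : "compact";
}
```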
Rating Compact View
This view received an 88% customer satisfaction score, making it the overwhelming winner of our user tests. The scannable bar charts were easier to digest at a glance, and the personalized filters helped users narrow their choices faster.
Rating Full View
The “full” view was considered by most users to be denser and took more time to comprehend. It did, however, beat the older data-dense versions of CR's rating chart by 22% in user tests.
Model View
Once users navigated to the model page of a chosen product, we maintained the ability to move around in the filtered set of products from previous views by introducing a new product carousel at the top of the page.
The new carousel allowed users to remain at the more granular level of model pages but still easily compare other models in the rating set.
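One plausible way to model this, sketched with assumed names (not CR's actual implementation), is to carry the ordered ids of the filtered rating set onto the model page and let the carousel step through them.

```typescript
// Hypothetical carousel state; field names are assumptions.
interface CarouselState {
  modelIds: string[];  // ids of the filtered rating set, in chart order
  activeIndex: number; // which model the user is currently viewing
}

// Advance to the next model, wrapping around at the end of the set so
// the user never falls out of their filtered context.
function nextModel(state: CarouselState): CarouselState {
  return {
    ...state,
    activeIndex: (state.activeIndex + 1) % state.modelIds.length,
  };
}
```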
Compare Chart
The compare chart is a core feature for CR members and internal users alike. Our AskCR representatives use this page to quickly narrow choices for users during live chat sessions. Our reps will also send direct links to live chat users so they can continue comparing on their own.
The revamped compare chart allows users to compare up to five products side-by-side; the previous iteration allowed only three. We also gave users the ability to highlight fundamental differences in the tech specs of the compared products.
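The highlighting logic can be sketched roughly as follows, assuming hypothetical types and spec shapes: a spec row is flagged whenever its values are not identical across every compared product.

```typescript
// Illustrative difference detection for the compare chart; all types
// and names here are assumptions, not CR's actual code.
type SpecValue = string | number | boolean;

interface ComparedProduct {
  name: string;
  specs: Record<string, SpecValue>;
}

// Return the spec keys whose values differ across the compared set
// (assumes all products share the first product's spec keys).
function differingSpecs(products: ComparedProduct[]): string[] {
  if (products.length < 2) return [];
  const [first, ...rest] = products;
  return Object.keys(first.specs).filter((key) =>
    rest.some((p) => p.specs[key] !== first.specs[key])
  );
}
```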