8 qualitative product metrics to track how users feel about your product
Having a 360-degree view of the customer experience is crucial for product management.
For example, when your customer suddenly exits your product or reports a bug, you might develop several hypotheses about what happened. Product experience (PX) insights help you prove or disprove those hypotheses by giving you a more complete understanding of the customer's experience and their feelings about your product.
To get the deep, full context about your users’ product experience, you need to add a human element—the voice of the customer, or VoC—to your product analytics. Instead of thinking you know what your customers need, you can be sure about it by tracking and analyzing their feedback.
The result? Choosing product changes that benefit your customers (their experience and satisfaction with your product) and your business (retention and revenue) across the board.
Get the insights you need to improve PX
Gain product experience insights that help you make high-impact changes, fast.
Quantitative product metrics are important, but they need context
If you’re already tracking metrics like task time, task completion, NPS, new users, and churn rate, you’re on the right track. These quantitative metrics are essential for knowing what’s happening inside your product.
But if you want to learn why your customers use your product the way they do, these metrics aren’t enough. They lack context and customer input—and without that, it’s easy to make assumptions about how your customers feel when they use your product and what they need from it.
Here are two examples:
Problem #1: some customers don’t complete the task they set out to do (also known as jobs to be done, or JTBD), like sending an email campaign or paying a contractor’s invoice.
The assumptions:
Navigation and next steps weren’t clear.
The CTA for their next step was too low on the page.
The pages were loading too slowly.
Problem #2: your customer churn rate has increased.
The assumptions:
The customers left for a competitor with a cheaper plan.
The customers needed a specific feature that you didn’t offer.
The customers no longer need your type of product.
The challenge?
All of your assumptions might be true.
Or some.
Or none—maybe the real reasons behind customer actions are something you haven’t thought of yet.
The best way to truly connect the dots between what's happening and why it happens is to use PX insights to complement your quantitative metrics. Qualitative product metrics allow you to dig deeper into the product experience and learn how individual users feel and behave while using your product.
PX insights help you see your users’ struggles and moments of delight so you can make the most impactful product decisions.
8 PX metrics to help you improve the customer experience
PX metrics help you see important angles of your user’s journey through your site and product.
To make the following list of metrics easy to understand and implement, let's use a real-world example of a task a user wants to complete: scheduling an email campaign using an email marketing product.
1. Feedback from customers who completed a task quickly, smoothly, and as you expected
What this metric indicates:
Using the example above: these customers built and scheduled their email campaign in the shortest amount of time possible. When doing so, they followed the steps you defined in your customer journey map (or took a path nearly identical to it).
They logged into your product to create and send their campaign—and did so quickly and seamlessly. Diving into the context of these customers' sessions can help you create smooth experiences for other users, too.
How to find these customers and collect this metric:
Jump into your product analytics tool (like Google Analytics or Mixpanel) and filter sessions by users who reached the scheduled campaign confirmation page. Sort by session length to find the users who took the least time to get there.
Find the average task completion time and sort sessions from shortest to longest. Note session lengths for the shortest 20-30% of sessions.
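If your analytics tool lets you export session data, you can make this cut yourself. Here's a minimal sketch in Python (pandas), assuming a hypothetical CSV export with session_id and duration_seconds columns for sessions that reached the confirmation page; your tool's column names and export format will differ.

```python
import pandas as pd

# Hypothetical export: one row per session that reached the
# scheduled-campaign confirmation page. Column names are assumptions;
# adjust them to match your analytics tool.
sessions = pd.read_csv("completed_task_sessions.csv")  # session_id, duration_seconds

average = sessions["duration_seconds"].mean()
cutoff = sessions["duration_seconds"].quantile(0.3)  # fastest ~30% of sessions

fastest = sessions[sessions["duration_seconds"] <= cutoff].sort_values("duration_seconds")

print(f"Average task completion time: {average:.0f}s")
print(f"Fastest 30% of sessions (cutoff {cutoff:.0f}s): {len(fastest)}")
print(fastest.head(10))
```

The same calculation with quantile(0.7) and a >= comparison gives you the slowest 20-30% of sessions used in metric #2 below.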
Then, use a PX insights tool (like Hotjar 👋) to understand what these customers thought, felt, and did as they progressed through their goal in your product.
Here’s how:
Session recordings: watch how users interacted with your product before reaching their goal. Did they go through some steps faster or slower than others? How did they move their mouse and behave on each of the pages? Were there sections they focused on more, or ignored?
Survey responses: set up a survey for customers who reach the task completion page. Make it open-ended—for example, “How was your experience [completing a task] today?” Or use a combination of closed-ended and open-ended questions—for example, “On a scale from 1–5, how easy did we make it for you to [complete a task]?” followed by “What’s the main reason for your score?”
Pro tip: want to ask the right questions in your survey and need some inspiration? We’ve got your back. Check out the Hotjar question bank to find the right questions to ask users in your product experience surveys.
Choose from customer, product survey, and demographic questions—a total of 70+ survey question examples you can use to start collecting product experience insights right away. 🚀
2. Feedback from customers who took longer than expected to complete a task
What this metric indicates:
These customers have scheduled their email campaign, but for one reason or another, it took them a while to get there.
This isn't necessarily a bad sign—they may have been enjoying a cup of really good coffee or chatting with a colleague while getting things done with your product. Of course, we can’t tell you when that’s the case (Hotjar is good, but not that good).
But what PX insights can do is help you learn when and where your product blocks or slows users down when they try to complete a task.
How to find these customers and collect this metric:
Once again, head to GA or use a tool like Mixpanel and filter sessions by those that reached the task completion page.
Find the average task completion time and sort sessions from longest to shortest. Note session lengths for the longest 20-30% of sessions.
Then, use a tool like Hotjar to analyze these sessions:
Review session recordings longer than X minutes. Look for signs of confusion, frustration, or missing information, which might look like repeated clicks on a page element, jumping back and forth between two pages, or endlessly scrolling up and down on the same page. In Hotjar, you can filter by sessions including u-turns or rage clicks to easily spot signs of frustration.
Trigger surveys on pages users linger on. Place surveys on pages with an unusually long session duration, and ask open-ended questions (like “Are you finding what you need on this page?”) so your customers can tell you why they’re stuck.
Launch a Customer Effort Score (CES) survey. Let your customers score how easy or difficult it is to use your product. Set up a CES survey on pages with longer-than-average session durations and analyze responses from long sessions (one simple way to score those responses is sketched below the example).
Customer Effort Score survey example from our survey template library
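Once CES responses come in, you'll want to turn them into a number you can track. There's no single standard formula; a common approach is to average the ratings and keep an eye on the share of high-effort responses. A quick sketch, assuming a hypothetical 1-5 scale where 5 means "very easy":

```python
# Hypothetical CES responses on a 1-5 scale (5 = very easy).
# Scales and scoring conventions vary; adjust to match your survey.
responses = [5, 4, 2, 5, 3, 4, 5, 1, 4, 4]

ces = sum(responses) / len(responses)
high_effort_share = sum(r <= 2 for r in responses) / len(responses)

print(f"Average CES: {ces:.1f}")                                   # 3.7
print(f"Share of high-effort responses: {high_effort_share:.0%}")  # 20%
```

Tracked over time, a falling average or a rising share of 1s and 2s on a specific page is a strong signal to dig into the recordings for that page.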
3. Feedback from customers who didn’t complete a task
What this metric indicates:
These customers started building their email marketing campaign, but never finished. You might make assumptions about why the users didn’t complete their task: maybe they changed their minds or were confused by something in the product.
The best way to help more customers achieve their goals in your product is to understand what's stopping them in the first place.
How to find these customers and collect this metric:
Dig into your product analytics to find the most common exit pages—the last page your customers viewed before they left, abandoning their email campaign.
Use these high-exit pages as triggers and filters to collect and analyze qualitative insights. For example:
Review responses from Hotjar's Feedback widget. Exit pages are an excellent place to gather voice of the customer (VoC) feedback—and with the feedback widget, you also get to see a screenshot of what they’re struggling with on the page.
Review session recordings. Filter recordings by exit pages you uncovered in your product analytics. Which pages or sections are users dwelling on? How are they moving their mouse or scrolling on the page right before they exit? How does this complement insight from survey and feedback widget responses?
Set up an exit-intent survey. With Hotjar, you can trigger a survey before your user hits the exit button. This is a great time to ask them why they’re leaving, what they’re missing, and whether they need support from your team.
Exit intent survey example from our survey template library
4. Feedback from customers who experienced friction
What this metric indicates:
Users who experience friction or blockers while creating their email campaign can reveal issues or bugs in your product.
Some common causes of friction for them might be:
An element in the text editor looks clickable but isn’t.
There’s a broken link or button in the drag-and-drop editor.
There’s a UX issue, like an element covering important text in the email templates section.
Some pages distract users from their goal, like design templates for a different type of campaign.
Behavior like rage clicks (repeated clicking) and u-turns (clicking a link and hitting the back button soon after) can reveal opportunities for product improvement you otherwise wouldn’t spot.
How to find these customers and collect this metric:
For this PX insight, watch recordings of sessions that show signs of frustration and pair them with feedback responses:
Filter session recordings by rage clicks. Note the pages and elements where rage clicks happened.
Filter session recordings by u-turns. Take note of combinations of pages where users click, land on a new page, and quickly go back.
Trigger a survey on pages that create frustration. Explicitly ask customers what they’re looking for or struggling with on these pages. Hotjar lets you trigger a survey for users who visit a specific page or click on an element.
Review responses from Hotjar's Feedback widget. The feedback widget lets customers rate their experience on a page, click on an element they’re struggling with, and describe the struggle in their own words.
5. Feedback from customers who took an unexpected path
What this metric indicates:
These customers have scheduled and sent an email campaign with your product, but the path they took to get there was different from the one you expected and built your product around.
They may have visited several pages outside of the customer journey map, or started their journey from a page you didn't expect them to.
Dive into these product experiences to learn why some customers do this. Are they missing a piece of information, so they go searching for it? Are they confused, or trying to complete more than one task in one session?
How to find these customers and collect this metric:
Use a product analytics tool to find all sessions that ended with a task completion. Within those sessions, find users who:
Started the session from a different page than you defined in your customer journey map
Visited more pages than the average user
Make a list of outliers. These could be visits to support doc pages, blog posts, or product features that aren’t directly related to the user's task.
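If your analytics tool can export per-session page paths, a short script can flag these outliers for you. Here's a rough sketch, assuming a hypothetical CSV where each row holds a completed-task session and its ordered pages as a "/page-a > /page-b" string; the column names, separator, and expected entry page are all placeholders to adapt.

```python
import pandas as pd

# Hypothetical export: one row per completed-task session, with the ordered
# pages visited stored as a "/page-a > /page-b > ..." string.
sessions = pd.read_csv("completed_task_paths.csv")  # session_id, page_path

sessions["pages"] = sessions["page_path"].str.split(" > ")
sessions["page_count"] = sessions["pages"].str.len()

expected_entry = "/campaigns/new"          # first step in your customer journey map
avg_pages = sessions["page_count"].mean()

unexpected_start = sessions["pages"].str[0] != expected_entry
longer_than_average = sessions["page_count"] > avg_pages

outliers = sessions[unexpected_start | longer_than_average]
print(outliers[["session_id", "page_count"]])
```

From here, use these users' attributes or the outlier pages as filters in your recordings tool so you can watch exactly those journeys.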
Then, dive into a tool like Hotjar to:
Review session recordings. Follow the user from page to page to see where they hover their mouse (indicating the section they’re looking at), how they scroll, how long they linger on each step, and what happens right before they veer off the expected path.
Set up a survey on outlier pages or pages that precede them. For example, if customers interrupt an important step to open the help section, ask “Are you finding everything you need today?” before they do, and ask a follow-up question if they reply negatively.
Review feedback. Both the outlier pages and those preceding them might hold clues for what your users are missing, looking for, or experiencing as they proceed toward their goal.
6. Feedback from customers with high NPS/CSAT scores
What this metric indicates:
A high Net Promoter Score® (NPS) reflects customers who are likely to recommend your product to others. People who give you a high score—also called promoters—are your most loyal, enthusiastic customers and advocates.
High Customer Satisfaction (CSAT) scores show you short-term customer satisfaction with a recent product or customer service experience.
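For reference, both scores come down to simple arithmetic: NPS is the percentage of promoters (scores of 9-10) minus the percentage of detractors (0-6), while CSAT is usually the share of satisfied responses (for example, 4s and 5s on a 5-point scale). A quick illustration with made-up responses:

```python
# Made-up survey responses, purely to show the arithmetic.
nps_scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]   # 0-10 "how likely to recommend" scale
csat_scores = [5, 4, 4, 3, 5, 5, 2, 4]           # 1-5 satisfaction scale

promoters = sum(s >= 9 for s in nps_scores) / len(nps_scores)
detractors = sum(s <= 6 for s in nps_scores) / len(nps_scores)
nps = (promoters - detractors) * 100             # ranges from -100 to +100

csat = sum(s >= 4 for s in csat_scores) / len(csat_scores) * 100

print(f"NPS: {nps:.0f}")     # 50% promoters - 20% detractors = 30
print(f"CSAT: {csat:.0f}%")  # 6 of 8 satisfied = 75%
```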
Feedback from these customers can reveal how your product makes their life easier in the long run, as well as any small but impactful moments in your product that delighted them. With this knowledge, you can improve the product experience for even more customers.
How to find these customers and collect this metric:
Set up an NPS or CSAT survey on your website. Hotjar makes this easy with the CSAT/NPS survey template, which you can grab and send to your customers.
Or: when you install the Hotjar tracking code on your site or app, you can run the survey directly in your product instead of on an external URL. This also means you can show the survey only to users on specific pages, or to those who signed up a certain number of days ago.
From there, make sure to:
Set up follow-up questions for specific ratings. For example, if a customer gives you a 9 in your NPS survey, you can ask: “How does our product make your job easier?”
Analyze responses to follow-up questions from high scorers. By doing so, you’ll end up with a list of reasons your product makes your users happy, which can help you understand more about customer intent and goals, and can inspire your team during the product planning process.
7. Feedback from customers with low NPS/CSAT scores
What this metric indicates:
A low NPS indicates users who are unlikely to recommend your product to others; they might even actively discourage colleagues or friends from buying from you. These customers are also known as detractors.
Low CSAT scores indicate a poor experience in your product or with a customer service representative.
CSAT scores are generally high, so a dip in overall ratings can indicate a new issue you need to fix. But even with a consistently high CSAT score, you can dig into low ratings to learn how you can help specific users.
How to find these customers and collect this metric:
Set up an NPS survey or a customer satisfaction survey on your website. Use Hotjar’s CSAT/NPS survey template to speed this process up.
Then, make sure to:
Set up follow-up questions for specific ratings. For example, if a customer gives you a 3 in your NPS survey, you can ask: “We’re sorry to hear that! What makes you unhappy with [product name]?”
Analyze responses to follow-up questions from low scorers. Based on these responses, make a list of reasons your product didn’t hit the mark. Some responses might indicate these users weren’t the target customers for your product to begin with, but others will give you ideas for meaningful product improvements.
Filter session recordings by user attributes and low NPS responses. When users indicate specific frustrations, missing features, bugs, or confusing information, watch a recording of their session to see what they experienced in real time. This will help you understand and empathize with them and prioritize the right product improvements and fixes.
8. Feedback from churned customers
What this metric indicates:
Unless you explicitly ask, you can’t know why your customer stopped using your product.
As always, you can make assumptions: it’s too expensive. It’s complicated. Bulky. Missing features. Too slow… and so on.
Product analytics or customer support interactions may reveal parts of the reason for churn, but direct feedback from customers will give you a gold mine of insights and a much deeper context.
How to find these customers and collect this metric:
Set up a churn survey and send it to recently lost customers.
In our churn survey template, we give you a simple phrase to introduce the survey (shown in the image below). From there, we suggest asking your former customers to select their reason for leaving, score your product, and expand on what made them rate your product that way.
Based on responses, you can also analyze session recordings and feedback widget responses from those users and look for frustrations or difficulties they mentioned in the survey.
These unique product insights can show you what preceded the churn. If there are other customers with similar experiences in your product, you can focus on product updates that won’t just prevent their cancellation, but potentially create moments of delight for them.
How to use these metrics to improve PX
The product improvements you make will ultimately depend on what your PX insights and product analytics metrics reveal—but here are a few suggested product areas to focus on:
Product onboarding, like the number of steps, tooltips that link to useful tips or knowledge base pages, page length, and CTA placement and copy.
UX issues, like elements that look clickable but aren’t, or important text hidden by another element.
Cancellation page, where you can add information that recently churned users say they were missing while using the product.
Customer support processes, to help your support team build more empathy and understanding around commonly frustrating or lacking product experiences.
Product roadmap, like the features your customers are missing or want a better experience with.
But remember: there’s no universal product improvement handbook—your customers will tell you what they need from your product. For the best results, focus on your users as you analyze feedback and PX insights, and prioritize the changes that will improve their product experience.