Overview
An in-depth analysis of the Screen Recordings product, combining surveys, analysis of how manual tags and comments are used, and customer interviews to clarify and validate what makes recordings worth watching, and to determine how customers act once they find an insight.
In our research, we found that product teams watch recordings to identify and prioritize issues. In particular, they watch recordings after launching a new release to determine whether it's working. Acting on an insight takes a lot of manual effort, and that's only if teams actually come across recordings that are relevant to them.
Sticker labels were introduced to make it easy for users to flag relevant moments in recordings, and to power a recommendations engine that surfaces relevant recordings.
Context
Hotjar equips product teams with Product Experience Insights, showing them how users behave and what they feel strongly about, so those teams can deliver real value.
Recordings lets you watch real user sessions to see exactly what your users see, and to find and fix hidden friction and conversion blockers.
The Challenge: From Data Overload to Actionable Insights
Hotjar’s session recordings are a goldmine for user behavior insights. However, with thousands of recordings generated daily, users often struggled to identify sessions worth watching. This led to missed opportunities and underutilized data.
We needed to shift from “record everything” to “surface what matters.”

The Process: From Noise to Signals
To understand what made a session "relevant," we:
Conducted surveys and user interviews with product managers and designers to understand their goals when watching recordings.
Mapped workflows for analyzing and sharing insights.
Collaborated with the BI & Analytics team to analyze correlations between session metadata and user engagement actions like favoriting and sharing.
Validated our proxy success metrics to ensure they aligned with user perceptions of valuable recordings.
Validated early concepts through prototypes and iterative feedback.
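To make the correlation step concrete, here is a minimal sketch of the kind of analysis involved: measuring how strongly a session metadata field tracks an engagement signal such as favoriting. The field names (rage_clicks, duration_s, favorited) and the sample data are hypothetical illustrations, not Hotjar's actual schema.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric series."""
    mx, my = mean(xs), mean(ys)
    sx, sy = pstdev(xs), pstdev(ys)
    if sx == 0 or sy == 0:
        return 0.0  # no variance, correlation undefined; treat as zero
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy)

# Hypothetical session records: metadata plus a binary engagement flag.
sessions = [
    {"rage_clicks": 5, "duration_s": 320, "favorited": 1},
    {"rage_clicks": 0, "duration_s": 45,  "favorited": 0},
    {"rage_clicks": 3, "duration_s": 210, "favorited": 1},
    {"rage_clicks": 1, "duration_s": 60,  "favorited": 0},
]

for field in ("rage_clicks", "duration_s"):
    r = pearson([s[field] for s in sessions],
                [s["favorited"] for s in sessions])
    print(f"{field}: r = {r:.2f}")
```

In practice this kind of analysis ran over large BI datasets rather than a handful of rows, but the principle is the same: fields that correlate with favoriting and sharing become candidate proxy metrics for "worth watching."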

Explorations…

The Solution: A Smarter, Simpler Way to Find Insights
We introduced a lightweight but powerful system:
Stickers: A fast way for users to mark key moments in recordings
AI relevance model: Used sticker activity and behavioral cues to surface sessions likely to contain insight
Smart filters: Let users narrow down recordings by patterns and events that matter to them
It worked seamlessly with existing behavior tools, making it easier than ever to find the “aha” moments.
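As an illustration of how sticker activity and behavioral cues can be combined to rank sessions, here is a minimal weighted-scoring sketch. The weights, field names, and session data are all hypothetical, not the production relevance model.

```python
# Illustrative weights: sticker activity counts most, behavioral
# friction cues (rage clicks, u-turns) next, duration least.
WEIGHTS = {"sticker_count": 3.0, "rage_clicks": 2.0,
           "u_turns": 1.5, "duration_s": 0.01}

def relevance_score(session):
    """Weighted sum of a session's signal fields (missing fields count as 0)."""
    return sum(w * session.get(field, 0) for field, w in WEIGHTS.items())

def top_sessions(sessions, n=3):
    """Return the n sessions most likely to contain an insight."""
    return sorted(sessions, key=relevance_score, reverse=True)[:n]

sessions = [
    {"id": "a", "sticker_count": 2, "rage_clicks": 1, "u_turns": 0, "duration_s": 120},
    {"id": "b", "sticker_count": 0, "rage_clicks": 0, "u_turns": 0, "duration_s": 30},
    {"id": "c", "sticker_count": 1, "rage_clicks": 4, "u_turns": 2, "duration_s": 300},
]
print([s["id"] for s in top_sessions(sessions, n=2)])  # → ['c', 'a']
```

Smart filters fit the same shape: each filter is a predicate over session fields, applied before ranking, so users narrow the pool to patterns they care about and the model orders what remains.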


My Role: Leading Design, Research & Delivery
I led the end-to-end product design—from discovery to delivery—across:
User research & workshops with cross-functional teams
Design strategy & prototypes aligned to real user jobs
Collaboration with engineers, product leads, and analytics
Learnings
Iterate fast, learn faster – Prioritize progress over perfection. Continuous user feedback helped us improve quickly and stay user-focused.
Let data guide you – Usage data and conversations with users revealed unexpected behaviors and language, leading to more intuitive, user-aligned decisions.
Collaboration drives clarity – Cross-team alignment, especially with BI and other product areas, ensured a consistent and valuable user experience.