Design Strategy for Session Replay

Using design strategy to evolve Session Replay into a tool that surfaces meaning over noise.

Client

Amplitude, 2025

Role

Lead Designer

Key activities

Design Strategy, Vibe Coding, Prototyping, Product Design

At Amplitude, we set out to make session replays more insightful and actionable by layering meaningful signals like frustration events, network errors, and AI-generated highlights on top of traditional replay footage. Instead of watching hours of user activity, teams can now jump directly to the moments that matter.

This initiative reshaped how our users understand and act on behavioral data, transforming long, complex replays into clear, story-like narratives.

Session Replay was already a powerful tool — it let teams see user behavior in detail. But as sessions grew longer, that power turned into friction. Analysts and designers were spending too much time scrubbing through lengthy recordings, trying to find the few moments that actually explained a drop-off or frustration. The player felt dense and unintuitive, offering little guidance on where to look or what mattered most. As a result, insights took longer to surface, and adoption remained low.

The goal

Evolve Session Replay from playback to insight delivery, empowering teams to surface friction, errors, and opportunities instantly.

Before any design work began, we defined clear business outcomes to measure success. Our goals were to increase the Session Replay attach rate, raise paid organization adoption, and improve 2+ week retention.

Challenge #1

Three teams working on three different features at the same time

Inside the Session Replay team, we worked across three parallel workstreams, each tackling a core aspect of how users find and interpret insights within a replay:

WORKSTREAM #1

Frustration Analytics

Detected rage clicks and dead clicks to help teams spot user frustration instantly.

WORKSTREAM #2

Error Analytics

Surfaced failed network requests so teams could pinpoint where the experience broke due to technical issues.

WORKSTREAM #3

Jump to Key Moments

Let users open a replay directly at the relevant event from a chart, funnel, or cohort.
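To make the frustration signals concrete: rage-click detection boils down to a burst test over recent clicks — several clicks landing close together in space and time. The sketch below is illustrative only, not Amplitude's production heuristic; the thresholds (click count, time window, radius) are assumed values.

```typescript
interface Click {
  x: number; // viewport coordinates in px
  y: number;
  t: number; // timestamp in ms
}

// A burst counts as a "rage click" when minClicks+ clicks land within
// radiusPx of the burst's first click, inside windowMs.
// All three thresholds are illustrative assumptions.
function isRageClick(
  clicks: Click[],
  minClicks = 4,
  windowMs = 1000,
  radiusPx = 30
): boolean {
  for (let i = 0; i + minClicks <= clicks.length; i++) {
    const burst = clicks.slice(i, i + minClicks);
    const inWindow = burst[burst.length - 1].t - burst[0].t <= windowMs;
    const inRadius = burst.every(
      (c) => Math.hypot(c.x - burst[0].x, c.y - burst[0].y) <= radiusPx
    );
    if (inWindow && inRadius) return true;
  }
  return false;
}
```

Dead clicks follow a similar shape — a click with no resulting DOM mutation or navigation within a grace period — but depend on runtime instrumentation rather than pure geometry.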

It quickly became clear that while this setup increased speed, it also created overlap and inconsistency. No team truly owned the player, so design took the lead in defining its direction and ensuring alignment across initiatives.

THE STRATEGY

Define design patterns that unify separate feature streams and set direction for how the player evolves, not just what ships next.

I proposed a UX strategy that positioned the player as a shared, scalable system. The strategy outlined how new features should fit together — consistent interaction patterns, modular timeline components, and clear hierarchy rules to guide future releases. It helped establish a unified design direction and laid the foundation for the first Figma exploration boards, where we started to visualize how all these elements could coexist within one cohesive experience.

Challenge #2

Designing a player capable of layering rich behavioral data without overwhelming users — surfacing clear, scalable, and meaningful signals in a limited space.

To explore how signals could coexist within the limited space of the player, I tested several layout directions. Each aimed to balance clarity, precision, and scalability, finding a way to surface meaningful events without overwhelming users or compromising navigation.

The following explorations show how different approaches performed against that goal.

You can interact with the prototypes below — hover, click, and explore how the player works in real time.

For a better viewing experience, it's recommended to open this page on a wider device.

None of the approaches above felt like the right solution. Each solved something but failed to balance clarity, scalability, and simplicity.

Some made signals easy to see but difficult to navigate; others reduced clutter but lost meaning. It became clear the problem wasn’t visual styling — it was structural.

Even though these explorations weren’t the final answer, they revealed what mattered and set the foundation for the final direction.

The Final Direction

The breakthrough came from rethinking the timeline as a dynamic surface instead of a fixed-size strip. By using hover as the trigger to expand the timeline, we unlocked a large amount of real estate exactly when users needed precision, and kept it minimal when they didn’t.

In its resting state, the timeline shows clean, bold markers that make signals easy to scan. On hover, it expands into a full interaction zone, giving users ample room to target events, scrub accurately, and explore dense sequences without visual clutter.

This interaction allowed us to combine the strengths of every previous exploration — clarity, scalability, precision, and simplicity — in one elegant, behavior-driven solution.
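The mechanics behind marker placement and precise targeting reduce to a little arithmetic: project timestamps onto the timeline's width, then snap a click back to the nearest marker. The function names and the snap tolerance below are hypothetical, included only to illustrate the idea:

```typescript
// Project a session timestamp onto a pixel offset along the timeline.
function timeToX(tMs: number, durationMs: number, widthPx: number): number {
  return (tMs / durationMs) * widthPx;
}

// Snap a click to the nearest event marker, or return null if no
// marker is within tolerancePx. The tolerance is an assumed value;
// the expanded hover state effectively gives users a larger, easier
// hit area for the same markers.
function nearestMarker(
  clickX: number,
  markerTimesMs: number[],
  durationMs: number,
  widthPx: number,
  tolerancePx = 8
): number | null {
  let best: number | null = null;
  let bestDist = Infinity;
  for (const t of markerTimesMs) {
    const d = Math.abs(timeToX(t, durationMs, widthPx) - clickX);
    if (d < bestDist) {
      bestDist = d;
      best = t;
    }
  }
  return bestDist <= tolerancePx ? best : null;
}
```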

Challenge #3

Translating a complex system into a tangible, interactive experience

As the player concept matured, bringing it to life in a prototype became a challenge of its own. The experience relied heavily on micro-interactions — subtle transitions, scrubbing behaviors, and dynamic signal states that were difficult to convey with static frames or conventional prototyping tools. I needed a way to demonstrate how the player felt, not just how it looked.

Around that time, Figma released Make, and the timing couldn’t have been better. It gave me the precision and flexibility to simulate every interaction exactly as envisioned, allowing me to build a prototype that faithfully represented the player’s behavior. This not only made the concept immediately understandable to stakeholders but also saved considerable design and engineering time.

You can interact with the prototype below — hover, click, and explore how the player works in real time.

For a better viewing experience, it's recommended to open this page on a wider device.

This was the first fully functional prototype I built to bring the player concept to life.
It showcased the key interactions that defined the experience:

Play and pause the scrubber

Hover over the timeline to preview events

Use keyboard shortcuts for playback control

Show/hide events from the settings menu

Adjust playback speed

Skip 10 seconds backward or forward

Click markers to jump to specific moments
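The keyboard shortcuts reduce to a simple key-to-action mapping dispatched from a keydown handler. The bindings below are assumptions for illustration (the shipped player's actual shortcuts may differ), chosen to match the interactions listed above:

```typescript
// Illustrative shortcut map; the real player's bindings may differ.
type PlayerAction = "toggle-play" | "skip-back-10" | "skip-forward-10" | null;

function actionForKey(key: string): PlayerAction {
  switch (key) {
    case " ":          return "toggle-play";     // spacebar: play/pause
    case "ArrowLeft":  return "skip-back-10";    // skip 10s backward
    case "ArrowRight": return "skip-forward-10"; // skip 10s forward
    default:           return null;              // key not bound
  }
}
```

In the player this would be wired to `document.addEventListener("keydown", ...)`, with the null case letting unbound keys fall through to the browser.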

Beyond the Player

Unifying playback, timeline, and AI insights into one continuous experience

To make the signals truly useful, they also needed to appear in the surrounding parts of the Session Replay experience — especially in the Events Timeline, one of the player’s main side panels. Adding visual cues there was essential to strengthen the connection between what users saw on the playback scrubber and what they explored in the timeline. It ensured that signals like frustration or errors were instantly recognizable, no matter where users looked.

We also extended the experience to include AI Summaries and Highlights, which already existed as a separate tab within Session Replay. Because the player was built on a scalable UX framework, those same highlights could now appear directly in the playback view. This created a seamless bridge between AI-driven insights and real session context, turning Session Replay into a unified, signal-driven ecosystem.

The impact

Designing for scale early on paid off, enabling seamless launches and driving adoption across Amplitude’s user base

Each feature stream launched independently as its team reached readiness: Error Analytics first, then Jump to Key Moments, and finally Frustration Analytics.

Despite the staggered timeline, the player absorbed every release seamlessly, validating the scalability of the UX framework. The structure we built early on made integration effortless: new features plugged into existing patterns without rework or inconsistencies.

This same foundation later enabled the addition of AI Summary Highlights, which automatically surfaces key session insights. Its integration required almost no design or engineering effort, proving the system could evolve without added complexity — a direct outcome of designing for scale from the start.

15.7%

Attach Rate

7% → 16%

71.4%

Paid Org Adoption

22% → 75%

45.3%

2+ Week Retention

40% → 45%

Although Paid Org Adoption fell short of the initial goal, the jump we achieved demonstrated that the new player was resonating with teams, especially in complex, longer-cycle organizations. It clarified where to focus next and confirmed the strategy was directionally right.

Get in
touch

RUBENSCASTRO
