Designing for Clarity: How We Transformed Data Visualizations to Speak Human
“Why can’t I just see how my bot is performing? I don’t want to decode charts every time.”
During a feedback call, a Customer Success Manager had 12 dashboards open, charts flooding her screen, yet she was still exporting to Excel — a clear sign our tool wasn’t simplifying analysis but complicating it. This case study shows how we turned that frustration into confidence by rethinking not just how data was displayed, but how it was experienced.
Note: I’ve used randomly generated data in the screenshots to protect client confidentiality. The visuals show real features, but the content is illustrative.

THE PROBLEM
Outdated dashboards were slowing teams down instead of helping them decide.
My Role
User research and synthesis, ideation, prototyping, interaction design, usability testing
Team
Lead Product Designer
Timeline
6 months
RESEARCH
Before exploring redesigns, we audited our product’s usability and market position, setting the foundation for future decisions. We focused on:

Heuristic Evaluation
We used Nielsen’s 10 usability heuristics to spot issues like unclear terms, clutter, and missing feedback.

Competitive Benchmarking
We benchmarked tools like ThoughtSpot, Metabase, Mixpanel, Power BI, and Tableau to spot must-haves.
We then brought in voices from the field. Through 1:1 conversations with analysts and managers, we collected stories, struggles, and workarounds.
“I want to rename R1 and R2 — what do those even mean to someone outside the dev team?”
“Can I add a goal line so I know if we’ve hit 80% resolution?”
“Colour coding of values on the charts makes no sense.”
“I can never find how to rename an axis.”
“I have to hover over the chart to get insights from the report.”
“How can I visualise user traffic on a heatmap?”
“How can I edit the labels on the charts to make it easy to understand?”
“This funnel drops off sharply, but I can’t tell why — the stages don’t say much.”

Learning
Design isn’t about adding features but about removing barriers, and talking to both power and non-power users made that clear.
These exercises showed me the importance of listening to users rather than guessing their needs. Power and non-power users approached the product differently, revealing distinct challenges. Competitive benchmarks and feature requests weren’t enough — real insights came from observing actual usage and pain points. Small issues like unclear labels or missing goal lines highlighted the need for clarity and trust. Ultimately, I learned that good design starts with identifying the right problems and solving them in meaningful ways.
Targeted pain points
Turning User Pain Points into a Clear Design Direction
Limited Visualization Options
Without advanced visuals like heatmaps or geo charts, users struggle to analyze data as fully as competitors allow.
Poor UI/UX Design
Unappealing dashboards with limited chart options make insights harder to extract and reduce usability.
Inefficient Insight Generation
Unclear labels, poor colors, and missing features like goal lines make data hard to interpret quickly.
Reliance on External Tools
Non-power users frequently turn to tools like Power BI for better presentation and analysis, highlighting gaps in our platform’s capabilities.
DESIGN
The redesign made charts clearer and more intuitive with better grouping, plain-language labels, and improved contrast. Panels were simplified, and interactions streamlined for faster, smoother use.
Prioritize by Impact
We grouped user feedback into Functional Must-Haves and UX Upgrades, prioritizing features that helped users spot issues faster, benchmark visually, and tailor their storytelling, ensuring we focused on high-impact improvements first.
Introduce New Visuals
To go beyond basic charts, we added heatmaps, stacked rows, geo maps, and KPI blocks with colour alerts and goal thresholds, all optimized for low-latency performance and clearer storytelling.
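To make the goal-threshold idea concrete, here is a minimal sketch of how a KPI block with a goal line and colour alert could be described in configuration. The type and field names are hypothetical, invented for illustration rather than taken from our actual schema.

```ts
// A minimal sketch of a KPI block configuration with a goal threshold
// and a colour alert. All names here are illustrative, not a real schema.
type KpiBlockConfig = {
  metric: string;                                  // which metric the block tracks
  goal?: { value: number; label: string };         // renders as a goal line
  alertBelow?: { threshold: number; colour: "red" | "amber" | "green" };
};

// Example: the “80% resolution” goal users asked for during research.
const resolutionRate: KpiBlockConfig = {
  metric: "resolution_rate",
  goal: { value: 0.8, label: "80% resolution target" },
  alertBelow: { threshold: 0.8, colour: "amber" },
};

// A block is "on goal" when the current value meets or exceeds the goal.
function onGoal(config: KpiBlockConfig, current: number): boolean {
  return config.goal !== undefined && current >= config.goal.value;
}

console.log(onGoal(resolutionRate, 0.83)); // true
```

The point of modelling the goal as data rather than a drawing tool is that the same threshold can drive both the goal line on the chart and the colour alert on the KPI block.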



Simplify
Power users needed depth; others needed clarity. We used progressive disclosure — showing basics upfront, with advanced options (like filters, log scales, and trend lines) revealed only when needed. This kept the UI clean without limiting capability.
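As a rough sketch of the pattern (the option names are invented for illustration, not our real settings model), the panel can treat “advanced” as a property of each option and filter on it:

```ts
// A simplified sketch of progressive disclosure in a chart settings panel.
// Option names are invented for illustration.
type PanelOption = {
  id: string;
  label: string;      // plain-language label shown to the user
  advanced: boolean;  // advanced options stay hidden until requested
};

const chartOptions: PanelOption[] = [
  { id: "title", label: "Chart title", advanced: false },
  { id: "goalLine", label: "Goal line", advanced: false },
  { id: "filters", label: "Filters", advanced: true },
  { id: "logScale", label: "Logarithmic scale", advanced: true },
  { id: "trendLine", label: "Trend line", advanced: true },
];

// Basics render upfront; advanced options appear only behind a "Show more" toggle.
function visibleOptions(showAdvanced: boolean): PanelOption[] {
  return chartOptions.filter((option) => showAdvanced || !option.advanced);
}

console.log(visibleOptions(false).map((o) => o.label)); // ["Chart title", "Goal line"]
```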

Conversation Analysis
We connected insights to real conversations so teams could see exactly where the bot failed and why. By pairing full transcripts with AI analysis, users got both the story and the signal in one place.
Searchable queries, full transcripts, and AI tags are all shown side by side.
Unresolved chats are flagged instantly, making failures easy to spot.
Raw conversations build trust, while reducing screen-hopping speeds up diagnosis and decisions.

USER TESTING
We tested the redesign with power and non-power users to ensure it worked across skill levels.
We gave participants a working prototype and asked them to complete common tasks like setting goal lines, renaming axes, and exploring different configuration tabs. This helped us validate whether the new structure, labels, and interactions were intuitive.

The testing confirmed that the redesign improved clarity, reduced setup time, and made insights easier to extract. While power users appreciated advanced options, non-power users finally felt comfortable navigating and customizing dashboards without relying on exports.
What We Learned
Users easily understood the new tab structure and clearer terminology
Plain-language labels reduced confusion and boosted confidence
Advanced tasks (like split axes) still required guidance, revealing where tooltips and education were needed


IMPACT
The redesign made insights clearer, boosted user confidence, and cut analysis time.
40% reduction in setup time for configuring dashboards
30% of non-power users could interpret insights without exporting to Excel
10% increase in adoption of in-product dashboards vs. external tools
60% of participants reported higher confidence in understanding data
Future Innovations and Growth Opportunities
This redesign marked the biggest leap for Insights in years. Our aim was simple: make data easier to read, explore, and act on. Since launch, we’ve been checking in with users and tracking behavior in PostHog to see what’s working. Of course, not everything’s perfect yet—some settings are still tricky, and a few labels could be clearer. But we’re already looking ahead: adding smarter ways to surface trends, tailoring views to different roles, and using AI to answer questions with simple, clear visuals. The goal is to go beyond showing data to actually help people understand and use it with confidence.
Thanks for reading! :)