Pulse AI UX Intelligence Dashboard

Designers get dashboards built for PMs. I built one for us.

Pulse is an AI UX Intelligence Dashboard that X-rays live pages, pins AI-detected issues directly onto the UI, and tells you what to fix in language designers actually think in. It translates behavioral data into designer vocabulary: component health scores, AI diagnosis citing UX principles, and implementation-ready fixes. Pulse is a self-initiated concept — I built it because at LexisNexis I watched designers drown in analytics dashboards that spoke PM language, and I wanted to see what an alternative would look like.

Every UX analytics tool speaks the wrong language.

Hotjar shows you heatmaps but calls the problems "bounce rates." FullStory replays sessions but buries insights behind enterprise search queries. Contentsquare quantifies frustration but ties it to revenue, not to the component that's broken. They're built for product managers tracking funnels, not designers trying to figure out why the onboarding form feels wrong.

The gap isn't data. Designers are drowning in data. The gap is translation. A designer doesn't need to know that "bounce rate increased 12% on /onboarding." They need to know that users are clicking the submit button 3.2 times because the disabled state looks identical to the enabled state — and that it violates WCAG 1.4.3.

I wanted to build the analytics tool I'd actually open every morning. One that shows me the interface, shows me where it hurts, and tells me what to do about it — all in the vocabulary I already use.

Pulse dashboard with command bar and metrics overview

Designers don't think in pages. They think in components.

A button, a form, a modal, a date picker — that's the unit of work. But no analytics tool maps issues to components. They stop at page-level metrics and leave you to figure out which piece of the UI is actually responsible.

The second realization: showing charts about a UI is fundamentally less useful than showing the UI with the problems on it. A bar chart that says "Onboarding: 69/100" communicates almost nothing. An SVG wireframe of the onboarding page with red pulsing dots on the submit button and the date picker — that communicates everything in a glance.

That's the core of Pulse: make the invisible visible, at the level designers actually work.

Page X-Ray: see the shape of the problem before reading a single word.

Page X-Ray is the centerpiece. Every page in the product being analyzed gets a procedural SVG wireframe — not a screenshot, but an architectural blueprint rendered in code. Issues detected by the AI are overlaid as interactive hotspots pinned to the exact UI component where the problem lives. Toggle to heatmap mode and the wireframe lights up with a thermal visualization — red where friction concentrates, cooling to yellow at the edges.
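The hotspot overlay described above can be sketched in a few lines. This is a minimal illustration, not Pulse's actual implementation: the `Hotspot` shape, severity names, and `pulse-dot` class are assumptions for the example.

```typescript
// Hypothetical shape of one AI-detected issue pinned to the wireframe.
interface Hotspot {
  id: string;
  x: number; // position within the SVG viewBox
  y: number;
  severity: "critical" | "warning" | "info";
}

// Render the issue overlay as an SVG fragment: one dot per issue,
// pinned to the exact component where the problem lives.
function renderHotspots(hotspots: Hotspot[]): string {
  const color = { critical: "#ef4444", warning: "#f59e0b", info: "#3b82f6" };
  return hotspots
    .map(
      (h) =>
        `<circle cx="${h.x}" cy="${h.y}" r="8" fill="${color[h.severity]}" class="pulse-dot" data-issue="${h.id}"/>`
    )
    .join("\n");
}
```

Because the wireframe is code rather than a screenshot, the heatmap toggle is just a different paint pass over the same geometry.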

Pulse issues card view with severity indicators

AI diagnosis that shows its work.

Click any hotspot and you're in the Issue Deep-Dive — the AI showcase. At the top: a zoomed wireframe cropped to the problem area. Below: behavioral evidence (rage clicks, dead clicks, u-turns), then the AI diagnosis. The diagnosis doesn't just describe the problem — it names the UX principle being violated, cites the WCAG criteria if applicable, and rates its own confidence level. Recommendations are numbered, tagged by effort level, and each one is specific enough to act on.
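The Deep-Dive's anatomy maps naturally onto a typed payload. A hypothetical schema, assuming field names that are not from the real project:

```typescript
// Hypothetical schema for one Issue Deep-Dive payload.
type Effort = "low" | "medium" | "high";

interface Evidence {
  signal: "rage_click" | "dead_click" | "u_turn";
  count: number;
}

interface Recommendation {
  order: number;   // recommendations are numbered
  effort: Effort;  // tagged by effort level
  text: string;    // specific enough to act on
}

interface Diagnosis {
  summary: string;        // written in designer vocabulary
  principle: string;      // the UX principle being violated
  wcagCriterion?: string; // cited only when applicable
  confidence: number;     // 0–1: the AI rates its own confidence
  evidence: Evidence[];
  recommendations: Recommendation[];
}
```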

AI diagnosis with UX principle reference and confidence badge

From diagnostic to design co-pilot.

The deepest layer: Generate Fix. For issues with clear token-level solutions, Pulse generates implementation-ready code — current values versus recommended values, with WCAG compliance verification. Copy to Figma, export as tokens. This is where the tool crosses from "diagnostic" to "design co-pilot."
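The WCAG verification step is mechanical and worth showing. The contrast math below is the standard WCAG 2.x formula; the fix object with its token name and hex values is an invented example, not output from the real tool.

```typescript
// WCAG 2.x relative luminance of a #rrggbb color.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio, 1:1 (identical) up to 21:1 (black on white).
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// A generated fix is just current vs. recommended token values
// plus a pass/fail check against SC 1.4.3 (4.5:1 for body text).
const fix = {
  token: "--color-button-disabled-text", // hypothetical token name
  current: "#a0a0a0",
  recommended: "#6b6b6b",
  passesAA: (bg: string) => contrastRatio("#6b6b6b", bg) >= 4.5,
};
```

On a white background the current value fails AA (about 2.6:1) while the recommended value passes (about 5.3:1) — exactly the kind of before/after a designer can paste straight into a token file.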

Component Health maps every UI component in the system — 32 in total — with individual health scores, issue counts, instance counts, and 7-day trends. Sorted worst-first by default, because the components that need attention should be impossible to ignore. This is the systems thinking view — not "which page is broken" but "which building block is failing across the product."
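The grid's data model and its worst-first default can be sketched directly. Field names here are assumptions for illustration:

```typescript
// Hypothetical shape of one entry in the Component Health grid.
interface ComponentHealth {
  name: string;
  score: number;        // 0–100 health score
  issueCount: number;
  instanceCount: number;
  trend7d: number[];    // seven daily scores
}

// Worst-first by default: failing building blocks surface at the top.
function worstFirst(components: ComponentHealth[]): ComponentHealth[] {
  return [...components].sort((a, b) => a.score - b.score);
}
```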

Component health grid sorted by worst health

The X-Ray pivot changed everything.

The first version was a standard metrics layout — health score ring, trend charts, issue cards in a grid. Functional, accurate, and completely forgettable. Expert feedback was blunt: "Not bad, but not interesting. You sell it as AI-driven and I don't see anything impressive." That led to the Page X-Ray concept — show the actual UI being analyzed with issues overlaid on it. The charts got demoted to a compact secondary row. The wireframe became the hero. That single decision transformed the project from a well-built template into something that makes people stop and look.

Designer vocabulary, everywhere. Every label, every metric, every diagnosis is written in the language designers use. "Users are stuck" instead of "bounce rate increased." "This component is the problem" instead of "conversion dropped." The issues table uses "Frustration Signals" as a column header, not "Error Events." This isn't cosmetic — it determines whether a designer trusts the tool enough to use it daily.

Component-level health over page-level scores. Every major competitor tracks health at the page level. None map issues to individual components. But designers work in components. The decision to build a component health map with 32 tracked elements, each with its own score, trend, and issue list, creates a view that no existing tool offers.

Page X-Ray with interactive hotspots overlay

A concept that earned its place.

Pulse is a concept, not a shipped product. But the decisions inside it are real, and the craft is real. Here's what I built:

A component-level UX health system — 32 components tracked individually, each with its own score, trend line, and issue log. No existing commercial tool maps issues to components at this resolution.

Six procedural SVG wireframes that render in both light and dark mode through a single token system. Every color, spacing rule, and type scale is CSS variable-driven.

18 AI diagnoses written in the vocabulary designers actually use, each citing a specific UX principle (Fitts's Law, progressive disclosure, WCAG 1.4.3) with confidence ratings and effort tags.

A Page X-Ray interaction that emerged from a hard pivot away from a conventional dashboard layout — the charts got demoted, the wireframe became the hero, and the whole project stopped being forgettable.

What I'd do differently. I spent too long polishing the monitoring dashboard before accepting that the monitoring dashboard wasn't the interesting part. Expert feedback — "Not bad, but not interesting" — saved the project. If I were running it again I'd start by mocking the most ambitious interaction first, not the most conventional one. The X-Ray would have been day one, not week three.

Built with: Next.js 14, Tailwind CSS v4, Recharts, Framer Motion, Claude Code. No backend, no real AI calls, but every interaction works end to end.
