<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Alex Koval</title>
    <description>The latest articles on Forem by Alex Koval (@alexxora).</description>
    <link>https://forem.com/alexxora</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3766023%2F3b3cc73b-8ac2-4336-89ab-c6056c8ea05b.jpeg</url>
      <title>Forem: Alex Koval</title>
      <link>https://forem.com/alexxora</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/alexxora"/>
    <language>en</language>
    <item>
      <title>I built an analytics tool that tells you what's broken instead of showing more charts</title>
      <dc:creator>Alex Koval</dc:creator>
      <pubDate>Thu, 19 Feb 2026 21:04:13 +0000</pubDate>
      <link>https://forem.com/alexxora/i-built-an-analytics-tool-that-tells-you-whats-broken-instead-of-showing-more-charts-2g0b</link>
      <guid>https://forem.com/alexxora/i-built-an-analytics-tool-that-tells-you-whats-broken-instead-of-showing-more-charts-2g0b</guid>
      <description>&lt;p&gt;Six months ago I had a Mixpanel setup with 140+ custom events. Nobody on the team knew what half of them tracked. The engineer who set them up had left. The PM spent two weeks building dashboards before making any decision.&lt;/p&gt;

&lt;p&gt;And after all that work, the dashboards told us &lt;em&gt;what&lt;/em&gt; happened. Never &lt;em&gt;why&lt;/em&gt;. Never &lt;em&gt;what to do next&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;So I started building something different.&lt;/p&gt;

&lt;h2&gt;
  
  
  The problem I kept running into
&lt;/h2&gt;

&lt;p&gt;Every analytics tool I've used follows the same pattern: you track events, you build funnels, you stare at charts, you try to figure out what they mean.&lt;/p&gt;

&lt;p&gt;The interpretation part — the "so what?" — is always left to you. And if you're a small team shipping fast, you don't have time to be a data analyst on top of everything else.&lt;/p&gt;

&lt;p&gt;I wanted a tool that skips the dashboard phase entirely and just says: &lt;strong&gt;"Feature X has a drop-off problem at step 3. Here's what's likely causing it."&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How it works under the hood
&lt;/h2&gt;

&lt;p&gt;The core idea is simple. Instead of tracking hundreds of discrete events and hoping someone connects the dots, we track &lt;strong&gt;behavioral sequences&lt;/strong&gt; and look for pattern changes over time.&lt;/p&gt;

&lt;p&gt;Here's the basic mental model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Traditional analytics: track everything, interpret later&lt;/span&gt;
&lt;span class="nf"&gt;track&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;button_clicked&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;button&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;signup&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nf"&gt;track&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;page_viewed&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;page&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/onboarding&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nf"&gt;track&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;feature_used&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;feature&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;import&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="c1"&gt;// ...140 more of these&lt;/span&gt;

&lt;span class="c1"&gt;// What we do: drop in one script, auto-tracking handles the rest&lt;/span&gt;
&lt;span class="c1"&gt;// &amp;lt;script src="https://xora.es/sdk.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;span class="nx"&gt;xora&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;init&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;projectId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;YOUR_PROJECT_ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;YOUR_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="c1"&gt;// Auto-tracking enabled: pageviews, clicks, forms&lt;/span&gt;
&lt;span class="c1"&gt;// The system builds behavioral sequences from there&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The SDK auto-captures session-level behavioral data — pageviews, clicks, form interactions — out of the box. On the backend, we build per-user behavioral profiles and track three things:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Time-to-value trends&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Not just "how long until first action" but how that time changes over weeks. If a user's path to value is getting longer, that's friction building up — and it usually predicts churn 3-6 weeks before it happens.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Feature breadth per session&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Healthy users explore. They touch multiple features, check settings, try integrations. Users who are about to leave narrow down to single-feature, in-and-out sessions. We track this as a rolling average and flag anomalies.&lt;/p&gt;
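&lt;p&gt;A minimal sketch of that breadth check (hypothetical names, not the real implementation): keep a rolling average of features touched per session and flag users whose recent sessions narrow well below their own baseline.&lt;/p&gt;

```javascript
// Hypothetical sketch: rolling average of distinct features touched per session,
// flagging users whose recent breadth drops well below their own baseline.
const rollingAvg = (values, window) => {
  const slice = values.slice(-window);
  return slice.reduce((a, b) => a + b, 0) / slice.length;
};

// breadthPerSession: distinct features touched in each session, oldest first.
const breadthAnomaly = (breadthPerSession, window = 3, dropRatio = 0.5) => {
  const baseline = rollingAvg(breadthPerSession.slice(0, -window), window * 2);
  const recent = rollingAvg(breadthPerSession, window);
  return recent < baseline * dropRatio; // narrowing usage = churn signal
};
```

The key design choice is comparing a user against their own history, not a global average, so a naturally light user doesn't trip the alarm.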

&lt;p&gt;&lt;strong&gt;3. Post-support behavioral change&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When a user contacts support and then &lt;em&gt;doesn't&lt;/em&gt; change their behavior afterward, that's a strong signal they've mentally checked out. We track behavior deltas within 48 hours of support interactions.&lt;/p&gt;
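&lt;p&gt;Sketched out (again with made-up helper names), the delta check is just event volume in the 48 hours before versus after the support contact:&lt;/p&gt;

```javascript
// Hypothetical sketch: compare a user's event volume in the 48 hours before
// vs after a support contact. No rebound afterward = likely checked out.
const H48 = 48 * 60 * 60 * 1000;

const postSupportDelta = (events, supportTs) => {
  const before = events.filter(e => e.ts >= supportTs - H48 && e.ts < supportTs).length;
  const after = events.filter(e => e.ts > supportTs && e.ts <= supportTs + H48).length;
  if (before === 0) return null; // no baseline to compare against
  return (after - before) / before; // e.g. -0.75 = 75% drop after support
};

const checkedOut = (events, supportTs) => {
  const delta = postSupportDelta(events, supportTs);
  return delta !== null && delta <= 0; // behavior didn't pick up after support
};
```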

&lt;h2&gt;
  
  
  The architecture (simplified)
&lt;/h2&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;┌──────────────┐     ┌──────────────┐     ┌─────────────────┐
│  JS SDK      │────▶│  Event       │────▶│  Sequence       │
│  (auto-track)│     │  Pipeline    │     │  Builder        │
└──────────────┘     └──────────────┘     └────────┬────────┘
                                                   │
                                                   ▼
                                          ┌─────────────────┐
                                          │  Pattern        │
                                          │  Analyzer       │
                                          └────────┬────────┘
                                                   │
                                                   ▼
                                          ┌─────────────────┐
                                          │  Recommendations│
                                          │  Engine         │
                                          └─────────────────┘
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;The Sequence Builder turns raw events into behavioral flows per user. The Pattern Analyzer compares current patterns against that user's historical baseline. When something shifts beyond a threshold, the Recommendations Engine generates a plain-English explanation of what changed and what likely caused it.&lt;/p&gt;
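&lt;p&gt;To make the Sequence Builder idea concrete, here's a toy version (my own simplification, not the production code): fold time-ordered events into per-user flows, then score how far a new flow drifts from a baseline one.&lt;/p&gt;

```javascript
// Hypothetical toy version of the Sequence Builder: fold time-ordered events
// into per-user flows, then score how far a new flow drifts from a baseline.
const buildSequences = (events) => {
  const byUser = new Map();
  for (const e of [...events].sort((a, b) => a.ts - b.ts)) {
    if (!byUser.has(e.userId)) byUser.set(e.userId, []);
    byUser.get(e.userId).push(e.name);
  }
  return byUser; // userId -> ['pageview:/home', 'click:import', ...]
};

// Crude pattern-shift score: share of current steps absent from the baseline.
const shiftScore = (baselineSeq, currentSeq) => {
  const base = new Set(baselineSeq);
  const shared = currentSeq.filter(s => base.has(s)).length;
  return currentSeq.length ? 1 - shared / currentSeq.length : 0;
};
```

When `shiftScore` crosses a threshold, that's the trigger for generating an explanation rather than another chart.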

&lt;p&gt;No dashboards. No funnels to build. Just alerts that say "here's what's broken and here's what to fix first."&lt;/p&gt;
&lt;h2&gt;
  
  
  What I learned building this
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Devs don't want more dashboards.&lt;/strong&gt; Every founder and dev I talked to during early user research said the same thing: "I don't have time to interpret charts." They wanted answers, not data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Auto-tracking beats manual event setup.&lt;/strong&gt; We started with a manual event tracking approach like everyone else. Adoption was terrible. Nobody wants to instrument 50 events. The moment we switched to auto-capture with a lightweight script tag, onboarding time dropped from days to minutes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Behavioral sequences &amp;gt; individual events.&lt;/strong&gt; A single event tells you almost nothing. The order and timing of events tells you everything. This was the biggest insight that shaped the product.&lt;/p&gt;
&lt;h2&gt;
  
  
  Current state
&lt;/h2&gt;

&lt;p&gt;We're in early beta with a handful of teams. Still rough around the edges. The SDK is lightweight and drops into any site with a single script tag. Integration is copy-paste:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="c"&gt;&amp;lt;!-- Add before &amp;lt;/head&amp;gt; --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;script &lt;/span&gt;&lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"https://xora.es/sdk.js"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;script&amp;gt;&lt;/span&gt;
  &lt;span class="nx"&gt;xora&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;init&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;projectId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;YOUR_PROJECT_ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;YOUR_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/script&amp;gt;&lt;/span&gt;

&lt;span class="c"&gt;&amp;lt;!-- Identify logged-in users --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;script&amp;gt;&lt;/span&gt;
  &lt;span class="nx"&gt;xora&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;identify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user_123&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;$email&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;user@example.com&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;plan&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;pro&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="c1"&gt;// Track custom events on top of auto-tracking&lt;/span&gt;
  &lt;span class="nx"&gt;xora&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;track&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;feature_used&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;export&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Auto-tracking covers pageviews, clicks, and forms out of the box. You only need &lt;code&gt;identify&lt;/code&gt; and &lt;code&gt;track&lt;/code&gt; if you want to layer on user identity and custom events.&lt;/p&gt;

&lt;p&gt;If you're building a SaaS product and tired of staring at Mixpanel dashboards without knowing what to actually do — I'd love to hear what you think. Still figuring out a lot of this in public.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What does your current analytics setup look like? And honestly, do you actually look at your dashboards?&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>startup</category>
      <category>javascript</category>
      <category>analytics</category>
    </item>
    <item>
      <title>How to detect where users drop off with 5 lines of code (no Mixpanel, no Amplitude)</title>
      <dc:creator>Alex Koval</dc:creator>
      <pubDate>Fri, 13 Feb 2026 11:00:45 +0000</pubDate>
      <link>https://forem.com/alexxora/how-to-detect-where-users-drop-off-with-5-lines-of-code-no-mixpanel-no-amplitude-39n0</link>
      <guid>https://forem.com/alexxora/how-to-detect-where-users-drop-off-with-5-lines-of-code-no-mixpanel-no-amplitude-39n0</guid>
      <description>&lt;p&gt;Most analytics tools are overkill for early-stage products. You don't need 50 dashboards. You need to know one thing: &lt;strong&gt;where are users dropping off?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I've been building an analytics tool and along the way figured out patterns that work for detecting user drop-offs early. Here's the practical breakdown.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fth80e68gh72bakpoyyfo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fth80e68gh72bakpoyyfo.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The problem with traditional analytics setup
&lt;/h2&gt;

&lt;p&gt;You install Mixpanel or Amplitude. You track 200 events. You build funnels. You stare at charts. Two weeks later you still don't know why users leave after the onboarding screen.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The issue isn't tracking. It's interpretation.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  What actually matters: the critical path
&lt;/h2&gt;

&lt;p&gt;Before you track anything, map your &lt;strong&gt;critical path&lt;/strong&gt;. That's the 3-5 actions a user MUST complete to get value from your product.&lt;/p&gt;

&lt;p&gt;For example:&lt;br&gt;
Sign up -&amp;gt; Connect data source -&amp;gt; See first insight -&amp;gt; Invite teammate&lt;/p&gt;

&lt;p&gt;That's it. Everything else is noise at this stage.&lt;/p&gt;


&lt;h2&gt;
  
  
  Lightweight tracking that tells you what's broken
&lt;/h2&gt;

&lt;p&gt;Here's a minimal approach. Track only your critical path steps with timestamps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Track critical path steps with timing&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;trackStep&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;step&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;metadata&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;payload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;step&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="na"&gt;sessionId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;getSessionId&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="na"&gt;timeSinceLastStep&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;getTimeSinceLastStep&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;metadata&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="nb"&gt;navigator&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sendBeacon&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/api/track&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;payload&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Usage&lt;/span&gt;
&lt;span class="nf"&gt;trackStep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;signup_complete&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nf"&gt;trackStep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;data_source_connected&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;source&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;postgres&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nf"&gt;trackStep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;first_insight_seen&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;insightType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;drop_off&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="nf"&gt;trackStep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;teammate_invited&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  The metric nobody tracks: time between steps
&lt;/h2&gt;

&lt;p&gt;Drop-off rate between steps is obvious. Everyone tracks that. But &lt;strong&gt;time between steps&lt;/strong&gt; is where the real signal is.&lt;/p&gt;

&lt;p&gt;If 80% of users complete Step 1 → Step 2, but it takes them &lt;strong&gt;15 minutes&lt;/strong&gt; on average when it should take 2, you have a UX problem that won't show up in a conversion funnel.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Simple analysis: find where users get stuck&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;analyzeDropoffs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;events&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;steps&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;signup&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;connect_source&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;first_insight&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;invite&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;analysis&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;steps&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;step&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;prevStepUsers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;events&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;step&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="nx"&gt;steps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;thisStepUsers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;events&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;step&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="nx"&gt;step&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;conversionRate&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;thisStepUsers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="nx"&gt;prevStepUsers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Calculate median time between steps&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;thisStepUsers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;curr&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;prevStepUsers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;p&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;sessionId&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="nx"&gt;curr&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;sessionId&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;prev&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;curr&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;timestamp&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;prev&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;timestamp&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Boolean&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;medianTime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sort&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;a&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="nx"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;a&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;floor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;times&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)];&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;transition&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;steps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]}&lt;/span&gt;&lt;span class="s2"&gt; -&amp;gt; &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;step&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;conversionRate&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;conversionRate&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toFixed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;%&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;medianTimeSeconds&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;medianTime&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toFixed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Boolean&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;analysis&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;signup -&amp;gt; connect_source:        62.3%  |  median: 340s ⚠️
connect_source -&amp;gt; first_insight: 89.1%  |  median: 12s  ✓
first_insight -&amp;gt; invite:         23.7%  |  median: 890s ⚠️
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Now you know exactly where to focus. No dashboards needed.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Three patterns that predict churn
&lt;/h2&gt;

&lt;p&gt;From working with beta users, these are the signals that consistently predict whether someone will stick around:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Time-to-first-value over 5 minutes = danger zone.&lt;/strong&gt;&lt;br&gt;
If a user can't get value from your product in the first 5 minutes, the chance they come back drops off a cliff. Measure this obsessively.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. "Browse but never create" pattern.&lt;/strong&gt;&lt;br&gt;
Users who view 10+ pages but never perform a creation action (write, build, upload, connect) are tourists. They won't convert.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Solo usage after day 3.&lt;/strong&gt;&lt;br&gt;
For any collaborative product, if a user hasn't involved someone else by day 3, retention drops dramatically. The product becomes a solo experiment that gets forgotten.&lt;/p&gt;
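&lt;p&gt;Pattern 2 is easy to check yourself. A hedged sketch (the action names are placeholders for whatever "create" means in your product):&lt;/p&gt;

```javascript
// Hypothetical sketch of the "browse but never create" check. The action
// names are placeholders; use whatever counts as creation in your product.
const CREATE_ACTIONS = new Set(['write', 'build', 'upload', 'connect']);

const isTourist = (events, minPageviews = 10) => {
  const pageviews = events.filter(e => e.type === 'pageview').length;
  const created = events.some(e => CREATE_ACTIONS.has(e.type));
  return pageviews >= minPageviews && !created; // lots of looking, no doing
};
```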




&lt;h2&gt;
  
  
  When this DIY approach breaks down
&lt;/h2&gt;

&lt;p&gt;This works great for your first 100-1000 users. After that you start needing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Automatic segmentation&lt;/strong&gt; — paid vs churned behavior comparison&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI pattern detection&lt;/strong&gt; — finding signals you didn't think to look for&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Real-time recommendations&lt;/strong&gt; — "this user is about to churn, here's why"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is exactly what I'm building with &lt;a href="https://analytics.xora.es" rel="noopener noreferrer"&gt;Xora Analytics&lt;/a&gt;. Instead of more dashboards, the AI analyzes user behavior and tells you what's broken and what to fix. &lt;strong&gt;5 lines of SDK integration.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;But honestly, start with the simple approach above. You'd be surprised how much you learn from just tracking 4-5 critical steps.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;What's your critical path? And where do users get stuck? Would love to hear what patterns others have found.&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>ai</category>
      <category>analytics</category>
      <category>startup</category>
    </item>
    <item>
      <title>I replaced 140 custom analytics events with 5 lines of code. Here's what I learned</title>
      <dc:creator>Alex Koval</dc:creator>
      <pubDate>Wed, 11 Feb 2026 09:50:42 +0000</pubDate>
      <link>https://forem.com/alexxora/i-replaced-140-custom-analytics-events-with-5-lines-of-code-heres-what-i-learned-25nd</link>
      <guid>https://forem.com/alexxora/i-replaced-140-custom-analytics-events-with-5-lines-of-code-heres-what-i-learned-25nd</guid>
      <description>&lt;p&gt;Last year I was consulting for a B2B SaaS product. They had 140 custom Mixpanel events. One hundred and forty.&lt;br&gt;
Nobody knew what half of them tracked. The engineer who set most of them up had left. The product manager was building dashboards for two weeks before making any decisions. And after all that work? The dashboards told them what happened. Not why. Not what to do next.&lt;br&gt;
That experience broke something in my brain. I couldn't stop thinking about it.&lt;br&gt;
So I quit my job and built the thing I wished existed.&lt;br&gt;
The problem nobody talks about&lt;br&gt;
Every analytics tool on the market works the same way:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You define events manually&lt;/li&gt;
&lt;li&gt;You send them to a dashboard&lt;/li&gt;
&lt;li&gt;You stare at charts&lt;/li&gt;
&lt;li&gt;You try to figure out what they mean&lt;/li&gt;
&lt;li&gt;You build another dashboard&lt;/li&gt;
&lt;li&gt;Repeat&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is insane. It's 2026. AI can generate photorealistic video and write code, but product analytics still works like it's 2015.&lt;/p&gt;

&lt;p&gt;The dirty secret of Mixpanel, Amplitude, and every other analytics tool: they show you data, not decisions. You still need a data analyst. You still need weeks to get answers. You still need engineering time to instrument events. And by the time you act on the insight, your users have already churned.&lt;/p&gt;

&lt;h2&gt;What 4,000 events per week taught me&lt;/h2&gt;

&lt;p&gt;I'm building Xora Analytics. Instead of asking teams to define 140 events, we ask them to care about 5 core metrics. The AI does the rest. Here's the architecture in plain English:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You install a lightweight JS snippet (~2 min)&lt;/li&gt;
&lt;li&gt;It captures user behavior automatically&lt;/li&gt;
&lt;li&gt;AI analyzes patterns and finds where users drop off&lt;/li&gt;
&lt;li&gt;You get recommendations, not charts&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Not "your retention dropped 12% at week 2." Instead: "Users who skip the workspace setup step churn 3x more. Here's a suggested onboarding change with predicted impact."&lt;br&gt;
That's the difference. One gives you homework. The other gives you answers.&lt;br&gt;
The 5 patterns that actually matter&lt;br&gt;
After analyzing user behavior across our beta products, I keep seeing the same 5 churn patterns. I wrote about them in detail on Medium, but here's the quick version:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The 24-Hour Ghost.&lt;/strong&gt; If a user doesn't complete a meaningful action within 24 hours, they're 3x more likely to churn in the first month. Your time-to-first-value needs to be under 10 minutes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Feature Desert.&lt;/strong&gt; Single-feature users have 34% retention at 90 days. Users with 3+ features: 89%. If someone only uses one thing, they will leave.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Week 2 Cliff.&lt;/strong&gt; Days 8-14 are where retention dies. Week 1 is exploration. Week 2 needs a habit trigger. If you don't create one, they're gone.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Champion Exit.&lt;/strong&gt; When the one person who loves your product changes roles or leaves, the whole account goes silent. Multi-user adoption is your insurance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Quiet Downgrade.&lt;/strong&gt; Usage drops 60-70% over 3 months but they stay subscribed. They're not loyal; they just forgot to cancel. A 50%+ drop in weekly usage over 4 weeks predicts cancellation within 60 days.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These aren't theoretical. They're real patterns from real products. And none of them requires 140 custom events to detect.&lt;/p&gt;

&lt;h2&gt;The technical bit&lt;/h2&gt;

&lt;p&gt;For the devs who want to know what the integration looks like:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// That's it. That's the setup.
import { Xora } from '@xora/sdk';

Xora.init({
  apiKey: 'your-key',
  autocapture: true
});

Xora.identify(userId, { plan: 'pro', role: 'admin' });
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Five lines. No event taxonomy to design. No engineering sprint to instrument. No 40-page tracking plan. The AI figures out what matters based on actual user behavior, not your assumptions about what users should be doing.&lt;/p&gt;

&lt;h2&gt;What I'd do differently if starting analytics from scratch&lt;/h2&gt;

&lt;p&gt;If you're an early-stage SaaS about to set up analytics, here's my honest advice, whatever tool you use.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't start with events. Start with questions.&lt;/strong&gt; Write down the 3 questions you need answered this month:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Where do users drop off in onboarding?&lt;/li&gt;
&lt;li&gt;Which feature correlates with retention?&lt;/li&gt;
&lt;li&gt;What's the activation threshold?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Then instrument ONLY what answers those questions. Everything else is noise.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't build dashboards nobody checks.&lt;/strong&gt; If your team doesn't look at a dashboard weekly, delete it. Dashboard proliferation is a disease. I've seen companies with 200+ dashboards and zero data-driven decisions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't separate analytics from action.&lt;/strong&gt; Knowing that "Step 3 has a 42% drop-off" is useless if it takes 2 weeks to get engineering to fix it. The insight and the action need to live close together.&lt;/p&gt;

&lt;h2&gt;Where I am now&lt;/h2&gt;

&lt;p&gt;Xora is in early beta. A handful of products send us around 4,000 events per week. It's not a lot; we're early. But the feedback is real: one team replaced their entire Mixpanel setup with our 5-minute integration and got their first actionable recommendation within 24 hours. No dashboards built. No data analyst hired. No engineering time wasted.&lt;/p&gt;

&lt;p&gt;The free tier handles up to 500K events/month: &lt;a href="https://analytics.xora.es" rel="noopener noreferrer"&gt;analytics.xora.es&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you're building a SaaS product and drowning in analytics tools that give you charts but not answers, I'd love to hear about your setup. What's working? What's broken? Drop a comment or hit me up.&lt;/p&gt;

&lt;p&gt;I'm Alex, founder of &lt;a href="https://analytics.xora.es/" rel="noopener noreferrer"&gt;Xora Analytics&lt;/a&gt;. I write about product analytics, churn patterns, and building SaaS products. Follow me here on DEV or connect on &lt;a href="https://www.linkedin.com/in/alex-koval-9659331b1/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; if you want to nerd out about this stuff.&lt;/p&gt;

</description>
      <category>analytics</category>
      <category>saas</category>
      <category>startup</category>
      <category>javascript</category>
    </item>
  </channel>
</rss>
