Data Intelligence Desk
Why are subscribers cancelling, and what can we do in the first 7 days to prevent it?
Investigation Mission
This investigation was commissioned to answer one core question with three specific sub-investigations. Every hypothesis, query, and finding maps back to these.
Subscriber churn is the single biggest drag on MRR growth. Leadership needs to know where the leaks are, why they happen, and which interventions have the highest ROI within the critical early window.
We suspect subscribers drop off early. We need to know the exact day(s) of maximum attrition so we can time interventions precisely.
When users initiate cancellation, the product presents a "reevaluate" save offer. We need to know what percentage interact, what percentage are saved, and whether the flow is worth investing in.
If engaged users churn less, we need to know if nudging content causes retention — or if committed users simply do both. This determines whether a content-push intervention is worth building.
Pipeline
6 hypotheses tested. 5 findings certified. 15 numbers verified. Full pipeline completed.
Hypothesis Ledger
| ID | Hypothesis | Verdict | Key Evidence |
|---|---|---|---|
| H1 | Most cancellers never engage with core content | Partially Confirmed | 66.5% never complete a quest; content-engaged cancel rate = 1.17% (150 / 12,819) |
| H2 | Retention cliff exists in first 3 days | Confirmed | 42.0 percentage point drop D0 to D1-3, then stable through D30 (variance 1.9 percentage points) |
| H3 | Cancellation save flow is ineffective (<10% interact) | Confirmed | 2.36% interact (387 / 16,366); net save rate 0.26% (43 / 16,366) |
| H4 | Annual subs disproportionately cancel | Rejected | Annual = 79.3% of cancels vs 80.2% of new subs (proportional) |
| H5 | Net subscriber growth has turned negative | Confirmed | 3 consecutive weeks negative; acquisition -91.2%, churn -59.9% |
| H6 | Content engagement is protective against churn | Inconclusive | 1.17% cancel rate among engaged, but no direct comparison group |
Executive Summary
Decision implication: The two highest-confidence interventions are (1) redesigning the cancellation save flow (currently near-zero effectiveness) and (2) building a D0-72h activation sequence targeting the retention cliff. Both address structural product gaps rather than user behavior.
Finding #1 -- Highest Confidence
Of 16,366 users who initiated cancellation in the past 90 days, only 387 (2.36%) interacted with the reevaluation/save offer. Of those 387, 344 (88.9%) still completed their cancellation. Net saves: 43 users out of 16,366 -- a 0.26% save rate.
Median time from cancellation_initiated to Cancellation Completed: ~31 seconds. Users are completing the entire cancellation flow — including the save offer — in half a minute. The reevaluation step presents zero meaningful friction. Users either don't see it or dismiss it instantly.
payment_frequency is not populated on cancellation events. Acquisition channel: [AppsFlyer] channel is empty for 99.6% of users. Plan tier: gp:plan is entirely empty. We cannot rule these out as confounders — they need better instrumentation.
Verdict: No confounders detected among testable variables. The save flow is broken across every geography and for all self-initiated cancellations. Two potential confounders (subscription type, acquisition channel) could not be tested due to missing instrumentation — these are flagged as data gaps, not evidence against the finding.
Redesign the cancellation save experience. Make the reevaluation step unmissable. Personalize the save offer. Test alternative offers: subscription pause, plan downgrade, curated content recommendations. At 0.26%, the current save rate is near-zero — any measurable improvement represents recovered subscribers. Run an A/B test to establish the achievable rate before projecting impact.
Finding #2 -- High Confidence
Bracket retention analysis of 38,176 new subscribers shows a dramatic 42 percentage point drop in the first 1-3 days after subscription start. Retention then stabilizes: D3-7 (59.4%), D7-14 (60.1%), D14-30 (61.3%) -- a total range of just 1.9 percentage points over the remaining 27 days.
| Bracket | Retained | Of | Rate | Drop from D0 |
|---|---|---|---|---|
| Day 0 | 38,176 | 38,176 | 100.0% | -- |
| D0-1 | 38,176 | 38,176 | 100.0% | — |
| D1-3 | 21,998 | 37,903 | 58.0% | -42.0% |
| D3-7 | 22,179 | 37,318 | 59.4% | -40.6% |
| D7-14 | 21,840 | 36,312 | 60.1% | -39.9% |
| D14-30 | 20,315 | 33,162 | 61.3% | -38.7% |
payment_frequency not populated. Acquisition channel: empty for 99.6% of users. Annual subscribers may have lower cliff severity (higher commitment), but we cannot verify.
Verdict: No confounders detected across 4 tested variables. The D1-3 cliff (42 percentage point drop) exists on every platform, in every geography, and is nearly identical for US vs Non-US users. The cliff is structural, not driven by any measured third variable.
Users who are active in the D1-3 window have 58-61% retention through D30. D1-3 activity is the single strongest early predictor of subscriber health identified in this investigation.
Build a "first 72 hours" activation intervention: guided content start, push notifications, personalized quest recommendation. Target: drive users to their first quest_asset_completed within 72 hours. If 10% of cliff-dropoffs are recovered, that is approximately 535 additional retained subscribers per month. Revenue impact requires LTV data (not measured).
Finding #3 -- Moderate Confidence
The last 3 consecutive weeks show net subscriber loss. New subscriptions fell from 13,707 (week of Nov 24) to 1,203 (week of Feb 16) -- a 91.2% decline. Cancellations declined from 3,894 to 1,562 (59.9% decline). Acquisition is collapsing faster than churn is declining.
| Week | Subs Started | Cancels | Net |
|---|---|---|---|
| Nov 24 | 13,707 | 3,894 | +9,813 |
| Dec 1 | 7,875 | 2,940 | +4,935 |
| Dec 8 | 2,095 | 2,397 | -302 |
| Dec 15 | 1,486 | 2,210 | -724 |
| Dec 22 | 1,671 | 1,524 | +147 |
| Dec 29 | 4,013 | 2,166 | +1,847 |
| Jan 5 | 2,691 | 2,026 | +665 |
| Jan 12 | 2,150 | 1,964 | +186 |
| Jan 19 | 1,898 | 1,712 | +186 |
| Jan 26 | 1,834 | 1,612 | +222 |
| Feb 2 | 1,354 | 1,679 | -325 |
| Feb 9 | 1,193 | 1,650 | -457 |
| Feb 16 | 1,203 | 1,562 | -359 |
Seasonal confound: The Nov 24 spike (13,707) is almost certainly a Black Friday / holiday promotion. Measuring "91.2% decline" from a promotional peak is misleading. The true organic baseline may be 1,200-2,000 per week (Jan-Feb average), making the current rate normal rather than alarming. Year-over-year data is required to determine if this is a crisis or seasonal normalization.
This finding requires investigation, not intervention. Pull year-over-year acquisition data to establish the true baseline. Audit paid vs organic channel performance. If the Feb rate (~1,200/week) is the organic baseline and churn holds at ~1,600/week, the subscriber base will structurally shrink without sustained paid acquisition.
Finding #4 -- Requires Experiment
Of 38,330 new subscribers in 90 days, only 12,819 (33.5%) completed at least one quest asset within 30 days. Among content-engaged subscribers, the 30-day cancel rate is 1.17% (150 / 12,819). The remaining 66.5% of subscribers had zero quest completions.
Verdict: CONFOUNDED. Two genuine confounders detected (geography, platform). The content→retention relationship cannot be claimed as causal from observational data. The next step is to determine where the relationship holds: create Amplitude cohorts for (a) US + content-engaged vs US + non-engaged, (b) same for Non-US, and (c) same by platform. If the relationship survives within each stratum, it strengthens the causal case.
Do NOT redesign content strategy based on this finding alone. Run a controlled A/B experiment: randomize new subscribers into an aggressive content nudge group vs control. Alternatively, create the multivariate cohorts described above to see if content→retention holds within US users and within each platform. Only if it survives within-stratum analysis should you invest in content-driven retention interventions.
Rejected Hypothesis
Annual subscribers account for 79.3% of cancellations (38,588 / 48,678) and 80.2% of new subscriptions (51,709 / 64,473). The cancellation share is proportional to the subscriber mix. Monthly subscribers are slightly overrepresented in cancellations (19.2% vs 17.3% of new subs).
Caveat: This comparison uses 90-day new subscriptions as a proxy for the subscriber base. The actual subscriber base accumulated over years and may have a different plan mix. Finding is directionally valid but denominator is imprecise.
Context: Geographic Distribution
| Country | Cancellations (90d) | Share |
|---|---|---|
| United States | 16,598 | ~51% |
| United Kingdom | 4,032 | ~12% |
| Canada | 3,386 | ~10% |
| Australia | 2,559 | ~8% |
| Germany | 1,401 | ~4% |
| Mexico | 1,310 | ~4% |
| Spain | 857 | ~3% |
| India | 842 | ~3% |
| Colombia | 749 | ~2% |
English-speaking countries (US, UK, CA, AU) account for approximately 82% of all cancellations. Without subscriber base proportions by country, it is not possible to determine if any geography has a disproportionately high churn rate.
What To Do Next
Ranked by confidence level and expected impact. Actions 1-2 are ready to execute. Actions 3-4 require prerequisite investigation.
quest_asset_completed event within 72h as leading indicator
Sequencing note: Actions 1 & 2 can run in parallel immediately. Action 3 should start now but is informational (no product change). Action 4 depends on Action 2's results — if the activation nudge test (Action 2) shows causal impact on retention, Action 4 becomes lower priority.
Verification Ledger (6-Category Protocol)
Category 0 (Source Provenance) runs first. Arithmetic is meaningless on unverified inputs.
The top 3 numbers leadership would quote were re-queried in Amplitude on Feb 23, 2026 to confirm data hadn't shifted.
| Claim | Original Query | Re-Query editId | Result |
|---|---|---|---|
| 33.5% content engagement (12,819 / 38,330) | H1-H6: Activation Funnel | 020xcbr6 |
CONFIRMED -- 38,330 / 12,819 / 150 all match |
| 0.26% save rate (43 / 16,366) | H3: Cancellation Save Funnel | 9orwx9k8 |
CONFIRMED -- 16,366 / 387 / 344 all match |
| 91.2% acquisition decline (13,707 to 1,203) | H5: Net Subscriber Flow | iqnp9fi4 |
CONFIRMED -- all 26 weekly data points match |
F001 -- Content Engagement Gap
F002 -- Retention Cliff
F003 -- Save Flow Failure
F004 -- Payment Frequency
F005 -- Net Subscriber Flow
F006 -- Geographic Distribution
Verification Summary
Source Provenance: 15/15 core numbers traced to raw Amplitude query outputs. 3 critical claims independently re-queried and confirmed.
Arithmetic: 9/9 derived calculations verified. 2 within rounding tolerance (0.1 percentage points).
Consistency: 4/4 cross-checks passed. 154-user difference between funnel and retention cohort explained by query structure.
Fabrication: 0 fabricated inputs. No revenue or LTV estimates. 1 approximate figure noted (~82% geography).
Discrepancies found: 2 minor -- F004 total off by 4 events (0.008%), F001 engagement rate 33.4% vs 33.5% (0.1 percentage points rounding).
Double-counting risk: 1 overlap flagged (F001 and F003 may share users in cancellation pipeline). Recommendations should not sum their impacts.
Query Log
| # | Query Name | Type | Hypothesis |
|---|---|---|---|
| 1 | H1-H6: Activation Funnel (Sub > Quest > Cancel, 30d) | Funnels | H1, H6 |
| 2 | H4-denominator: New Subs by Payment Frequency | Segmentation | H4 |
| 3 | H1-context: Activation Funnel (Sub > Quest > RENEWAL, 30d) | Funnels | H1 |
| 4 | H2-context: Cancellation by Country | Segmentation | Context |
| 5 | H5: Net Subscriber Flow (Weekly, 90d) | Segmentation | H5 |
| 6 | H3: Cancellation Funnel (Initiated > Reevaluated > Completed) | Funnels | H3 |
| 7 | H2: Subscriber Retention Brackets (After Sub Started) | Retention | H2 |
| 8 | Event Validation Search (8 events) | Search | All |
| 9 | Context: Amplitude Project/Org Info | Search | All |
Prevention Rules Applied
| Rule | Description | Applied In |
|---|---|---|
| RULE-001 | Show intermediate steps for sums of >5 values | F004, F005 calculations |
| RULE-002 | Every rate shows X / Y = Z% with labeled denominator | All findings |
| RULE-003 | Use actual ranges, not aspirational rounding | Retention D3-D30 range |
| RULE-004 | Population in wording must match denominator | All rate claims |
| RULE-005 | Min/max computed from data, not memory | All ranges |