Updated · 10 min read
Onboarding flows: signup to activated
Picture the week after a new user signs up. They poked around, maybe set something up, maybe not, and quietly disappeared on day three. Your onboarding sequence kept emailing them anyway — same checklist, same nudge, same friendly "getting started?" — long after the decision was made. That gap, between what the user did and what your program noticed, is where most onboarding programs lose the activation game. Here's the playbook to close it: define the activation event, ship a first-seven-days sequence that matches the urgency, coordinate with in-product so you stop repeating yourself, and stop the sequence the moment the job is done.

By Justin Williames
Founder, Orbit · 10+ years in lifecycle marketing
First, the question that breaks most programs: when is a user actually activated?
The discriminating power of the event is what matters. Pick a signal that splits your retention curves and commit to it.
Before anyone writes an email, define the activation event — the single specific action that empirically predicts a user will stick around. This is the step most programs skip, which is why they can't tell if the work is working. "Signed up" doesn't count. "Opened the app twice" doesn't count. A completed profile rarely counts either, because filling in a profile field is what users do when they're tidying up; it's not what users do when they've fallen in love with the product.
What an activation event is, concretely: a threshold pulled from your own data, tuned so users who cross it retain at two to three times the rate of users who don't. In a messaging product it might be a count of messages sent in the first month. In a subscription product, reaching a specific feature or workflow. In a marketplace, a second transaction. The number itself doesn't matter — the discriminating power does. Pick something behavioural (the user did a thing), measurable (you can count it per user), and predictive (retention curves visibly split either side of it).
How do you find it? Pull retention curves — the line showing what percentage of a signup cohort is still active each week — for users who took various actions in week one. Find the action where the line for users-who-did stays well above the line for users-who-didn't, and stays there. That divergence is your signal. It will almost always be more specific than "logged in twice".
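The retention-curve comparison above can be sketched in a few lines. This is an illustrative Python sketch, not a reference implementation: it assumes you can export, per user, the set of weeks they were active (week 0 = signup week) and the set of users who took the candidate action in week one. All names are made up for the example.

```python
def retention_curves(users, did_action, weeks=8):
    """Weekly retention for users who did vs. didn't take a candidate
    week-one action. `users` maps user_id -> set of active week numbers
    (0 = signup week); `did_action` is the set of user_ids who took the
    action. Returns one curve per cohort as a list of fractions."""
    curves = {"did": [], "didnt": []}
    didnt = set(users) - did_action
    for week in range(weeks):
        for label, cohort in (("did", did_action), ("didnt", didnt)):
            if not cohort:
                curves[label].append(0.0)
                continue
            active = sum(1 for u in cohort if week in users[u])
            curves[label].append(active / len(cohort))
    return curves
```

A candidate action is a real activation signal when the two curves stay apart for the whole window — `curves["did"]` holding well above `curves["didnt"]` week after week, not just in week one.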
A representative activation funnel, showing the drop-off from signup down to activated:
[Figure: a typical activation funnel]
The drop-off between signup and true activation is usually larger than teams expect. Most onboarding programs spend their energy on the top of this funnel, where drop-off is already lowest. The real opportunity sits in the middle.

The first seven days are the whole game
Most activation happens in the first 72 hours or not at all. The sequence has to match that urgency. Spreading six emails across 30 days means half the audience has already made the call by the time email four lands — and the call was usually no. The 72-hour window guide has the intervention grid; for the sequence itself, here's a structure that works.
Hour 0 — Welcome with one specific action. Not "explore the product". Not "here's what we do". One clear CTA — call to action, the single button or link the email is asking the user to click — that moves them toward the activation event.
Day 1 — Value proof. Show the user what they're going to get, with a specific example. Don't ask for another action yet. Show value first, earn the next ask.
Day 2–3 — The second action. The next meaningful step, branched by whether the user completed the hour-0 action.
Day 5 — Social proof or pattern. How other users are succeeding. Works when the users are named and specific; reads as noise when it's generic.
Day 7 — Check-in. Friendly, low-pressure. This is the message that separates users still on the journey from users who quietly dropped out.
Four to six messages across seven to ten days is the sweet spot. Longer sequences hit diminishing returns hard — most activation already happened, and you're now emailing a cohort that's decided. The discipline isn't hitting every planned message. The discipline is stopping the sequence the moment activation fires.
On the content mix: early messages prompt action with minimal education. Users who just signed up have appetite, not patience. Middle-sequence messages can carry more "why this works" once the user has skin in the game. Pure education without a next step is usually the weakest email in the whole sequence.
Every message is conditional on the user NOT having activated yet. The moment the activation event fires, the sequence stops and the user moves to post-activation lifecycle — the longer-term retention program that runs once a user has stuck the landing. Continuing to send onboarding emails to activated users is how you train them to ignore everything else.
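The schedule and the stop rule together can be sketched as one function. This is a hedged illustration — the offsets and template names mirror the sequence above but are assumptions, and a real system would read activation state from your event store rather than take it as a flag.

```python
from datetime import datetime, timedelta

# Illustrative schedule: (send offset from signup, template name).
SEQUENCE = [
    (timedelta(hours=0), "welcome_one_action"),
    (timedelta(days=1),  "value_proof"),
    (timedelta(days=2),  "second_action"),
    (timedelta(days=5),  "social_proof"),
    (timedelta(days=7),  "check_in"),
]

def next_send(signed_up_at, now, sent, has_activated):
    """Return the next onboarding template due, or None.
    Every send is gated on the activation event NOT having fired;
    the moment it fires, the whole sequence stops."""
    if has_activated:
        return None  # graduate to the post-activation lifecycle
    for offset, template in SEQUENCE:
        if template in sent:
            continue
        if now >= signed_up_at + offset:
            return template
    return None
```

The point of the shape is that activation is checked on every evaluation, not once at enrollment — that is what makes the stop rule automatic rather than a manual cleanup task.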
Why the same nudge in three channels is worse than one
Email is one channel. In-product tooltips, checklist widgets, and empty-state prompts are another. Push — short notifications fired to the user's phone or desktop — is a third. If these three don't coordinate, you send the user the same reminder four times and create friction instead of progress. The single most common onboarding mistake isn't a bad email. It's sending the same messages regardless of what the user just did in-product. An email saying "haven't set up your first project?" when the user set one up yesterday is worse than no email at all — it tells them, plainly, that you're not paying attention.
Coordination looks like this. The in-product checklist owns the canonical state of what's been completed — the single source of truth every other channel reads from. The email sequence reads that same state and adapts. Push fills the gaps where email timing is wrong: overnight pushes don't work, overnight emails often do. And no channel fires its "you haven't done X yet" message if another channel just fired the same reminder within the last N hours. One nudge per topic per day, across all surfaces, is the bar.
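The "one nudge per topic per day, across all surfaces" bar reduces to a small gate that every channel consults before firing. A minimal in-memory sketch, assuming topics are shared identifiers across email, in-product, and push; a real system would persist this state per user.

```python
from datetime import datetime, timedelta

class NudgeGate:
    """Cross-channel frequency gate: at most one nudge per topic per
    window, regardless of which surface fired it. Illustrative only."""
    def __init__(self, window=timedelta(hours=24)):
        self.window = window
        self.last_fired = {}  # topic -> datetime of last nudge

    def allow(self, topic, now):
        last = self.last_fired.get(topic)
        if last is not None and now - last < self.window:
            return False  # another channel already nudged on this topic
        self.last_fired[topic] = now
        return True
```

Note the gate records the fire only when it allows it, so a blocked push does not reset the clock for tomorrow's email.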
The Orbit Multi-Channel Orchestration skill handles exactly this — channel selection, frequency governance, and the adaptive sequencing that lets onboarding programs behave like one coherent experience instead of three that happen to share a user ID.
Asking for data the right way: progressive profiling
The data you collect at signup is usually the minimum viable: email, maybe a password, maybe a name. That's right, because asking for more at signup increases drop-off. But you still need the extra data — industry, team size, use case — to personalise the rest of the lifecycle. Progressive profiling is how you get it: the practice of collecting data in small chunks across many touchpoints rather than all at once on a signup form.
The practical rule: tie each data request to a specific piece of product value the user is about to receive. "What industry are you in?" alongside a feature that genuinely changes by industry. "What's your team size?" alongside the pricing page. Never a generic profile-completion email with a form full of demographic questions. It converts poorly. It also signals to the user that you're doing data collection, not personalisation — and they can tell the difference.
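The practical rule can be encoded as a mapping from profile field to the one product moment where the answer pays off immediately. The field and context names below are hypothetical — substitute your own — but the shape is the point: you ask at most one question, and only where answering changes what the user sees next.

```python
# Hypothetical mapping: each question is asked only alongside the
# product moment where the answer immediately changes the experience.
PROFILE_PROMPTS = {
    "industry":  "industry_template_picker",  # templates differ by industry
    "team_size": "pricing_page",              # plans differ by team size
    "use_case":  "first_project_setup",       # setup flow branches on it
}

def prompt_for(context, profile):
    """Return the one missing profile field whose answer pays off in
    this context, or None. Never batch questions into a generic form."""
    for field, ctx in PROFILE_PROMPTS.items():
        if ctx == context and field not in profile:
            return field
    return None
```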
Knowing when to shut the sequence down
A sequence completes when one of three things happens. The user activates, in which case they graduate to the post-activation lifecycle. The user reaches the explicit end without activating — in which case they transition to the non-activated-user lifecycle stage, a slower-paced re-engagement program. Or the user hits a hard negative signal: marked it as spam, unsubscribed, hasn't opened anything in N days.
The third condition is the one most often missed. A user who hasn't opened any message in the sequence is telling you they don't want these emails. Keep sending and you damage sender reputation — the trust score mailbox providers (Gmail, Outlook, Yahoo) assign to your sending domain, which decides whether you land in the inbox or in spam (see the Deliverability Management skill for the mechanism) — in exchange for a tiny incremental activation rate. Cut the sequence short when the engagement signal is clearly negative. The trade is bad. Stop trading.
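The three completion conditions reduce to a small classifier evaluated before every send. A sketch under stated assumptions: the 14-day no-open threshold and 6-email sequence length are placeholders for your own values, and the input is a plain dict standing in for whatever user record your platform exposes.

```python
def sequence_state(user):
    """Classify a user against the three terminal conditions.
    `user` is an illustrative dict; the thresholds are assumptions."""
    if user["activated"]:
        return "graduate"        # move to the post-activation lifecycle
    if user["spam_complaint"] or user["unsubscribed"] \
            or user["days_since_last_open"] >= 14:
        return "suppress"        # hard negative signal: stop sending now
    if user["emails_sent"] >= 6:
        return "non_activated"   # sequence exhausted; slower re-engagement
    return "continue"
```

The ordering matters: the negative-signal check comes before the exhaustion check, so a user who went silent is suppressed even if the sequence still has messages left.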
How do you measure the whole thing beyond opens and clicks? Primary metric: activation rate within the measurement window — the percentage of signups who hit the activation event within N days. Secondary: 30-day retention of activated users versus non-activated. The onboarding programs that earn their budget improve both. They lift activation AND the downstream retention of the newly-activated cohort. Shift one without the other and you're usually just moving users into a stage they weren't ready for — which shows up as a retention dip a month later, after everyone's moved on to celebrating the activation lift.
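Both metrics are simple to compute once the activation event is instrumented. An illustrative sketch, assuming you can export days-to-activation per signup (None if it never fired) and a day-30 retention flag per user:

```python
def activation_rate(days_to_activation, window_days=7):
    """Share of a signup cohort that hit the activation event within
    the window. `days_to_activation` holds one value per signup,
    None when the event never fired."""
    if not days_to_activation:
        return 0.0
    hits = sum(1 for d in days_to_activation
               if d is not None and d <= window_days)
    return hits / len(days_to_activation)

def retention_split(users):
    """Day-30 retention for activated vs. non-activated signups.
    `users` is a list of (activated: bool, retained_day_30: bool)."""
    out = {}
    for label, flag in (("activated", True), ("non_activated", False)):
        cohort = [retained for activated, retained in users
                  if activated == flag]
        out[label] = sum(cohort) / len(cohort) if cohort else 0.0
    return out
```

Tracking the split, not just the rate, is what catches the failure mode in the paragraph above: an activation lift paired with a falling activated-cohort retention means users were pushed over the line, not pulled.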
Related guides
The welcome email sequence: the 7-day shape that actually moves new signups
Most welcome sequences over-pitch, under-onboard, and keep firing long after the user has either started using the product or wandered off. Here's the 7-day shape that gets new signups to a real first action — shorter, sharper, conditional on what they actually did — plus the stop rules that keep it from training people to ignore your email.
The first 72 hours decide who activates
Activation isn't a seven-day project. It's a 72-hour race most teams lose without noticing. Why the window is that short, what to watch inside it, and how to intervene while users are still reachable.
Abandoned cart emails: what actually works
Cart abandonment is the easiest program to get wrong because the defaults work well enough to hide the problem. Here's the structure that actually moves incremental revenue — timing, sequencing, and the discount policy most teams have backwards.
Post-purchase emails: what to send after the receipt
Post-purchase is the highest-engagement window in the entire customer relationship and most lifecycle programs spend it sending a receipt, a generic welcome, and then silence. Here's the 30-day sequence that actually earns the second purchase.
Win-back flows: 12 patterns that earn their place
Win-back is the highest-ROI program most lifecycle teams underbuild. Twelve patterns that work, when each one fits, and the sunset policy that stops the program quietly eating your sender reputation.
Transactional emails: the highest-engagement messages you ignore
Order confirmations, password resets, receipts, shipping updates. Transactional emails post open rates two to three times higher than marketing sends — and most lifecycle teams have never touched them. Effort is going to the wrong place.