In my years working with product teams, I've noticed something that separates the great from the merely good: understanding the difference between being data-informed and data-led.
It's a subtle distinction, but trust me – it can make or break your product strategy, especially when it comes to onboarding.
Understanding Goodhart's Law in onboarding metrics
Let's kick things off with a principle I wish someone had told me earlier in my career: Goodhart's Law.
Simply put, it states that “when a measure becomes a target, it ceases to be a good measure.”
So what does this actually mean for your product? Well, when you get too fixated on optimizing a specific metric, people find ways to hit that target – even if it ultimately hurts your product's success.
Here's an example: imagine you're obsessed with improving your fitness app's onboarding completion rate. To boost those numbers, you start cutting out important steps like setting fitness goals or choosing dietary preferences.
Your completion numbers look fantastic on paper, but something’s missing… Are your users actually getting value from your product? Probably not.
That's Goodhart's Law in action – you've chased the metric at the expense of what actually matters: user experience.

What happens when you focus on the wrong metric?
In the early days of Dropbox, the organization set a goal of increasing the number of users who signed up for the service.
Their strategy was pretty clever on the surface – they offered free storage space to users who referred new customers. The numbers looked amazing, referrals poured in, and signups went through the roof.
It's still noted as a top-performing referral program.
But there was a huge problem lurking beneath those impressive stats: many of these new users weren't actually using the platform.
Think about it – have you ever signed up for something just because your friend was getting a discount? Did you become an engaged user, or did you sign up and forget about it? This is exactly what happened with Dropbox – they were celebrating the wrong win.
Now, in some cases this trade-off might be acceptable, but not in Dropbox's case: despite the wildly successful referral program and the surge in signups, users simply weren't sticking around.
Eventually, Dropbox caught on to what was happening. This realization led Dropbox to re-evaluate its metrics and focus on measures of user engagement and revenue, rather than just user signups.
They rolled out features like Dropbox Paper (a collaborative document editing tool) and Dropbox Business, specifically designed to give people more reasons to engage with the product.
By pivoting from being data-led (blindly chasing signup numbers) to data-informed (using metrics as just one input alongside user needs), Dropbox managed to secure its position as a market leader.

Data-informed vs. data-led
So what's the real difference between being data-informed versus data-led?
When you're data-led, you're essentially letting the numbers call all the shots. While this seems logical (after all, numbers don't lie, right?), it often leads to short-term wins at the expense of long-term value.
Being data-informed, on the other hand, means using data as just one of several inputs in your decision-making process. You're not ignoring the numbers – you're just putting them in context with qualitative insights, user feedback, and your overall strategy.

Common pitfalls in metric selection (and how to avoid them)
After working with dozens of product teams, I've spotted some recurring traps that companies fall into when picking their metrics.
Let me share the two biggest ones you need to watch out for:
1. Misaligned metrics
The problem: You're tracking metrics that don't actually connect to your company's goals or what your users really need.
How to avoid it:
- Regularly check if your metrics still align with your company's direction – quarterly reviews work well for most teams.
- Ask yourself: “What are we really trying to achieve as a company, and does this product help get us there?”
- Consider: “Who are our actual users, and what do they need from us?”
- Use real user feedback to validate that you're measuring stuff that actually matters.
2. Limited scope of metrics
The problem: You're hyper-focused on one dimension of success while completely missing other crucial parts of the user experience.
How to avoid it:
- Mix hard numbers with qualitative feedback.
- Connect immediate wins (like activation) with what really matters long-term (like retention).
- Talk to your users through interviews and surveys.
- Ask yourself: "What important parts of the user experience aren't showing up in our dashboards?"

Selecting the right metrics for your product onboarding
When it comes specifically to product-led onboarding, choosing the right metrics isn't rocket science, but it does require a clear process:
1. Define your onboarding goals
Before you measure anything, get crystal clear on what success looks like.
Ask yourself: What are you actually trying to achieve? Is it getting users to value quickly? Helping them discover key features? Collecting useful feedback?
You might be tempted to list a dozen goals, but I strongly recommend narrowing it down to just 1-2 primary ones that match where your product is right now and what your users need.
Too many goals will just dilute your focus.
To help prioritize your goals, consider:
- Business strategy: Is your company focused on growth and activation, or on retention and expansion?
- Customer feedback: Use customer feedback and data to identify where drop-offs happen, so you can course correct.

2. Identify key actions
Once you've got your goals locked in, break down the specific actions users need to take to reach them.
For example, if your goal is reducing time-to-value for a tool like Slack, you'd want to track things like:
- Completing sign-up
- Setting up or joining a workspace
- Inviting teammates
- Sending that first message
But if you're more focused on feature discovery, you'd care more about:
- Whether users explore additional features
- Whether they customize their settings
- How they engage with advanced functionality
The key actions for your product will be unique, but they should always directly connect to your main goals.
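The key actions above form a funnel, and the core analysis is simply measuring what share of users make it through each step. Here's a minimal sketch in Python; the event names and sample data are hypothetical stand-ins for whatever your analytics export actually contains:

```python
# Minimal funnel-conversion sketch. Event names and sample data are
# hypothetical; in practice they'd come from your analytics tool's export.

FUNNEL = ["signed_up", "joined_workspace", "invited_teammate", "sent_message"]

# Each user's recorded onboarding events (illustrative data).
users = [
    {"signed_up", "joined_workspace", "invited_teammate", "sent_message"},
    {"signed_up", "joined_workspace"},
    {"signed_up"},
    {"signed_up", "joined_workspace", "sent_message"},
]

def funnel_conversion(users, steps):
    """Return the share of users who completed each step and all prior steps."""
    rates = []
    remaining = users
    for step in steps:
        remaining = [u for u in remaining if step in u]
        rates.append((step, len(remaining) / len(users)))
    return rates

for step, rate in funnel_conversion(users, FUNNEL):
    print(f"{step}: {rate:.0%}")
```

Ordering matters here: a user only counts toward a step if they also completed every step before it, which is what makes this a funnel rather than a set of independent event counts.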

3. Determine which actions to measure
Not all user actions deserve equal attention. You need to zero in on measuring the ones that directly impact your onboarding goals and user experience.
For each action you're considering, ask yourself:
- How can we reliably verify this action happened?
- What specific parameters should we use to compare performance?
- What exactly will we measure about this action?
4. Set metrics for each action
For each key action you've identified, establish specific metrics and targets. If your key action is completing a tutorial, your metric might be the percentage of users who complete the tutorial.
But what's a good target? 65%? 80%? 100%? This depends on several factors:
- What your competitors achieve
- Your historical data
- Your product's maturity
- Your revenue strategy
- The user segments you're targeting
For example, if competing apps see 40% of users completing a tutorial within the first week, you might set your benchmark to meet or exceed this rate.
I always tell product teams that benchmarks should be challenging but realistic.
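Once you've settled on a target, the check itself is trivial arithmetic. A quick sketch, using made-up figures and the 40% competitor benchmark from the example above:

```python
# Hypothetical sketch: compare a tutorial completion rate to a benchmark.
# All figures are illustrative, not real data.
completed = 312   # users who finished the tutorial
started = 740     # users who entered onboarding
benchmark = 0.40  # e.g. the competitor rate cited above

rate = completed / started
print(f"Tutorial completion: {rate:.1%} (benchmark {benchmark:.0%})")
print("Meets benchmark" if rate >= benchmark else "Below benchmark")
```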
When you're seeking leadership buy-in for your metrics strategy (which you inevitably will), it helps tremendously if you've done your homework on these factors.

5. Prioritize your metrics
With all your metrics defined, you'll likely have more than you can realistically track and analyze deeply. This is where ruthless prioritization comes in.
Rank your metrics based on:
- Their direct importance to your primary onboarding goals
- The impact they have on user experience
- How actionable the insights will be
Don't fall into the trap of tracking everything just because you can. Focus on the few metrics that will drive your decision-making.
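One lightweight way to make this ranking explicit is a weighted scoring matrix over the three criteria above. The sketch below is illustrative; the candidate metrics, scores (1-5), and weights are assumptions you'd replace with your own:

```python
# Hypothetical weighted-scoring sketch for ranking candidate metrics.
# Criteria mirror the list above; scores and weights are illustrative.

criteria_weights = {"goal_alignment": 0.5, "ux_impact": 0.3, "actionability": 0.2}

candidates = {
    "tutorial_completion_rate": {"goal_alignment": 5, "ux_impact": 4, "actionability": 4},
    "page_views":               {"goal_alignment": 2, "ux_impact": 1, "actionability": 2},
    "time_to_first_message":    {"goal_alignment": 4, "ux_impact": 5, "actionability": 3},
}

def score(metric):
    """Weighted sum of a metric's scores across all criteria."""
    return sum(candidates[metric][c] * w for c, w in criteria_weights.items())

ranked = sorted(candidates, key=score, reverse=True)
print(ranked)  # highest-priority metric first
```

The point isn't the precision of the numbers; it's forcing the team to state, in one place, why one metric outranks another.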
6. Track and analyze your metrics
Finally, implement systems to track and analyze your prioritized metrics. The right tools depend on your product's complexity and your organization's maturity.
For most products, tools like Mixpanel, Amplitude, or Google Analytics work well for tracking. You might visualize results with tools like Pendo or Heap.
In more mature organizations, you might work with data engineering teams to build custom dashboards using Tableau or Looker specifically for your onboarding metrics.
The key is using these tools to identify what's working well and what needs improvement in your onboarding process.
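Whatever tool produces your funnel numbers, "what needs improvement" usually starts with finding the step where you lose the most users. A small sketch, with step rates as illustrative placeholders for your real analytics output:

```python
# Hypothetical sketch: flag the biggest drop-off in an onboarding funnel.
# Step rates are illustrative; in practice they'd come from your analytics tool.

step_rates = [
    ("signed_up", 1.00),
    ("joined_workspace", 0.72),
    ("invited_teammate", 0.35),
    ("sent_message", 0.31),
]

# Drop between each step and the one before it.
drops = [
    (step_rates[i][0], step_rates[i - 1][1] - step_rates[i][1])
    for i in range(1, len(step_rates))
]
worst_step, worst_drop = max(drops, key=lambda d: d[1])
print(f"Largest drop-off at '{worst_step}': {worst_drop:.0%} of users lost")
```

In this made-up data, the invite step loses the most users, which is exactly the kind of signal that tells you where to dig in with qualitative research next.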
Final thoughts
By being data-informed rather than data-led, you'll build products that genuinely serve your users while still hitting your business targets. That balance is what separates the products people love from the ones they quickly abandon.
Remember, metrics should serve your product strategy, not the other way around. When you get this right, both your users and your business win.
Become product-led onboarding certified
Gain a deeper understanding of how to create successful onboarding experiences and increase engagement with Himadri.
By the end of this certification you'll be able to:
- Increase engagement. Understand the best practices to instantly hook your users on your product.
- Strategize. Build a data-backed product onboarding strategy from scratch.
- Gain buy-in. Achieve leadership support by proving the value of a strong product-led onboarding program.
What are you waiting for?