Business Cases & Product Sense

Metric Definition Framework


"How would you measure success for [feature]?" is one of the most common data science interview questions. Your ability to define the right metrics shows product intuition and analytical rigor.

The AARRR Framework (Pirate Metrics)

A classic framework for understanding user lifecycle metrics:

| Stage | Meaning | Example Metrics |
| --- | --- | --- |
| Acquisition | How users find you | Sign-ups, app downloads, landing page visitors |
| Activation | First "aha" moment | Completed onboarding, first action, profile setup |
| Retention | Users coming back | DAU/MAU ratio, 7-day retention, 30-day retention |
| Revenue | Monetization | ARPU, conversion to paid, LTV |
| Referral | Users inviting others | Invites sent, viral coefficient |

Interview application: When asked to define metrics, walk through each AARRR stage for the feature.
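The AARRR stages above can be sketched as funnel computations over an event log. This is a minimal illustration with made-up event names (`signup`, `onboarded`, `open_app`) and toy data, not any real product's schema:

```python
from datetime import date, timedelta

# Hypothetical event log: (user_id, event, date). Event names and rows
# are illustrative assumptions, not a real schema.
events = [
    ("u1", "signup", date(2024, 1, 1)),
    ("u1", "onboarded", date(2024, 1, 1)),
    ("u1", "open_app", date(2024, 1, 6)),
    ("u2", "signup", date(2024, 1, 1)),
    ("u2", "onboarded", date(2024, 1, 2)),
    ("u3", "signup", date(2024, 1, 2)),
]

# Acquisition: everyone who signed up.
signups = {u for u, e, _ in events if e == "signup"}

# Activation: signed-up users who hit the first "aha" moment.
activated = {u for u, e, _ in events if e == "onboarded"}

# Retention: users who came back within 7 days of signing up.
signup_date = {u: d for u, e, d in events if e == "signup"}
retained = {
    u for u, e, d in events
    if e == "open_app" and u in signup_date
    and timedelta(0) < d - signup_date[u] <= timedelta(days=7)
}

activation_rate = len(activated) / len(signups)  # 2 of 3 users
retention_rate = len(retained) / len(signups)    # 1 of 3 users
```

In an interview it's enough to describe these ratios verbally; the point is that each AARRR stage maps to a concrete, countable user action.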

North Star Metrics

The single metric that best captures value delivered to customers:

| Company Type | North Star Metric | Why |
| --- | --- | --- |
| Marketplace | Transactions completed | Shows both sides are happy |
| SaaS | Active usage (DAU/WAU) | Indicates ongoing value |
| Media | Time spent | Attention = value |
| E-commerce | Purchase frequency | Repeat customers = success |

Interview question: "What's the North Star metric for Spotify?"

Good answer: "I'd say it's 'time spent listening per user.' This captures both that users are finding content they want (music/podcasts) and that they're choosing Spotify over alternatives. Revenue follows from engaged users."

Leading vs Lagging Indicators

Lagging indicators: Measure outcomes (what already happened)

  • Revenue
  • Churn rate
  • Net Promoter Score

Leading indicators: Predict outcomes (early warning signals)

  • Feature adoption rate
  • Support ticket volume
  • Page load time

Interview insight: "I'd track [leading metric] because it gives us faster feedback than waiting for [lagging metric]. If [leading] drops, we can act before [lagging] is affected."
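The "act before the lagging metric is affected" idea can be expressed as a simple daily check: flag when a leading metric drops meaningfully below its baseline. The metric name, baseline, and 20% threshold below are illustrative assumptions:

```python
def leading_indicator_alert(series, baseline, drop_threshold=0.2):
    """Flag when a leading metric (e.g. feature adoption rate) falls more
    than drop_threshold below its baseline -- an early-warning check you
    can run daily, long before a lagging metric like churn moves.
    Names and thresholds are illustrative, not a standard."""
    latest = series[-1]
    drop = (baseline - latest) / baseline
    return drop > drop_threshold

adoption = [0.42, 0.41, 0.30]  # daily feature adoption rate (toy data)
print(leading_indicator_alert(adoption, baseline=0.40))  # flags the dip
```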

Counter-Metrics and Guardrails

Every metric can be gamed. Define counter-metrics to prevent bad behavior:

| Primary Metric | Potential Gaming | Counter-Metric |
| --- | --- | --- |
| Click-through rate | Clickbait headlines | Time on page after click |
| Support ticket resolution time | Closing tickets too fast | Customer satisfaction score |
| Sign-ups | Low-quality users | 7-day retention |
| Revenue | One-time discounts | LTV, repeat purchase rate |

Framework for any metric question:

  1. What are we trying to achieve? (business goal)
  2. What behavior indicates success? (user action)
  3. How could this metric be gamed? (counter-metric needed)
  4. What's the tradeoff? (guardrail metric)
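Step 3 of the framework (how could this be gamed?) reduces to a pairing rule: a primary metric's "win" is suspect when its counter-metric degrades at the same time. A minimal sketch, with deltas and the pairing invented for illustration:

```python
def gaming_check(primary_delta, counter_delta):
    """Flag a suspect 'win': the primary metric improved while its
    counter-metric got worse (e.g. clickbait lifting CTR while
    time-on-page after click falls). Deltas are fractional changes;
    this pairing logic is an illustrative assumption."""
    return primary_delta > 0 and counter_delta < 0

# CTR up 15%, time on page after click down 10% -> investigate
print(gaming_check(0.15, -0.10))  # True: likely gaming, not real value
```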

Interview Example: Measuring Instagram Reels Success

Question: "How would you measure the success of Instagram Reels?"

Strong answer structure:

"I'd approach this at multiple levels:

North Star: Time spent watching Reels (captures user value)

AARRR breakdown:

  • Acquisition: Users who discover Reels (from feed, explore, stories)
  • Activation: First Reel watched to completion
  • Retention: Return visits to Reels tab within 7 days
  • Revenue: Ad views, branded content engagement
  • Referral: Reels shared to other platforms or DMs

Counter-metrics:

  • Time spent shouldn't come from addictive dark patterns → track user sentiment/surveys
  • Completion rate shouldn't be gamed by short videos → normalize by video length

Guardrails:

  • Overall Instagram time spent (Reels shouldn't cannibalize other surfaces)
  • Creator publishing rate (ecosystem health)
  • Ad revenue per user (monetization)"
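The "normalize by video length" counter-metric can be made concrete: raw completion rate scores a fully watched 3-second clip the same as a fully watched 60-second Reel, so weight completion by length instead. This weighting scheme is an illustrative assumption, not Instagram's actual method:

```python
def length_weighted_completion(watch_s, video_s):
    """Completion weighted by video length, i.e. watch time capped at the
    video's duration. A fully watched short clip no longer scores the
    same as a fully watched long Reel. Illustrative, not a real metric."""
    completion = min(watch_s / video_s, 1.0)  # raw completion, capped at 100%
    return completion * video_s               # effective seconds watched

print(length_weighted_completion(3, 3))    # short clip: low score
print(length_weighted_completion(60, 60))  # long Reel: high score
```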

Always think about metrics as a system, not just a single number.
