Sales-led companies track pipeline, win rates, and quota attainment. Product-led companies need a fundamentally different measurement system because the product is the primary driver of acquisition, activation, and expansion.
The problem is that most PLG teams either track too few metrics (just MRR and signups) or too many (a 40-metric dashboard nobody reads). This post lays out the 12 metrics that actually matter, organized into four layers, with benchmarks and practical guidance on how to improve each one.
If you are new to the topic, a broader overview of PLG metrics is a good place to start. This post goes deeper into each metric and how to build a dashboard around them.
Why PLG needs different metrics
In a sales-led model, a human guides the buyer through evaluation, negotiation, and onboarding. The metrics reflect that: pipeline velocity, average deal size, sales cycle length.
In PLG, the product does most of that work. Users self-discover, self-evaluate, and self-onboard. The metrics need to capture how well the product performs each of those jobs:
- Acquisition: Is the product attracting the right users?
- Activation: Are new users reaching value quickly enough?
- Engagement: Are activated users building habits?
- Revenue: Is usage converting to revenue and expanding over time?
These four layers form the structure of your PLG dashboard.
Layer 1: Acquisition metrics
1. Signup rate
Definition: The percentage of website visitors who create an account.
Benchmark: 2-5% for freemium products, 8-15% for free trial products with strong intent.
How to improve:
- Reduce signup form fields to the absolute minimum (email only, or SSO)
- Show the product in action on your landing page (screenshots, interactive demos, video)
- A/B test your CTA copy. "Start free" consistently outperforms "Request a demo" for PLG
- Remove credit card requirements for trial signups
Why it matters: A low signup rate means either your messaging is off, your target audience is wrong, or your signup flow has too much friction. Fix this before optimizing anything downstream.
2. Traffic-to-signup conversion
Definition: New signups divided by unique visitors, segmented by acquisition channel.
Benchmark: Varies wildly by channel. Organic search: 2-4%. Paid ads: 3-7%. Product Hunt launch: 10-20% (short-lived).
How to improve:
- Segment by channel to find which sources bring high-intent traffic
- Create dedicated landing pages for each major acquisition channel
- Align ad copy with the actual product experience to reduce expectation gaps
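The per-channel calculation is simple division, but doing it per channel is what surfaces the high-intent sources. A minimal sketch, using made-up visit and signup counts (real numbers would come from your analytics tool):

```python
# Hypothetical per-channel traffic; channel names and counts are illustrative.
channel_stats = {
    "organic_search": {"visitors": 42_000, "signups": 1_180},
    "paid_ads":       {"visitors": 9_500,  "signups": 510},
    "product_hunt":   {"visitors": 3_200,  "signups": 470},
}

def conversion_rates(stats):
    """Signups divided by unique visitors, per channel."""
    return {
        channel: s["signups"] / s["visitors"]
        for channel, s in stats.items()
    }

# Rank channels by conversion to see where high-intent traffic comes from.
for channel, rate in sorted(conversion_rates(channel_stats).items(),
                            key=lambda kv: kv[1], reverse=True):
    print(f"{channel}: {rate:.1%}")
```

Ranking channels this way, rather than looking at a blended rate, is what tells you where to build dedicated landing pages.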
Layer 2: Activation metrics
3. Activation rate
Definition: The percentage of new signups who complete the key actions required to experience core product value.
Benchmark: 20-40% is typical. Best-in-class PLG companies achieve 40-60%.
How to improve:
- Define a clear activation event (not "logged in" but "completed first meaningful action")
- Reduce the number of steps between signup and activation
- Add onboarding checklists that guide users to activation milestones
- Use progressive disclosure: show only what the user needs now, hide advanced features
Why it matters: This is the single most important metric on this list. Users who activate retain at 3-5x the rate of users who do not. If you can only improve one number, improve this one.
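Because the benchmark is a trend, not a point value, activation rate is most useful computed per signup cohort. A minimal sketch, assuming each record is a (signup month, activated?) pair where "activated" means the user completed your defined activation event:

```python
from collections import defaultdict

# Illustrative signup records; in practice these come from your event store.
signups = [
    ("2024-01", True), ("2024-01", False), ("2024-01", True), ("2024-01", False),
    ("2024-02", True), ("2024-02", True), ("2024-02", False),
]

def activation_by_cohort(rows):
    """Activation rate per signup-month cohort."""
    totals = defaultdict(lambda: [0, 0])  # month -> [activated, total]
    for month, activated in rows:
        totals[month][1] += 1
        if activated:
            totals[month][0] += 1
    return {month: act / total for month, (act, total) in totals.items()}

print(activation_by_cohort(signups))
```

Comparing cohorts before and after an onboarding change is how you verify the change actually moved activation.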
4. Time-to-value (TTV)
Definition: The elapsed time between a user signing up and completing their activation event.
Benchmark: Under 5 minutes for simple tools. Under 24 hours for complex B2B products. Under 3 days for products requiring data imports or integrations.
How to improve:
- Pre-populate accounts with sample data so users see value before importing their own
- Offer templates, wizards, or guided setup flows
- Defer non-essential configuration (settings, billing, profile details) until after activation
- Track TTV by cohort to see if product changes are actually reducing it
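When tracking TTV, the median is usually a better summary than the mean, because a few users who activate days later can skew the average badly. A minimal sketch with illustrative (signup time, activation time) pairs:

```python
from datetime import datetime
from statistics import median

# Hypothetical signup/activation timestamp pairs (ISO 8601).
events = [
    ("2024-03-01T09:00", "2024-03-01T09:04"),
    ("2024-03-01T10:00", "2024-03-02T08:00"),
    ("2024-03-02T12:00", "2024-03-02T12:30"),
]

def median_ttv_minutes(pairs):
    """Median minutes from signup to the activation event."""
    deltas = [
        (datetime.fromisoformat(done) - datetime.fromisoformat(start))
        .total_seconds() / 60
        for start, done in pairs
    ]
    return median(deltas)

print(median_ttv_minutes(events))
```

Computed per cohort, this is the number to watch after shipping sample data, templates, or a guided setup flow.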
Layer 3: Engagement metrics
5. DAU/MAU ratio
Definition: Daily active users divided by monthly active users. A measure of how frequently users return.
Benchmark: 10-20% for most B2B SaaS. 30-50% for daily-use tools (Slack, Figma, Linear). Above 50% is exceptional.
How to improve:
- Build features that create daily usage habits (notifications, digests, dashboards)
- Identify your most engaged users and study what they do differently
- Add collaborative features that give users reasons to return (comments, shared workspaces)
Why it matters: A high DAU/MAU ratio means your product is part of the user's daily workflow. A low ratio means users check in occasionally but it is not sticky enough to drive reliable retention or expansion.
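One common way to compute the ratio is to average the daily active counts over the window and divide by the distinct users active at any point in that window. A minimal sketch with an illustrative activity log:

```python
from datetime import date

# Hypothetical (user_id, activity date) log for the measurement window.
activity = [
    ("u1", date(2024, 4, 1)), ("u1", date(2024, 4, 2)),
    ("u2", date(2024, 4, 1)),
    ("u3", date(2024, 4, 2)),
]

def dau_mau(log):
    """Average DAU over the window divided by distinct users in the window."""
    by_day = {}
    for user, day in log:
        by_day.setdefault(day, set()).add(user)
    avg_dau = sum(len(users) for users in by_day.values()) / len(by_day)
    mau = len({user for user, _ in log})
    return avg_dau / mau
```

Note this averages only over days that appear in the log; a production version would divide by the full window length so inactive days pull the ratio down.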
6. Feature adoption rate
Definition: The percentage of active users who use a specific feature within a given time period.
Benchmark: Core features should be used by 60-80% of active users. Secondary features by 20-40%. If a feature is used by less than 5% of users, question whether it should exist.
How to improve:
- Surface underused features contextually (when the user would benefit from them, not on first login)
- Add feature discovery prompts after users complete related actions
- Remove or simplify features with near-zero adoption
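Feature adoption is a per-feature share of active users. A minimal sketch, assuming you can derive the set of features each active user touched in the period (feature names here are illustrative):

```python
# Hypothetical per-user feature usage for the period.
usage = {
    "u1": {"editor", "comments", "export"},
    "u2": {"editor"},
    "u3": {"editor", "comments"},
    "u4": {"editor", "export"},
}

def adoption_rates(usage):
    """Share of active users who used each feature at least once."""
    n = len(usage)
    all_features = set().union(*usage.values())
    return {
        feature: sum(feature in feats for feats in usage.values()) / n
        for feature in all_features
    }

print(adoption_rates(usage))
```

The same `usage` structure also gives you breadth of use (metric 7): the size of each user's feature set.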
7. Breadth of use
Definition: The number of distinct features or modules used per user or account over a 30-day period.
Benchmark: Varies by product complexity. Track the trend, not the absolute number.
How to improve:
- Cross-promote related features within the product
- Create workflows that naturally lead users through multiple features
- Use onboarding to expose users to 3-4 core features, not just one
Layer 4: Revenue metrics
8. Trial-to-paid conversion
Definition: The percentage of free trial users who convert to a paid plan.
Benchmark: 2-5% for freemium models. 15-25% for time-limited free trials. Opt-out trials (credit card required upfront) convert at 40-60% but have lower signup rates.
How to improve:
- Ensure users hit activation before the trial expires
- Send trial expiration reminders with a summary of value received
- Offer annual billing at a discount to reduce decision friction
- Consider extending trials for users who are active but have not activated yet
9. Expansion revenue
Definition: Revenue generated from existing customers through upgrades, add-ons, or increased usage.
Benchmark: Expansion should represent 20-40% of new ARR for mature PLG companies.
How to improve:
- Design usage limits that align with value (when users hit limits, they have already experienced enough value to justify paying more)
- Offer team plans with per-seat pricing that grows with adoption
- Build features that unlock at higher tiers and are visible but gated in lower tiers
- Track which features drive upgrades and invest in those
10. Net revenue retention (NRR)
Definition: The percentage of revenue retained from existing customers after accounting for expansion, contraction, and churn.
Benchmark: 100% means expansion exactly offsets contraction and churn. 110-130% is strong. Best-in-class PLG companies hit 130-150%.
How to improve:
- Focus on expansion revenue (covered above)
- Reduce involuntary churn (failed payments, expired cards)
- Identify at-risk accounts early using product usage signals
- Build retention loops that keep users engaged over time
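The NRR decomposition mentioned in the definition maps directly onto a formula: start with the cohort's MRR at the beginning of the period, add expansion, subtract contraction and churn, and divide by the starting MRR. New-customer revenue is excluded by design. A minimal sketch with illustrative numbers:

```python
# Illustrative MRR movements for an existing-customer cohort over the period.
starting_mrr = 100_000
expansion    = 25_000   # upgrades, add-ons, seat growth
contraction  = 5_000    # downgrades
churn        = 8_000    # cancelled customers

nrr = (starting_mrr + expansion - contraction - churn) / starting_mrr
print(f"NRR: {nrr:.0%}")  # prints "NRR: 112%"
```

Keeping the four components separate, rather than only tracking the ratio, is what makes the quarterly NRR decomposition review possible.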
11. Revenue per user (ARPU)
Definition: Total revenue divided by total users (or accounts) in a given period.
Benchmark: Depends entirely on your market. Track the trend and segment by cohort.
How to improve:
- Identify your highest-ARPU segments and acquire more of them
- Build pricing tiers that capture willingness-to-pay across segments
- Increase feature usage breadth, which correlates with higher ARPU
12. Payback period
Definition: The number of months it takes to recover the cost of acquiring a customer.
Benchmark: Under 12 months for PLG. Under 6 months is excellent. PLG companies typically have shorter payback periods than sales-led because CAC is lower.
How to improve:
- Reduce CAC by investing in organic acquisition channels
- Increase first-year revenue through faster activation and earlier expansion
- Shorten time-to-paid by optimizing trial-to-paid conversion
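Payback period is usually computed on gross profit rather than raw revenue, since support and infrastructure costs eat into each dollar recovered. A minimal sketch with illustrative figures:

```python
# Illustrative unit economics; all three inputs are assumptions.
cac = 900            # blended cost to acquire one paying customer
monthly_arpu = 120   # average revenue per paying customer per month
gross_margin = 0.80  # share of revenue left after cost of service

payback_months = cac / (monthly_arpu * gross_margin)
print(f"Payback: {payback_months:.1f} months")  # prints "Payback: 9.4 months"
```

With these inputs, payback lands under the 12-month benchmark; lowering CAC or raising first-year revenue each shortens it further.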
Building the dashboard
Tool selection
You do not need a custom-built analytics platform. Most PLG teams use one of these:
- Amplitude: Best for product analytics with built-in funnel and cohort analysis. Strong for activation and engagement metrics.
- Mixpanel: Similar to Amplitude, with good event-based tracking. Slightly easier to set up for smaller teams.
- PostHog: Open-source alternative with session recording, feature flags, and product analytics in one platform. Good for teams that want to own their data.
- Looker / Metabase + data warehouse: Best for teams that want full control over metric definitions and need to combine product data with revenue data from Stripe or billing systems.
Dashboard layout
Organize your dashboard into the four layers. For each metric, show:
- Current value (this month or rolling 30 days)
- Trend (last 3-6 months, line chart)
- Segmentation (by plan, by acquisition channel, by user role)
Put activation rate and NRR at the top. These are the two metrics that most directly predict long-term PLG health.
Review cadence
- Weekly: Signup rate, activation rate, trial-to-paid conversion
- Monthly: All 12 metrics, with cohort analysis on activation and retention
- Quarterly: Deep dive into NRR decomposition (expansion, contraction, churn) and payback period trends
Common mistakes
Tracking too many metrics: If your dashboard has more than 15 metrics, nobody will look at it. Start with the 12 listed here and add only when you have a specific question a new metric would answer.
Vanity metrics: Signups, page views, and total registered users feel good but do not indicate product health. Always pair volume metrics with quality metrics (signup rate with activation rate, total users with DAU/MAU).
No segmentation: Averages hide the truth. A 25% activation rate might mean enterprise users activate at 45% and SMB users at 10%. Without segmentation, you will optimize for the wrong audience.
Measuring too late: If you wait until revenue metrics are declining to investigate, you have missed 2-3 months of upstream signals. Activation rate and engagement metrics are leading indicators. Revenue is lagging.
Not connecting metrics to actions: Every metric on your dashboard should have an owner and a hypothesis for how to improve it. A dashboard without accountability is just a screensaver.
The best PLG dashboards are not the ones with the most metrics. They are the ones where every number on the screen is connected to a specific team, a specific initiative, and a specific expected outcome.