A focused PM metrics exercise: north star definition, supporting hierarchy, explicit rejections, and guardrails for two products solving collaborative work management in fundamentally different ways.
This is not an endorsement of how either company actually measures itself — it's how I'd think through the metric question if asked in an interview or brought onto the team.
Step 1
Asana wants to be the operating layer for how work gets done across teams — not just a task list, but the coordination system that replaces the status-update meeting. The metric question is: are we measuring tasks, or actual team coordination?
Step 2
North Star
Weekly Active Teams (WAT)
Definition
Teams with ≥2 members that complete ≥5 coordinated tasks through Asana in a rolling 7-day window.
Asana's core value is coordination, not individual productivity. A team of 10 running their weekly cycles in Asana is worth orders of magnitude more than 10 individual users tracking personal to-dos. Weekly captures the natural rhythm of work — teams coordinate in weekly sprints, not daily or monthly cycles. The task completion floor (≥5) filters out ghost teams who open Asana but don't actually run work through it.
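The WAT definition above is concrete enough to compute from an event log. A minimal sketch, assuming a hypothetical list of task-completion events with team_id, user_id, and completed_at fields (not Asana's actual schema); here "≥2 members" is read as ≥2 distinct completers, per the team-quality guardrail later on:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def weekly_active_teams(events, as_of, min_members=2, min_tasks=5):
    """Count teams with >= min_members distinct task completers and
    >= min_tasks completed tasks in the rolling 7-day window ending at as_of.

    events: iterable of dicts with team_id, user_id, completed_at (datetime).
    Field names are illustrative assumptions, not a real Asana export.
    """
    window_start = as_of - timedelta(days=7)
    per_team = defaultdict(lambda: {"users": set(), "tasks": 0})
    for e in events:
        if window_start <= e["completed_at"] <= as_of:
            stats = per_team[e["team_id"]]
            stats["users"].add(e["user_id"])   # distinct active members
            stats["tasks"] += 1                # completed-task floor
    return sum(
        1 for s in per_team.values()
        if len(s["users"]) >= min_members and s["tasks"] >= min_tasks
    )
```

The thresholds are parameters on purpose: the ≥2 / ≥5 floors are judgment calls, and making them tunable lets you re-run the count under stricter definitions to see how much of WAT is near the cutoff.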
Step 3
The metrics that feed into the north star, organized by what they measure and when they signal a problem.
Setup & Adoption
Engagement Depth
Retention & Expansion
Step 4
Explicit rejections matter as much as what you pick. A bad north star optimizes for the wrong thing — often while looking great on a dashboard.
Monthly Active Users
Users can log in, check a task, and leave without coordinating anything. An individual checking off a personal to-do doesn't represent the coordination value Asana's pricing and positioning are built on.
Tasks Created
Creating tasks without completing or coordinating them is a pure vanity metric. It measures input, not outcome. A team can create 100 tasks and abandon every one of them.
NPS
Sentiment without behavior. A team that rates Asana 9/10 but only opens it twice a month is not the retained customer the business needs. NPS tells you how people feel, not how they work.
ARR
Revenue is a lagging indicator by 6–12 months. By the time ARR drops, the coordination behavior you needed to protect had already disappeared. Don't manage a leading problem with a lagging metric.
Step 5
Counter-metrics that prevent gaming the north star. Every metric can be gamed — guardrails are how you make gaming visible.
⚠ Team quality floor
Don't count teams where only 1 member is genuinely active. WAT requires real multi-person coordination — not solo users on team licenses. Track single-active-member teams separately as a churn risk signal.
⚠ Board abandonment rate
Projects set up but never actively used after week 1. A spike in WAT alongside a spike in abandonment means teams are trying and leaving, not converting. These two metrics should be read together.
⚠ Template dependency ratio
Teams using only pre-built templates without ever building custom workflows haven't internalized Asana into how they actually work. This is shallow adoption — it looks like engagement but doesn't retain.
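The first two guardrails reduce to simple counts over a per-team activity summary. A hedged sketch, assuming hypothetical fields (active_members, projects_created, projects_active_after_week1) rather than any real Asana reporting schema:

```python
def guardrail_flags(team_stats):
    """Compute guardrail reads to put next to WAT on the dashboard.

    team_stats: list of per-team dicts with active_members,
    projects_created, and projects_active_after_week1.
    All field names are illustrative assumptions.
    """
    # Team quality floor: teams on a team license with one genuinely
    # active member are a churn-risk signal, not coordination.
    single_member = sum(1 for t in team_stats if t["active_members"] == 1)

    # Board abandonment rate: projects set up but not used past week 1.
    created = sum(t["projects_created"] for t in team_stats)
    abandoned = sum(
        t["projects_created"] - t["projects_active_after_week1"]
        for t in team_stats
    )
    abandonment_rate = abandoned / created if created else 0.0

    return {
        "single_active_member_teams": single_member,
        "board_abandonment_rate": abandonment_rate,
    }
```

Reading these next to WAT is the point: a WAT spike with a rising abandonment rate means teams are trying and leaving, not converting.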
Step 6
Public data points that inform how the north star is likely trending — and what questions I'd be asking if I were on the team.
ARR (FY2024): ~$721M (publicly reported)
Revenue growth: ~10% YoY (down from 35% in FY2022)
NRR direction: compressing (was >115%, now under pressure)
Fastest-growing segment: >$50K deals (moving upmarket intentionally)