โ† Back to list
majiayu000

product-analytics

by majiayu000

🚀 39+ battle-tested Claude Code skills & 9 specialized agents for professional software development. The most comprehensive skill library for Claude Code.

โญ 2๐Ÿด 1๐Ÿ“… Jan 25, 2026

SKILL.md


---
name: product-analytics
description: Product analytics and growth expert. Use when designing event tracking, defining metrics, running A/B tests, or analyzing retention. Covers AARRR framework, funnel analysis, cohort analysis, and experimentation.
---

Product Analytics

Core Principles

  • Metrics over vanity — Focus on actionable metrics tied to business outcomes
  • Data-driven decisions — Hypothesize, measure, learn, iterate
  • User-centric measurement — Track behavior, not just pageviews
  • Statistical rigor — Understand significance, avoid false positives
  • Privacy-first — Respect user data, comply with GDPR/CCPA
  • North Star focus — Align all teams around one key metric

Hard Rules (Must Follow)

These rules are mandatory. Violating them means the skill is not working correctly.

No PII in Events

Events must NEVER contain personally identifiable information.

// โŒ FORBIDDEN: PII in event properties
track('user_signed_up', {
  email: 'user@example.com',     // PII!
  name: 'John Doe',              // PII!
  phone: '+1234567890',          // PII!
  ip_address: '192.168.1.1',     // PII!
  credit_card: '4111...',        // NEVER!
});

// ✅ REQUIRED: Anonymized/hashed identifiers only
track('user_signed_up', {
  user_id: hash('user@example.com'),  // Hashed
  plan: 'pro',
  source: 'organic',
  country: 'US',                       // Broad location OK
});

// Masking utilities
const maskEmail = (email) => {
  const [name, domain] = email.split('@');
  return `${name[0]}***@${domain}`;
};

Object_Action Event Naming

All event names must follow the object_action snake_case format.

// โŒ FORBIDDEN: Inconsistent naming
track('signup');                    // No object
track('newProject');                // camelCase
track('Upload File');               // Spaces and PascalCase
track('user-created');              // kebab-case
track('BUTTON_CLICKED');            // SCREAMING_CASE

// ✅ REQUIRED: object_action snake_case
track('user_signed_up');
track('project_created');
track('file_uploaded');
track('payment_completed');
track('checkout_started');

Actionable Metrics Only

Track metrics that drive decisions, not vanity metrics.

// โŒ FORBIDDEN: Vanity metrics without context
track('page_viewed');               // No insight
track('button_clicked');            // Too generic
track('app_opened');                // Doesn't indicate value

// ✅ REQUIRED: Actionable metrics tied to outcomes
track('feature_activated', {
  feature: 'dark_mode',
  time_to_activation_hours: 2.5,
  user_segment: 'power_user',
});

track('checkout_completed', {
  order_value: 99.99,
  items_count: 3,
  payment_method: 'credit_card',
  coupon_applied: true,
});

Statistical Rigor for Experiments

A/B tests must have proper sample size and significance thresholds.

// โŒ FORBIDDEN: Drawing conclusions too early
// "After 100 users, variant B has 5% higher conversion!"
// This is not statistically significant.

// ✅ REQUIRED: Proper experiment setup
const experimentConfig = {
  name: 'new_checkout_flow',
  hypothesis: 'New flow increases conversion by 10%',

  // Statistical requirements
  significance_level: 0.05,      // 95% confidence
  power: 0.80,                   // 80% power
  minimum_detectable_effect: 0.10, // 10% lift

  // Calculated sample size
  sample_size_per_variant: 3842,

  // Guardrails
  max_duration_days: 14,
  stop_if_degradation: -0.05,    // Stop if 5% worse
};
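The `sample_size_per_variant` figure above comes from a power calculation. A minimal sketch of the standard two-proportion sample-size formula in plain JavaScript (the function name is illustrative, and the normal quantiles for alpha = 0.05 two-sided and power = 0.80 are hard-coded to stay dependency-free; a real setup would use a power-analysis tool):

```javascript
// Approximate per-variant sample size for a two-proportion z-test.
const Z_ALPHA = 1.96;   // z at 1 - alpha/2 (alpha = 0.05, two-sided)
const Z_BETA = 0.8416;  // z at the desired power (0.80)

function sampleSizePerVariant(baselineRate, relativeLift) {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const varianceSum = p1 * (1 - p1) + p2 * (1 - p2);
  const delta = p2 - p1;
  return Math.ceil(((Z_ALPHA + Z_BETA) ** 2 * varianceSum) / (delta * delta));
}

// e.g. a 30% baseline conversion with a 10% relative MDE:
console.log(sampleSizePerVariant(0.30, 0.10)); // 3760 users per variant
```

The required size is very sensitive to the baseline rate and the MDE, which is why both must be fixed before the experiment starts rather than adjusted mid-test.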

Quick Reference

When to Use What

| Scenario | Framework/Tool | Key Metric |
|---|---|---|
| Overall product health | North Star Metric | Time spent listening (Spotify), Nights booked (Airbnb) |
| Growth optimization | AARRR (Pirate Metrics) | Conversion rates per stage |
| Feature validation | A/B Testing | Statistical significance (p < 0.05) |
| User engagement | Cohort Analysis | Day 1/7/30 retention rates |
| Conversion optimization | Funnel Analysis | Drop-off rates per step |
| Feature impact | Attribution Modeling | Multi-touch attribution |
| Experiment success | Statistical Testing | Power, significance, effect size |

North Star Metric

Definition

A North Star Metric is the one metric that best captures the core value your product delivers to customers. When this metric grows sustainably, your business succeeds.

Characteristics of Good NSMs

✓ Captures product value delivery
✓ Correlates with revenue/growth
✓ Measurable and trackable
✓ Movable by product/engineering
✓ Understandable by entire org
✓ Leading (not lagging) indicator

Examples by Company

| Company | North Star Metric | Why It Works |
|---|---|---|
| Spotify | Time Spent Listening | Core value = music enjoyment |
| Airbnb | Nights Booked | Revenue driver + value delivered |
| Slack | Daily Active Teams | Engagement = product stickiness |
| Facebook | Monthly Active Users | Network effect foundation |
| Amplitude | Weekly Learning Users | Value = analytics insights |
| Dropbox | Active Users Sharing Files | Core product behavior |

NSM Framework

North Star Metric
       ↓
┌──────┴──────┬──────────┬──────────┐
│             │          │          │
Input 1    Input 2   Input 3   Input 4
(Supporting metrics that drive NSM)

Example: Spotify
NSM: Time Spent Listening
├── Daily Active Users
├── Playlists Created
├── Songs Added to Library
└── Share/Social Actions

How to Define Your NSM

  1. Identify core value proposition

    • What job does your product do for users?
    • When do users get "aha!" moment?
  2. Find the metric that represents this value

    • Transaction completed? (e.g., Nights Booked)
    • Time engaged? (e.g., Time Listening)
    • Content created? (e.g., Messages Sent)
  3. Validate it correlates with business success

    • When NSM increases, does revenue increase?
    • Can product changes move this metric?
  4. Define supporting input metrics

    • What user behaviors drive NSM?
    • Break into 3-5 key inputs

AARRR Framework (Pirate Metrics)

Overview

The AARRR framework tracks the customer lifecycle across five stages:

ACQUISITION → ACTIVATION → RETENTION → REFERRAL → REVENUE

Stage Definitions

1. Acquisition

When users discover your product

Key Questions:

  • Where do users come from?
  • Which channels have best quality users?
  • What's the cost per acquisition (CPA)?

Metrics:

• Website visitors
• App installs
• Sign-ups per channel
• Cost per acquisition (CPA)
• Channel conversion rates

Example Events:

// Landing page view
track('page_viewed', {
  page: 'landing',
  utm_source: 'google',
  utm_medium: 'cpc',
  utm_campaign: 'brand_search'
});

// Sign-up started
track('signup_started', {
  source: 'homepage_cta'
});

2. Activation

When users experience core product value

Key Questions:

  • What's the "aha!" moment?
  • How long to first value?
  • What % reach activation?

Metrics:

• Time to first action
• Activation rate (% completing key action)
• Setup completion rate
• Feature adoption rate

Example "Aha!" Moments:

Slack:     Send 2,000 messages in team
Twitter:   Follow 30 users
Dropbox:   Upload first file
LinkedIn:  Connect with 5 people

Example Events:

// Activation milestone
track('activated', {
  user_id: 'usr_123',
  activation_action: 'first_project_created',
  time_to_activation_hours: 2.5
});

3. Retention

When users keep coming back

Key Questions:

  • What's Day 1/7/30 retention?
  • Which cohorts retain best?
  • What drives churn?

Metrics:

• Day 1/7/30 retention rate
• Weekly/Monthly active users (WAU/MAU)
• Churn rate
• Usage frequency
• Feature stickiness (DAU/MAU)

Retention Calculation:

Day X Retention = Users returning on Day X / Total users in cohort

Example:
Cohort: 1000 users signed up Jan 1
Day 7: 300 returned
Day 7 Retention = 300/1000 = 30%
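The cohort arithmetic above can be sketched in plain JavaScript (the data shape and function name are illustrative; real pipelines would compute this from stored events):

```javascript
// Day-X retention from raw activity: `activity` maps a userId to the
// day offsets (days since signup) on which that user was active.
function retentionOnDay(activity, dayX) {
  const ids = Object.keys(activity);
  if (ids.length === 0) return 0;
  const returned = ids.filter((id) => activity[id].includes(dayX)).length;
  return returned / ids.length;
}

const cohort = {
  usr_1: [0, 1, 7],
  usr_2: [0, 3],
  usr_3: [0, 7, 30],
  usr_4: [0],
};
console.log(retentionOnDay(cohort, 7)); // 0.5: 2 of 4 users returned on day 7
```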

Example Events:

// Daily engagement
track('session_started', {
  user_id: 'usr_123',
  session_count: 42,
  days_since_signup: 15
});

4. Referral

When users recommend your product

Key Questions:

  • What's the viral coefficient (K-factor)?
  • Which users refer most?
  • What referral incentives work?

Metrics:

• Viral coefficient (K-factor)
• Referral rate (% users referring)
• Invites sent per user
• Invite conversion rate
• Net Promoter Score (NPS)

Viral Coefficient:

K = (% users who refer) × (avg invites per user) × (invite conversion rate)

Example:
K = 0.20 × 5 × 0.30 = 0.30

K > 1: Viral growth (each user brings >1 new user)
K < 1: Need paid acquisition
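The K-factor formula translates directly to code. A minimal sketch (function and parameter names are illustrative):

```javascript
// Viral coefficient: expected new users each existing user brings in.
function viralCoefficient(referringShare, invitesPerReferrer, inviteConversionRate) {
  return referringShare * invitesPerReferrer * inviteConversionRate;
}

// The example above: 20% of users refer, 5 invites each, 30% convert.
console.log(viralCoefficient(0.20, 5, 0.30)); // 0.3, so paid acquisition is still needed
```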

Example Events:

// Referral actions
track('invite_sent', {
  user_id: 'usr_123',
  channel: 'email',
  recipients: 3
});

track('referral_converted', {
  referrer_id: 'usr_123',
  new_user_id: 'usr_456',
  channel: 'email'
});

5. Revenue

When users generate business value

Key Questions:

  • What's customer lifetime value (LTV)?
  • What's LTV:CAC ratio?
  • Which segments monetize best?

Metrics:

• Monthly Recurring Revenue (MRR)
• Average Revenue Per User (ARPU)
• Customer Lifetime Value (LTV)
• LTV:CAC ratio
• Conversion to paid
• Revenue churn

LTV Calculation:

LTV = ARPU × Gross Margin / Churn Rate

Example:
ARPU: $50/month
Gross Margin: 80%
Churn: 5%/month

LTV = $50 × 0.80 / 0.05 = $800

Healthy LTV:CAC ratio: 3:1 or higher
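The LTV formula above, sketched in plain JavaScript (function name is illustrative):

```javascript
// Margin-adjusted LTV: monthly gross profit per user divided by
// monthly churn (average lifetime in months = 1 / churn rate).
function ltv(arpuMonthly, grossMargin, monthlyChurnRate) {
  return (arpuMonthly * grossMargin) / monthlyChurnRate;
}

// The example above: $50 ARPU, 80% margin, 5% monthly churn.
console.log(ltv(50, 0.80, 0.05)); // 800 (dollars)
```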

Example Events:

// Revenue events
track('subscription_started', {
  user_id: 'usr_123',
  plan: 'pro',
  mrr: 29.99,
  billing_cycle: 'monthly'
});

track('upgrade_completed', {
  user_id: 'usr_123',
  from_plan: 'basic',
  to_plan: 'pro',
  mrr_change: 20.00
});

AARRR Metrics Dashboard

## Acquisition
- Total visitors: 50,000
- Sign-ups: 2,500 (5% conversion)
- Top channels: Organic (40%), Paid (30%), Referral (20%)

## Activation
- Activated users: 1,750 (70% of sign-ups)
- Time to activation: 3.2 hours (median)
- Activation funnel drop-off: 30% at setup step 2

## Retention
- Day 1: 60%
- Day 7: 35%
- Day 30: 20%
- Churn: 5%/month

## Referral
- K-factor: 0.4
- Users referring: 15%
- Invites per user: 4.2
- Invite conversion: 25%

## Revenue
- MRR: $125,000
- ARPU: $50
- LTV: $800
- LTV:CAC: 4:1
- Conversion to paid: 25%

Key Metrics & Formulas

Engagement Metrics

Daily Active Users (DAU)
= Unique users performing key action per day

Monthly Active Users (MAU)
= Unique users performing key action per month

Stickiness = DAU / MAU × 100%
• 20%+ = Good (users engage 6+ days/month)
• 10-20% = Average
• <10% = Low engagement

Session Duration
= Average time between session start and end

Session Frequency
= Average sessions per user per time period
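The stickiness ratio above as a one-liner (function name is illustrative):

```javascript
// Stickiness: the share of monthly actives who show up on a given day.
function stickiness(dau, mau) {
  if (mau === 0) return 0;
  return dau / mau;
}

// 2,000 daily actives out of 10,000 monthly actives:
console.log(stickiness(2000, 10000)); // 0.2, i.e. 20%: "good" by the scale above
```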

Retention Metrics

Retention Rate (Classic)
= Users active in Week N / Users in original cohort

Retention Rate (Bracket)
= Users active in Week N / Users active in Week 0

Churn Rate
= (Users at start - Users at end) / Users at start

Quick Ratio (Growth Health)
= (New MRR + Expansion MRR) / (Churned MRR + Contraction MRR)
• >4 = Excellent growth
• 2-4 = Good
• <1 = Shrinking
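The quick ratio, sketched in plain JavaScript (function name and sample figures are illustrative):

```javascript
// Quick ratio: recurring revenue gained vs. lost over the same period.
function quickRatio(newMrr, expansionMrr, churnedMrr, contractionMrr) {
  return (newMrr + expansionMrr) / (churnedMrr + contractionMrr);
}

// $10k new + $2k expansion against $2.5k churned + $0.5k contraction:
console.log(quickRatio(10000, 2000, 2500, 500)); // 4: "excellent" on the scale above
```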

Conversion Metrics

Conversion Rate
= (Conversions / Total visitors) × 100%

Funnel Conversion
= (Users completing final step / Users entering funnel) × 100%

Time to Convert
= Median time from first touch to conversion

Revenue Metrics

Monthly Recurring Revenue (MRR)
= Sum of all monthly subscription values

Annual Recurring Revenue (ARR)
= MRR × 12

Average Revenue Per User (ARPU)
= Total revenue / Number of users

Customer Lifetime Value (LTV)
= ARPU × Average customer lifetime (months)
OR
= ARPU × Gross Margin % / Monthly Churn Rate

Customer Acquisition Cost (CAC)
= Total sales & marketing spend / New customers acquired

LTV:CAC Ratio
= LTV / CAC
• >3:1 = Healthy
• 1:1 = Unsustainable

Payback Period
= CAC / (ARPU ร— Gross Margin %)
• <12 months = Good
• 12-18 months = Acceptable
• >18 months = Concerning
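The LTV:CAC and payback formulas above, combined into a small sketch (function names are illustrative):

```javascript
// Unit-economics checks built from the formulas above.
function ltvToCac(ltv, cac) {
  return ltv / cac;
}

function paybackMonths(cac, arpuMonthly, grossMargin) {
  return cac / (arpuMonthly * grossMargin);
}

// $800 LTV against a hypothetical $200 CAC, $50 ARPU at 80% margin:
console.log(ltvToCac(800, 200));           // 4: above the 3:1 healthy threshold
console.log(paybackMonths(200, 50, 0.80)); // 5 months to recover CAC: "good"
```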

Event Tracking Best Practices

Event Naming Convention

Object + Action pattern (recommended)

✓ user_signed_up
✓ project_created
✓ file_uploaded
✓ payment_completed

✗ signup (unclear)
✗ new_project (inconsistent)
✗ Upload File (inconsistent case)

Event Properties Structure

// Standard event structure
{
  event: "checkout_completed",        // Event name
  timestamp: "2025-12-16T10:30:00Z",  // When
  user_id: "usr_123",                 // Who
  session_id: "ses_abc",              // Session context
  properties: {                       // Event-specific data
    order_id: "ord_789",
    total_amount: 99.99,
    currency: "USD",
    item_count: 3,
    payment_method: "credit_card",
    coupon_used: true,
    discount_amount: 10.00
  },
  context: {                          // Global context
    app_version: "2.4.1",
    platform: "web",
    user_agent: "...",
    ip: "192.168.1.0",                // truncated: full IPs are PII (see Hard Rules)
    locale: "en-US"
  }
}

Critical Events to Track

## User Lifecycle
- user_signed_up
- user_activated (first key action)
- user_onboarded (completed setup)
- user_upgraded (plan change)
- user_churned (canceled/inactive)

## Feature Usage
- feature_viewed
- feature_used
- feature_completed

## Commerce
- product_viewed
- product_added_to_cart
- checkout_started
- payment_completed
- order_fulfilled

## Engagement
- session_started
- session_ended
- page_viewed
- search_performed
- content_shared

## Errors
- error_occurred
- payment_failed
- api_error

Privacy & Compliance

// ✓ GOOD: No PII in events
track('user_signed_up', {
  user_id: hashUserId('user@example.com'),  // Hashed
  plan: 'pro',
  source: 'organic'
});

// ✗ BAD: Contains PII
track('user_signed_up', {
  email: 'user@example.com',  // PII!
  password: '...',            // Never log!
  credit_card: '...'          // Never log!
});

// Masking strategies
const maskEmail = (email) => {
  const [name, domain] = email.split('@');
  return `${name[0]}***@${domain}`;
};

const maskCard = (card) => `****${card.slice(-4)}`;
