Cross-Tool Flow

From Zero to Data-Driven: Product Analytics Setup

A complete workflow for setting up product analytics: define metrics, implement tracking, validate data quality, create dashboards, and establish monitoring rituals that drive decisions.

Workflow Overview

  1. Define Metrics
  2. Set Up Infrastructure
  3. Implement Tracking
  4. Validate Data
  5. Create Dashboards
  6. Monitor & Iterate

2-4 weeks: initial setup duration
3-5 tools: used across the workflow
Ongoing: continuous monitoring
Phase 1: Define Key Metrics & Events

Identify what to measure before implementing anything

πŸ› οΈ Tools for This Phase

Documentation
Notion, Google Docs, Confluence
Collaboration
Miro, FigJam (for mapping)

Step-by-Step Process

  1. Identify business goals

     What are you trying to achieve? Growth, retention, monetization?

  2. Define your North Star Metric

     The one metric that best represents the value delivered to customers.

  3. Map user journey stages

     Awareness β†’ Activation β†’ Engagement β†’ Retention β†’ Revenue

  4. Define events for each stage

     Which user actions indicate progress? Start with 10-20 critical events.

  5. Create an event taxonomy

     Naming convention: Object_Action (e.g., Report_Created, Dashboard_Viewed)
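The taxonomy from steps 4-5 can be captured as a lightweight tracking plan that the naming convention is checked against. A minimal sketch; the event names and properties below are illustrative examples, not a prescribed list:

```javascript
// Minimal tracking-plan sketch: one entry per event, Object_Action names.
// All event names and properties here are illustrative.
const trackingPlan = {
  Account_SignedUp: {
    stage: 'activation',
    properties: ['signup_source', 'plan_type'],
    why: 'Entry point of the activation funnel',
  },
  Report_Created: {
    stage: 'engagement',
    properties: ['report_type', 'template_used'],
    why: 'Core value-delivering action',
  },
};

// Enforce the Object_Action naming convention before the plan grows.
function isValidEventName(name) {
  return /^[A-Z][a-zA-Z]*_[A-Z][a-zA-Z]*$/.test(name);
}

const invalid = Object.keys(trackingPlan).filter((n) => !isValidEventName(n));
// `invalid` lists any plan entries that break the convention
```

Keeping the plan in code (or a shared doc exported to code) lets you lint event names in CI instead of catching them in dashboards later.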

πŸ’‘ Pro Tip

Start with activation metrics first. Most teams over-index on acquisition and under-index on "aha moments." Track what makes users successful, not just what gets them in the door.

πŸ€– AI Prompt Template: Define Events

I'm setting up product analytics for [PRODUCT DESCRIPTION].

Our North Star Metric is: [METRIC]
Our business goal is: [GOAL]

Help me define the critical events to track across this user journey:
1. Awareness (how they find us)
2. Activation (first value delivered)
3. Engagement (ongoing usage)
4. Retention (coming back)
5. Revenue (monetization)

For each stage, suggest:
- 3-5 key events to track
- Event naming using Object_Action format
- Properties to capture with each event
- Why this event matters

Focus on events that indicate user success, not just activity.

How to use: Replace bracketed items. Review AI suggestions and validate with team. Prioritize events that indicate value delivered.

πŸ“€ Output from This Phase

  • βœ“ North Star Metric defined
  • βœ“ User journey mapped with stages
  • βœ“ 10-20 critical events documented with properties
  • βœ“ Event taxonomy and naming convention established
Phase 2: Set Up Tracking Infrastructure

Choose and configure your analytics platform and data pipeline

πŸ› οΈ Tools for This Phase

Analytics Platform
Amplitude, Mixpanel, PostHog
Data Pipeline (Optional)
Segment, RudderStack

Decision: Direct vs CDP

Direct Integration
Send events directly to your analytics tool
Pros: Simpler, faster setup, lower cost
Cons: Harder to switch tools later
Best for: Single analytics tool, early stage
Customer Data Platform
Send events to Segment/RudderStack, distribute to tools
Pros: Easy to add tools, single integration
Cons: More complex, additional cost
Best for: Multiple tools, flexibility

Setup Steps

  1. Create account and project

     Set up separate projects for dev, staging, and production.

  2. Install the SDK or library

     Add the JavaScript SDK, mobile SDK, or server-side library to your codebase.

  3. Configure user identification

     Set up user IDs, device IDs, and anonymous tracking.

  4. Set up environments

     Separate production data from test data with distinct API keys or projects.
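Steps 1 and 4 usually come down to selecting a per-environment API key at initialization time. A sketch with placeholder key values; the commented `amplitude.init` line shows assumed Amplitude Browser SDK usage, so check your own SDK's docs:

```javascript
// Pick the analytics API key per environment so test traffic never lands
// in the production project. Key values here are placeholders.
const ANALYTICS_KEYS = {
  development: 'dev-key-placeholder',
  staging: 'staging-key-placeholder',
  production: 'prod-key-placeholder',
};

function apiKeyFor(env) {
  const key = ANALYTICS_KEYS[env];
  if (!key) throw new Error(`No analytics key configured for env: ${env}`);
  return key;
}

// Assumed usage with Amplitude's browser SDK (verify against its docs):
//   amplitude.init(apiKeyFor(process.env.NODE_ENV));
```

Failing loudly on an unknown environment is deliberate: silently defaulting to the production key is how test data pollutes real metrics.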

πŸ“€ Output from This Phase

  • βœ“ Analytics platform account created with dev/prod projects
  • βœ“ SDK installed in codebase
  • βœ“ User identification configured
Phase 3: Implement Event Tracking

Add tracking code for your defined events

πŸ› οΈ Implementation Approach

Start Small, Iterate

Don't try to track everything at once. Roll out in phases:

  1. Week 1: Core activation events (sign-up, first key action)
  2. Week 2: Engagement events (feature usage)
  3. Week 3: Retention and revenue events

Code Example: Event Tracking

// Example: Amplitude event tracking
amplitude.track('Report_Created', {
  report_type: 'sales_dashboard',
  template_used: true,
  data_source: 'salesforce',
  created_by_role: 'manager'
});

// Example: Mixpanel event tracking
mixpanel.track('Dashboard_Viewed', {
  dashboard_id: '12345',
  dashboard_name: 'Executive Summary',
  viewed_by_role: 'admin',
  load_time_ms: 234
});

Best Practices

Use consistent naming

Use the Object_Action format: Dashboard_Viewed, not "user viewed dashboard"

Include context properties

Who, what, where, when. User role, item ID, source page, timestamp.

Track user properties

Plan type, signup date, company sizeβ€”helps with segmentation

Document as you go

Update your tracking plan with actual implementation details
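The practices above can be enforced with a thin wrapper around your SDK's track call. A sketch under assumptions: `analyticsClient` stands in for whatever SDK you use (Amplitude, Mixpanel, PostHog), and the context values are illustrative:

```javascript
// Thin wrapper that enforces Object_Action naming and attaches shared
// context properties to every event. `sent` captures events for the demo;
// a real client would transmit them instead.
const sent = [];
const analyticsClient = { track: (name, props) => sent.push({ name, props }) };

const defaultContext = {
  app_version: '1.4.2',   // illustrative values, not from the source
  user_role: 'manager',
  source_page: '/dashboards',
};

function track(eventName, properties = {}) {
  if (!/^[A-Z][a-zA-Z]*_[A-Z][a-zA-Z]*$/.test(eventName)) {
    throw new Error(`Event name must be Object_Action: ${eventName}`);
  }
  // Merge context first so call-site properties can override it.
  analyticsClient.track(eventName, {
    ...defaultContext,
    ...properties,
    timestamp: Date.now(),
  });
}

track('Dashboard_Viewed', { dashboard_id: '12345' });
```

Routing every event through one function also gives you a single place to update when the tracking plan changes.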

πŸ“€ Output from This Phase

  • βœ“ 10-20 events implemented in code
  • βœ“ Events sending to analytics platform
  • βœ“ User and event properties configured
  • βœ“ Implementation documented
Phase 4: Validate Data Quality

Ensure tracking is working correctly before making decisions

⚠️ Critical Step

Bad data is worse than no data. Teams lose trust in analytics when numbers don't make sense. Invest time here to prevent months of wasted effort.

Validation Checklist

1. Test in Development
  • Trigger events manually and verify they appear in your analytics tool
  • Check that event names, properties, and values are correct
  • Verify user identification is working
2. Run QA Tests
  • Complete full user journeys in a staging environment
  • Verify that events fire in the correct order
  • Test edge cases (logged out, errors, mobile vs desktop)
3. Monitor Production Data
  • Watch for missing events (expected but not firing)
  • Check for duplicate events (firing multiple times)
  • Validate that volumes match expectations
  • Compare against existing analytics (GA, server logs) as a sanity check
4. Set Up Alerts
  • Critical events dropping to zero (broken tracking)
  • Unusual spikes (bot traffic, bugs)
  • Property values outside the expected range
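The missing-event and duplicate-event checks can be automated with a small audit over a window of received events. A sketch, assuming each event carries an insert ID for deduplicating retries; the event names and sample data are illustrative:

```javascript
// Data-quality sketch: compare events received in a time window against
// the expected event list, flagging missing and duplicate events.
const expectedEvents = ['Account_SignedUp', 'Report_Created', 'Dashboard_Viewed'];

function auditEvents(received) {
  // received: [{ name, insertId }] where insertId dedupes client retries
  const counts = {};
  for (const e of received) {
    const key = `${e.name}:${e.insertId}`;
    counts[key] = (counts[key] || 0) + 1;
  }
  const seen = new Set(received.map((e) => e.name));
  return {
    missing: expectedEvents.filter((name) => !seen.has(name)),
    duplicates: Object.keys(counts).filter((key) => counts[key] > 1),
  };
}

const report = auditEvents([
  { name: 'Account_SignedUp', insertId: 'a1' },
  { name: 'Report_Created', insertId: 'b1' },
  { name: 'Report_Created', insertId: 'b1' }, // same insertId: a duplicate
]);
```

Running an audit like this daily against the previous day's events turns "watch for missing events" from a ritual into an alert.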

πŸ“€ Output from This Phase

  • βœ“ All events validated in production
  • βœ“ Data quality issues identified and fixed
  • βœ“ Monitoring alerts configured
  • βœ“ Team has confidence in data accuracy
Phase 5: Create Dashboards & Reports

Build views that answer your key questions

Essential Dashboards to Create

1. Activation Dashboard

How many users reach their "aha moment"?

  • Funnel: Sign Up β†’ Key Action β†’ Activation
  • Time to activation (how long does it take?)
  • Activation rate by cohort, source, and feature

2. Engagement Dashboard

Are users getting value from the product?

  • Daily/Weekly/Monthly Active Users (DAU/WAU/MAU)
  • Feature adoption rates
  • Usage frequency and depth
  • Power users vs casual users

3. Retention Dashboard

Do users come back?

  • Retention curves (Day 1, 7, 30, 90)
  • Cohort retention analysis
  • Churn indicators
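Most analytics tools build retention curves for you, but the underlying Day-N calculation is simple enough to sanity-check by hand. A sketch on raw event data; the input shapes and cohort values are illustrative:

```javascript
// Day-N retention sketch: of users who signed up, what fraction was
// active again on day N after signup?
const DAY_MS = 24 * 60 * 60 * 1000;

function dayNRetention(users, n) {
  // users: [{ signupAt, activeAt: [timestamps in ms] }]
  const retained = users.filter((u) =>
    u.activeAt.some((t) => Math.floor((t - u.signupAt) / DAY_MS) === n)
  );
  return users.length ? retained.length / users.length : 0;
}

const cohort = [
  { signupAt: 0, activeAt: [0, 1 * DAY_MS, 7 * DAY_MS] },
  { signupAt: 0, activeAt: [0, 7 * DAY_MS] },
  { signupAt: 0, activeAt: [0] },
];
// Day 1: one of three users returned; Day 7: two of three
```

Comparing a hand-rolled number like this against the tool's retention chart is also a useful data-quality check in its own right.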

4. North Star Metric Dashboard

Your one metric that matters

  • Current value and trend
  • Breakdown by segment
  • Contributing sub-metrics

πŸ€– AI Prompt: Dashboard Design

I need to design an analytics dashboard for [DASHBOARD PURPOSE].

Our key questions are:
1. [QUESTION 1]
2. [QUESTION 2]
3. [QUESTION 3]

Available events:
[LIST YOUR EVENTS]

Help me design this dashboard:
- What charts/visualizations should I include?
- What metrics should each chart show?
- What segmentations are most valuable?
- What time periods to display?

Focus on actionable insights, not vanity metrics.

πŸ“€ Output from This Phase

  • βœ“ 4-6 core dashboards created
  • βœ“ Team can answer key product questions
  • βœ“ Dashboards shared with stakeholders
Phase 6: Establish Monitoring Rituals

Make analytics a habit, not a project

Create Data Rituals

Daily: Data Standup (5 min)

  • Review the North Star Metric
  • Check for anomalies
  • Flag data quality issues

Weekly: Metrics Review (30 min)

  • Deep dive on one key metric
  • Review experiment results
  • Identify insights and action items

Monthly: Analytics Retrospective (60 min)

  • Review what you learned
  • Update tracking for new features
  • Archive unused dashboards
  • Plan next month's focus areas

Continuous Improvement

  • Add tracking for new features as you ship
  • Refine events based on what you learn
  • Sunset metrics you don't use
  • Train team members on analytics tools

πŸ“€ Output from This Phase

  • βœ“ Daily/weekly/monthly data rituals established
  • βœ“ Team makes data-informed decisions
  • βœ“ Analytics evolves with product

Analytics + Core Concepts

Use analytics to apply Value Stream Mapping and Theory of Constraints.

VSM with Analytics

Track time between stages to measure flow:

  • Sign Up β†’ Activation: how long does it take?
  • Feature Discovery β†’ Feature Adoption: where do users drop off?
  • Create funnel reports for each workflow step
  • Measure wait time vs process time
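The stage-to-stage flow time above can be computed directly from raw events. A sketch under assumptions: events are sorted by timestamp, and the event names and sample data are illustrative:

```javascript
// VSM sketch: elapsed time between two journey stages per user, with
// the median as the stage-to-stage flow time.
const HOUR_MS = 60 * 60 * 1000;

function medianHoursBetween(events, fromName, toName) {
  // events: [{ user, name, ts }] with ts in milliseconds, sorted by ts
  const firstByUser = (name) => {
    const first = new Map();
    for (const e of events) {
      if (e.name === name && !first.has(e.user)) first.set(e.user, e.ts);
    }
    return first;
  };
  const starts = firstByUser(fromName);
  const ends = firstByUser(toName);
  const gaps = [...starts]
    .filter(([user, ts]) => ends.has(user) && ends.get(user) >= ts)
    .map(([user, ts]) => (ends.get(user) - ts) / HOUR_MS)
    .sort((a, b) => a - b);
  return gaps.length ? gaps[Math.floor(gaps.length / 2)] : null;
}

const journeyEvents = [
  { user: 'a', name: 'Account_SignedUp', ts: 0 },
  { user: 'b', name: 'Account_SignedUp', ts: 0 },
  { user: 'c', name: 'Account_SignedUp', ts: 0 },
  { user: 'a', name: 'Report_Created', ts: 2 * HOUR_MS },
  { user: 'b', name: 'Report_Created', ts: 6 * HOUR_MS },
  { user: 'c', name: 'Report_Created', ts: 10 * HOUR_MS },
];
```

The median is used rather than the mean because a few users who take weeks would otherwise dominate the flow-time number.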

TOC with Analytics

Find your constraint with data:

  • Funnel analysis shows where users drop off most
  • That's your constraint: fix it first
  • Track improvements: did conversion increase?
  • Find the next constraint and repeat
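Finding the constraint from funnel counts is a one-pass comparison of step-to-step conversion rates. A sketch; the step names and counts are illustrative:

```javascript
// TOC sketch: given ordered funnel counts, find the step with the
// lowest step-to-step conversion rate -- the current constraint.
function findConstraint(funnel) {
  // funnel: [{ step, users }] ordered from top of funnel to bottom
  let worst = null;
  for (let i = 1; i < funnel.length; i++) {
    const rate = funnel[i].users / funnel[i - 1].users;
    if (!worst || rate < worst.rate) {
      worst = { from: funnel[i - 1].step, to: funnel[i].step, rate };
    }
  }
  return worst;
}

const funnel = [
  { step: 'Sign Up', users: 1000 },
  { step: 'Create First Report', users: 400 }, // 40% conversion
  { step: 'Invite Teammate', users: 300 },     // 75% conversion
  { step: 'Upgrade', users: 60 },              // 20%: the biggest drop
];
```

Note this compares relative rates, not absolute drop counts: losing 60% of 1000 users is a smaller conversion problem than losing 80% of 300.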

You're Now Data-Driven! πŸ“Š

You've gone from zero to a complete analytics setup.

πŸ“ Clear Metrics

North Star Metric and key events defined and tracked

πŸ“Š Actionable Dashboards

Activation, engagement, and retention reports ready

πŸ”„ Data Rituals

The team reviews data regularly and makes informed decisions

Next Steps

1. Use analytics to validate features from Research to Backlog

2. Apply VSM to find workflow bottlenecks with data

3. Run experiments and measure impact
