Workbook

Make the Mission Yours

Role: Product Analyst

Use these activities to apply each principle to your current product, service, or project. These activities are a sample to get you started, not an exhaustive list. Adapt and expand them based on your team's context and needs. Capture your answers, share them with your team, and revisit them as you learn.

⚠️

Important: When Using AI Tools

When using AI-assisted activities, always double-check outputs for accuracy and meaning. AI tools can accelerate your work, but human judgment, validation, and critical thinking remain essential.

Review AI-generated content with your team, validate it against real user feedback and domain knowledge, and ensure it truly serves your mission and user outcomes before proceeding.

1) Shared Mission and Vision

Align your analysis to mission outcomes.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 1) Shared Mission and Vision section in the framework.

Workbook Activities (do now)

  • ☐Put the mission metric at the top of your main dashboard and annotate why it matters.
  • ☐List the user outcomes for current work and map which metrics signal movement.
  • ☐Review mission/outcomes with PM/Eng to confirm the questions you should answer.
  • ☐Rewrite one analysis request in mission terms and share back with the requester.
  • ☐Flag one vanity metric and replace it with an outcome-linked metric in your next report.
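One lightweight way to capture the mission-to-metric mapping from the activities above is a small structured record kept next to your dashboard config. This is a sketch, not a prescribed schema; every name below (the mission, outcomes, and metrics) is a hypothetical placeholder.

```python
# Hypothetical mission-to-metric mapping; all names are placeholders.
mission_map = {
    "mission": "Help new users reach their first success quickly",
    "outcomes": [
        {
            "user_outcome": "New users complete setup without help",
            "metrics": ["setup_completion_rate", "time_to_first_success"],
            "why_it_matters": "Setup friction is the top driver of early churn",
        },
        {
            "user_outcome": "Returning users find value weekly",
            "metrics": ["weekly_active_return_rate"],
            "why_it_matters": "Weekly return signals sustained value, not a one-off visit",
        },
    ],
}

def vanity_check(metric: str) -> bool:
    """Flag metrics that appear on dashboards but map to no outcome."""
    mapped = {m for o in mission_map["outcomes"] for m in o["metrics"]}
    return metric not in mapped

print(vanity_check("total_page_views"))       # not outcome-linked: a vanity candidate
print(vanity_check("setup_completion_rate"))  # outcome-linked
```

A check like `vanity_check` makes the fifth activity concrete: anything it flags either gets mapped to an outcome or gets cut from the report.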

AI Assisted Activities

  • ☐Use AI to help draft mission-aligned dashboards or analysis frameworks, but have your team review and refine them to ensure they reflect real user needs and business goals.
  • ☐Ask AI to generate potential outcome metrics for your analysis, then validate each one against direct user feedback and domain knowledge before implementing.
  • ☐Use AI to help structure your mission/outcome mappings in dashboards, but ensure human team members validate that each metric truly serves the mission before tracking.
  • ☐Have AI analyze past analyses to identify mission alignment patterns, then use those insights in team discussions to improve how data connects to user outcomes.

Evidence of Progress

  • ☐Dashboards start with mission/outcome metrics.
  • ☐Analyses explicitly tie back to mission questions.

2) Break Down Silos

Co-define events, metrics, and questions with PM/Eng.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 2) Break Down Silos section in the framework.

Workbook Activities (do now)

  • ☐Run a tracking design session with PM/Eng before implementation; document events and owners.
  • ☐Create a shared metric dictionary and review with QA for validation checks.
  • ☐Join a design review to ensure measurement covers key user behaviors.
  • ☐Hold a 10-minute validation with QA/Eng to confirm events in staging before launch.
  • ☐Replace an async metric debate with a live alignment on definitions for this release.
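A shared metric dictionary works best when each entry names its definition, its source events, an owner, and the QA check agreed in the tracking design session. Here is a minimal sketch; the metric, events, and team names are illustrative, not a real schema.

```python
from dataclasses import dataclass

@dataclass
class MetricEntry:
    """One shared-dictionary entry co-owned by PM, Eng, and QA.

    All field values used below are illustrative placeholders."""
    name: str
    definition: str
    event_source: str  # tracked events this metric is computed from
    owner: str         # who answers questions about this metric
    qa_check: str      # how QA validates the events in staging

metric_dictionary = [
    MetricEntry(
        name="checkout_conversion_rate",
        definition="checkout_completed events / checkout_started events, per session",
        event_source="checkout_started, checkout_completed",
        owner="payments-squad",
        qa_check="Fire both events in staging and confirm they arrive with session_id",
    ),
]

# One definition, one owner, one check: a shared lookup replaces async debates.
by_name = {m.name: m for m in metric_dictionary}
print(by_name["checkout_conversion_rate"].owner)
```

Keeping the `qa_check` field in the same record as the definition makes the 10-minute staging validation a checklist item rather than an afterthought.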

AI Assisted Activities

  • ☐When AI generates tracking plans or metric definitions, have cross-functional team members (PM, engineering, QA) review them together to ensure they serve users and align with mission.
  • ☐Use AI to help draft metric dictionaries or tracking designs, but ensure all roles contribute their perspectives during the actual tracking design sessions.
  • ☐Have AI analyze tracking patterns and metric gaps, then use those insights in cross-functional discussions to improve collaboration.
  • ☐Use AI to help structure metric collaboration sessions, but ensure human team members make decisions together about what to measure and how it serves users.

Evidence of Progress

  • ☐Events/metrics are defined before build and validated after.
  • ☐Fewer tracking gaps found post-release.

3) User Engagement

Ground your analysis in real user behavior and voice.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 3) User Engagement section in the framework.

Workbook Activities (do now)

  • ☐Shadow a support or sales call; capture phrases that explain user intent.
  • ☐Pair one usability session with a metric you own; note alignment or mismatch.
  • ☐Translate a top user complaint into a measurable question and instrument it.
  • ☐Add one user quote to your next chart to anchor the story.
  • ☐Validate a surprising metric trend with a quick qualitative check (support, PM, or user clip).
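Translating a complaint into a measurable question usually means writing down the question and the event spec you hand to Eng. A sketch of that translation, with a hypothetical complaint and placeholder event names:

```python
# Hypothetical example: turning a user complaint into a measurable question
# and an event to instrument. Every name here is a placeholder.
complaint = "I can never find the export button"

measurable_question = (
    "What share of sessions that open the report view "
    "trigger an export within 60 seconds?"
)

# Event spec handed to Eng for instrumentation.
event_spec = {
    "event": "export_clicked",
    "properties": ["session_id", "seconds_since_report_open"],
    "success_signal": "seconds_since_report_open <= 60",
}

print(measurable_question)
print(event_spec["event"])
```

The complaint supplies the user's words for your chart annotation; the spec supplies the number that tells you whether the problem is shrinking.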

AI Assisted Activities

  • ☐Use AI to analyze user feedback, support tickets, or usage data to identify patterns for analysis, but always validate AI insights through direct user engagement or observation.
  • ☐Have AI generate questions for user interviews based on your data assumptions, then use those questions in real conversations with users to build genuine empathy.
  • ☐Use AI to help summarize user research findings for analysis, but ensure you review the summaries and add your own observations from direct user interactions.
  • ☐Have AI analyze user behavior patterns from your data, then discuss those patterns with actual users to understand the "why" behind the behavior before finalizing analysis.

Evidence of Progress

  • ☐You cite user quotes alongside charts.
  • ☐A new instrument/metric came directly from user observation.

4) Outcomes Over Outputs

Deliver concise outcome readouts and next steps.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 4) Outcomes Over Outputs section in the framework.

Workbook Activities (do now)

  • ☐For a release, publish a one-page outcome readout: metric movement, why, next step.
  • ☐Flag when outcomes did not move; propose one hypothesis to test next.
  • ☐Add a simple “decision log” to your report noting what changed because of the data.
  • ☐Create two versions of the readout: exec (outcomes/actions) and squad (details/tests).
  • ☐If the outcome missed, propose a concrete experiment and timing to validate the hypothesis.
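The decision log from the activities above can be as small as a list of dated entries recording what the data showed, what changed because of it, and the next hypothesis. A minimal sketch, with illustrative values throughout:

```python
import datetime

# A minimal decision-log entry for an outcome readout; all values are illustrative.
decision_log = []

def log_decision(metric, movement, decision, next_step):
    decision_log.append({
        "date": datetime.date.today().isoformat(),
        "metric": metric,
        "movement": movement,    # what the data showed
        "decision": decision,    # what changed because of the data
        "next_step": next_step,  # hypothesis or experiment to run next
    })

log_decision(
    metric="onboarding_completion_rate",
    movement="flat after release 2.4",
    decision="paused further onboarding copy changes",
    next_step="A/B test a shorter setup flow; review in two weeks",
)
print(decision_log[0]["decision"])
```

Entries with an empty `decision` field are the tell: analysis that nobody acted on, which is exactly what this principle asks you to notice.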

AI Assisted Activities

  • ☐When AI generates analysis reports or outcome readouts, define outcome metrics upfront and measure whether AI-generated insights achieve intended user outcomes, not just data completeness.
  • ☐Use AI to help analyze outcome data and identify patterns, but have human team members interpret what those patterns mean for users and the mission.
  • ☐Have AI help draft outcome definitions and success criteria for your analyses, but ensure the team validates them against real user needs and business goals before proceeding.
  • ☐Use AI to track and report on outcome metrics, but schedule human team reviews to discuss what the metrics mean and how to adjust analysis based on observed impact.

Evidence of Progress

  • ☐Reports include clear next actions tied to outcomes.
  • ☐Decisions reference your data and hypotheses.

5) Domain Knowledge

Understand data lineage, trust, and the business ecosystem.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 5) Domain Knowledge section in the framework.

Workbook Activities (do now)

  • ☐Document data lineage for one key metric (sources, transforms, owners).
  • ☐Annotate data quality caveats and include them in dashboards.
  • ☐Map front/back stage data touchpoints for a core journey and mark weakest links.
  • ☐Review one regulatory/privacy constraint with security and add a note to the metric definition.
  • ☐Identify the riskiest upstream data source for this report and add a monitor or caveat.
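Documenting lineage for one metric can be a single structured record listing sources, transforms, owners, and the caveats worth surfacing on the dashboard. This sketch uses hypothetical table, team, and metric names:

```python
# Hypothetical lineage record for one metric: sources, transforms, owners,
# plus quality caveats to surface on the dashboard. All names are placeholders.
lineage = {
    "metric": "monthly_recurring_revenue",
    "sources": [
        {"name": "billing_db.invoices", "owner": "billing-eng", "risk": "high",
         "caveat": "Refunds land up to 48h late; month-end totals shift"},
        {"name": "crm.contracts", "owner": "revops", "risk": "low",
         "caveat": None},
    ],
    "transforms": [
        {"step": "dedupe invoices by invoice_id", "owner": "data-platform"},
        {"step": "convert currencies at daily rate", "owner": "data-platform"},
    ],
}

# Surface the riskiest upstream source so it gets a monitor or dashboard caveat.
risk_rank = {"low": 0, "medium": 1, "high": 2}
riskiest = max(lineage["sources"], key=lambda s: risk_rank[s["risk"]])
print(riskiest["name"], "->", riskiest["caveat"])
```

The `riskiest` lookup directly answers the last activity: it names the source that needs a monitor, and its caveat is the note you paste into the metric definition.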

AI Assisted Activities

  • ☐Use AI to help summarize domain documentation, data lineage, or business constraints for analysis, but validate AI-generated domain knowledge through direct engagement with domain experts.
  • ☐Have AI generate questions about domain constraints or data ecosystem relationships, then use those questions in conversations with domain experts to build deep understanding.
  • ☐Use AI to help draft data lineage maps or metric definitions, but ensure team members review them with domain experts to verify accuracy and completeness.
  • ☐Have AI analyze past analyses or domain-related data issues, then discuss those insights with the team and domain experts to identify patterns and prevent similar problems.

Evidence of Progress

  • ☐Metric definitions include lineage and quality notes.
  • ☐Stakeholders know which metrics are trustworthy for decisions.

6) The Art of Storytelling

Turn data into stories people can act on.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 6) The Art of Storytelling section in the framework.

Workbook Activities (do now)

  • ☐Present one insight as a user story with a chart and a quote that illustrates the behavior.
  • ☐Create two versions of a key finding: a deep dive for the squad and a one-slide exec summary.
  • ☐Frame one metric trend as a before/after narrative tied to a release.
  • ☐Add one user clip or verbatim to the slide to make the insight tangible.
  • ☐Record a 60-second voiceover of a key chart explaining what changed, why, and what to do next.

AI Assisted Activities

  • ☐Use AI to help structure or draft data stories and analysis narratives, but refine them with real user anecdotes, emotions, and personal observations from direct user interactions.
  • ☐Have AI generate different versions of data insights for different audiences (technical peers vs executives), but ensure each version includes authentic human stories about real user impact.
  • ☐Use AI to help summarize data work in presentations, but lead with human stories about real users, using AI-generated summaries as supporting material.
  • ☐Have AI help draft analysis reports or data presentations, but always include real user quotes, data points, or anecdotes that connect your analysis to human impact.

Evidence of Progress

  • ☐Stakeholders can retell your finding accurately.
  • ☐Teams take action based on your story-backed insights.