Workbook

Make the Mission Yours

Role: Software Developer

Use these activities to apply each principle to your current product, service, or project. These activities are a sample to get you started, not an exhaustive list. Adapt and expand them based on your team's context and needs. Capture your answers, share them with your team, and revisit them as you learn.

⚠️

Important: When Using AI Tools

When using AI-assisted activities, always double-check outputs for accuracy and meaning. AI tools can help accelerate your work, but human judgment, validation, and critical thinking remain essential.

Review AI-generated content with your team, validate it against real user feedback and domain knowledge, and ensure it truly serves your mission and user outcomes before proceeding.

1) Shared Mission and Vision

Tie your code to mission and user outcomes so you can make decisions without waiting on direction.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 1) Shared Mission and Vision section in the framework.

Workbook Activities (do now)

  • ☐ Rewrite the mission in one paragraph and list two user outcomes your current feature must move.
  • ☐ Annotate the ticket you are starting with the specific user job/task it helps and the signal you will watch.
  • ☐ Add a "why this matters" note to your PR description referencing the user outcome and expected behavior change.
  • ☐ In standup, state one implementation choice you'll make differently because of the mission/outcome.
  • ☐ Pair with a teammate to sanity-check that your chosen approach still serves the stated outcome.
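The "why this matters" PR note above can be just a few lines. One possible shape (the feature, outcome, and metric name here are invented placeholders, not a prescribed format):

```markdown
## Why this matters
Helps users recover from failed address validation at checkout
(user outcome: fewer abandoned checkouts).

**Expected behavior change:** validation errors surface inline instead of failing silently.
**Signal to watch:** rate of `checkout.address_validation_error` for two weeks post-release.
```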

AI Assisted Activities

  • ☐ Use AI to help draft mission statements or outcome mappings for your features, but have your team review and refine them to ensure they reflect real user needs.
  • ☐ Ask AI to generate potential user outcomes for your code changes, then validate each one against direct user feedback and domain knowledge before implementing.
  • ☐ Use AI to help structure your "why this matters" notes in PRs, but ensure human team members validate that each change truly serves the mission before merging.
  • ☐ Have AI analyze past PR descriptions to identify mission alignment patterns, then use those insights in team discussions to improve how code connects to user outcomes.

Evidence of Progress

  • ☐ You can explain any PR in terms of the user outcome it serves.
  • ☐ Your PRs include a short "why" tied to mission/user outcome.

2) Break Down Silos

Work alongside design, QA, and ops instead of throwing work over the wall.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 2) Break Down Silos section in the framework.

Workbook Activities (do now)

  • ☐ Pair with QA to co-create acceptance tests and edge cases before coding.
  • ☐ Run a 15-minute pre-build huddle with design and PM to review intent, data needs, and constraints.
  • ☐ Invite DevOps/ops to review rollout, health checks, and rollback for this change.
  • ☐ Share a WIP snippet with design/QA to catch integration issues early (e.g., states, data, error UX).
  • ☐ Replace one async back-and-forth with a live co-working block to finish a tricky piece together.
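Co-creating acceptance tests with QA can be as lightweight as agreeing on a few executable cases before any feature code exists. A minimal sketch in Python; the `normalize_username` contract and its edge cases are invented for illustration:

```python
# Acceptance tests co-written with QA before coding. The function under
# test and its edge cases are illustrative; replace them with your
# feature's real contract.

def normalize_username(raw: str) -> str:
    """Trim whitespace and lowercase a username; reject blank input."""
    cleaned = raw.strip().lower()
    if not cleaned:
        raise ValueError("username must not be blank")
    return cleaned

# Edge cases surfaced in the pre-build huddle with QA:
def test_trims_and_lowercases():
    assert normalize_username("  Ada ") == "ada"

def test_blank_input_is_rejected():
    try:
        normalize_username("   ")
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for blank input")
```

Writing these cases down together means QA runs the same checks you coded against, which is exactly the "fewer surprises after merge" signal below.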

AI Assisted Activities

  • ☐ When AI generates code, have cross-functional team members (design, QA, ops) review it together to ensure it serves users and integrates well with the system.
  • ☐ Use AI to help draft context sync agendas or technical documentation, but ensure all roles contribute their perspectives during the actual sync.
  • ☐ Have AI analyze code review patterns to identify handoff friction, then use those insights in cross-functional discussions to improve collaboration.
  • ☐ Use AI to help structure collaboration sessions or pair programming, but ensure human team members make decisions together about what to build and how it serves users.

Evidence of Progress

  • ☐ Fewer post-handoff clarifications or reopenings on the story you paired on.
  • ☐ Acceptance tests match what QA runs, with fewer surprises after merge.

3) User Engagement

Keep a line of sight to real users so you build with empathy.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 3) User Engagement section in the framework.

Workbook Activities (do now)

  • ☐ Observe or replay one support/usability session; write three takeaways and one code change you will make now.
  • ☐ Instrument one behavior you are assuming (log/metric) and review it post-release.
  • ☐ Rewrite the current user story as a short narrative (before/after) and share it with the team.
  • ☐ Pair with a designer/PM to validate a user edge case you surfaced; note the decision in the ticket.
  • ☐ Shadow a support query relevant to your feature and capture the exact user phrasing to guide your UX/error handling.
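Instrumenting an assumed behavior does not require a full analytics pipeline. A minimal sketch, assuming a hypothetical export feature and using a plain `Counter` as a stand-in for a real metrics client:

```python
import logging
from collections import Counter

logger = logging.getLogger("feature.export")
usage = Counter()  # stand-in for your real metrics client (e.g., StatsD, Prometheus)

def record_export(fmt: str) -> None:
    """Instrument an assumption: 'most users export as CSV, not PDF'."""
    usage[f"export.format.{fmt}"] += 1
    logger.info("export requested", extra={"export_format": fmt})

# Simulated traffic; post-release you would read these counts from your
# metrics backend instead.
for fmt in ["csv", "csv", "pdf", "csv"]:
    record_export(fmt)

print(usage["export.format.csv"])  # 3
print(usage["export.format.pdf"])  # 1
```

The point is the post-release review: compare the counts against the assumption you wrote in the ticket, and let the gap drive the next code change.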

AI Assisted Activities

  • ☐ Use AI to analyze user feedback, support tickets, or error logs to identify patterns, but always validate AI insights through direct user observation or usability testing.
  • ☐ Have AI generate questions for user interviews based on your code assumptions, then use those questions in real conversations with users to build genuine empathy.
  • ☐ Use AI to help summarize user research findings related to your features, but ensure you review the summaries and add your own observations from direct user interactions.
  • ☐ Have AI analyze user behavior patterns from your instrumentation, then discuss those patterns with actual users to understand the "why" behind the behavior before making code changes.

Evidence of Progress

  • ☐ You can cite a user interaction that changed an implementation choice.
  • ☐ You added instrumentation and reviewed it post-release.

4) Outcomes Over Outputs

Measure success by user/business impact, not just "done".

💡

Learn More

For more information and deeper understanding of this principle, refer to the 4) Outcomes Over Outputs section in the framework.

Workbook Activities (do now)

  • ☐ Pick one metric you can influence for this feature (e.g., task time, error rate) and add it to the PR.
  • ☐ Write the expected behavior change and when/how you will observe it after release.
  • ☐ Add a "definition of done + outcome" checklist to your PR: signal, measure, rollback trigger.
  • ☐ After release, post a short readout comparing expected vs. observed and propose one follow-up action.
  • ☐ If the outcome missed, log one hypothesis and a code/config tweak you will try next.
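The expected-vs-observed readout can be a one-liner per metric. A sketch; the metric name, target, and tolerance below are placeholders:

```python
def outcome_readout(metric: str, expected: float, observed: float,
                    tolerance: float = 0.005) -> str:
    """Summarize whether an observed metric landed within tolerance of the target."""
    status = "met" if abs(observed - expected) <= tolerance else "missed"
    return f"{metric}: expected {expected:.1%}, observed {observed:.1%} -> {status}"

# Example: we expected checkout error rate to drop to 2%; we observed 2.3%.
print(outcome_readout("checkout.error_rate", expected=0.02, observed=0.023))
# -> checkout.error_rate: expected 2.0%, observed 2.3% -> met
```

Posting this kind of line after release keeps the "expected vs. observed" comparison honest and makes the follow-up action easy to propose.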

AI Assisted Activities

  • ☐ When AI generates code or features, define outcome metrics upfront in your PR and measure whether AI-generated code achieves intended user outcomes, not just technical completion.
  • ☐ Use AI to help analyze outcome data from your instrumentation and identify patterns, but have human team members interpret what those patterns mean for users and the mission.
  • ☐ Have AI help draft outcome definitions and success criteria for your code changes, but ensure the team validates them against real user needs and business goals before merging.
  • ☐ Use AI to track and report on outcome metrics from your code, but schedule human team reviews to discuss what the metrics mean and how to adjust code based on observed impact.

Evidence of Progress

  • ☐ Each shipped PR references an outcome and a follow-up check.
  • ☐ You've adjusted code/config based on observed outcomes, not only bug reports.

5) Domain Knowledge

Understand upstream/downstream and the service ecosystem to make better technical decisions.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 5) Domain Knowledge section in the framework.

Workbook Activities (do now)

  • ☐ Sketch a quick sequence/data-flow for this feature showing upstream/downstream calls, owners, and failure modes.
  • ☐ Create a simple front-stage/back-stage map for the flow and mark where your code supports the user experience.
  • ☐ Meet a domain expert (support/ops/policy) to confirm a constraint that should change your design; note it in the ticket.
  • ☐ Review one past incident tied to this domain and list a guardrail you will add to this change.
  • ☐ Call out an integration or policy constraint in your PR and tag the owner for confirmation.
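A guardrail drawn from an incident review often amounts to a timeout plus a safe fallback around a downstream call. A sketch, assuming a hypothetical internal profile service (the URL and default are placeholders):

```python
# Illustrative guardrail: a downstream dependency once hung and stalled
# the whole request path. Bound the call with a timeout and degrade
# gracefully instead of failing the user-facing flow.
import urllib.request

PROFILE_SERVICE_URL = "https://profile.internal.example/v1/users/"  # placeholder

def fetch_display_name(user_id: str, timeout_s: float = 0.5) -> str:
    """Return the user's display name, or a safe default if the upstream call fails."""
    try:
        with urllib.request.urlopen(PROFILE_SERVICE_URL + user_id,
                                    timeout=timeout_s) as resp:
            return resp.read().decode("utf-8").strip()
    except OSError:
        # Covers URLError, connection failures, and timeouts: the page
        # still renders without the personalized name.
        return "there"
```

Naming the guardrail (and the incident that motivated it) in your PR makes the upstream/downstream risk visible to reviewers.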

AI Assisted Activities

  • ☐ Use AI to help summarize domain documentation, API contracts, or system architecture docs, but validate AI-generated domain knowledge through direct engagement with domain experts and code reviews.
  • ☐ Have AI generate questions about domain constraints or ecosystem relationships for your code, then use those questions in conversations with domain experts to build deep understanding.
  • ☐ Use AI to help draft sequence diagrams or system maps, but ensure team members review them with domain experts to verify accuracy and completeness before implementing.
  • ☐ Have AI analyze past incidents or domain-related bugs, then discuss those insights with the team and domain experts to identify patterns and prevent similar problems in your code.

Evidence of Progress

  • ☐ You can describe how your change affects upstream/downstream systems.
  • ☐ Your design notes/PRs call out domain or policy constraints explicitly.

6) The Art of Storytelling

Explain your work in human terms to align and inspire.

💡

Learn More

For more information and deeper understanding of this principle, refer to the 6) The Art of Storytelling section in the framework.

Workbook Activities (do now)

  • ☐ Dinner-table test: describe this feature in two sentences focused on the user problem and impact.
  • ☐ Write a before/after story for this feature and share it at demo with a user quote or data point.
  • ☐ Prepare two updates: one technical (architecture/trade-offs) and one stakeholder-friendly (impact/next step).
  • ☐ Record a 60-second story (user, pain, change, result) and post it in the team channel.
  • ☐ Add a single slide/snippet that shows the user task improved (e.g., time saved, error avoided) and use it in your next review.

AI Assisted Activities

  • ☐ Use AI to help structure or draft PR descriptions and technical stories, but refine them with real user anecdotes, emotions, and personal observations from direct user interactions.
  • ☐ Have AI generate different versions of code explanations for different audiences (technical peers vs stakeholders), but ensure each version includes authentic human stories about real user impact.
  • ☐ Use AI to help summarize technical work in demos, but lead presentations with human stories about real users, using AI-generated summaries as supporting material.
  • ☐ Have AI help draft code documentation or release notes, but always include real user quotes, data points, or anecdotes that connect your code to human impact.

Evidence of Progress

  • ☐ Teammates reuse your story to explain the value of the feature.
  • ☐ Stakeholders can retell your update without losing the point.