My Process

The process.

Every engagement follows a disciplined sequence — from listening to shipping to repeating. Not a framework that gets handed off. A process I run alongside the team until the outcomes are real.

01
Discovery

Listen first.

Before forming any opinions, I spend time understanding the business. Stakeholder interviews across product, tech, sales, support, and operations. Customer conversations. Wherever the real signal lives. This phase takes roughly 30 days and has one rule: no recommendations until the listening is done.

The headwinds, the gaps, the things quietly going wrong — they surface in conversations, not in dashboards. This is where I earn the right to have an opinion.

Outcome: A clear picture of where value is being created, where it's leaking, and what's actually in the way.
02
Synthesis

Map the opportunity.

Discovery surfaces themes. I organize those themes into a living opportunity map — clusters of related bets tied directly to business outcomes. It's not a finished artifact and it's not meant to be. It moves as we learn. But it gives everyone a shared picture of where to play, in what order, and why.

Each cluster becomes a bucket of work: known issues, opportunities, and hypotheses about what will move the needle. These buckets are the raw material for everything that follows.

Outcome: A thematic opportunity map with prioritized bets broken down into known issues, desired outcomes, and areas to test.
03
Prioritization

Build to learn.

The opportunity map becomes a force-ranked set of hypotheses and experiments. Cross-functional exercises, data deep dives, customer ticket analysis, and competitive benchmarking all feed in. The output isn't a feature list — it's a learning agenda. Prioritized by impact and level of effort, not by what's easiest to build.

Every item on the roadmap has a hypothesis attached to it. If we can't articulate what we expect to happen and why, it doesn't belong on the list yet.

Outcome: A validated, force-ranked roadmap built around experiments aimed at real business outcomes — with hypotheses attached to every bet.
04
Planning

Plan for reality.

Once the roadmap is set, we plan how to actually deliver it. Cross-functional teams align on what's genuinely possible — not what looks good in a presentation. Dependencies get surfaced. Blockers get named. Timelines and release definitions get set before anyone writes a line of code.

The plan is a living document, reshaped as new information arrives. Features move as we learn. The goal is a scope that's honest about constraints and committed to milestones that mean something.

Outcome: An aligned scope and timeline with clear milestones, release definitions, and named owners for every dependency.
05
Measurement

Measure everything.

Every rollout has a metric. I build reporting cadences that keep the full organization — stakeholders and executives included — current on how each experiment is tracking. Dashboards, weekly reports, steering committees, and demos. Transparency is non-negotiable because accountability requires it.

Monitoring isn't passive. It's how we decide what to do next. If the numbers are moving, we understand why. If they're not, we know that too — and we make a decision rather than letting the experiment drift.

Outcome: The full organization knows how every experiment is tracking against its hypothesis, its metric, and its business goal.
06
Decision

Make the call.

When the experiment concludes, I make a data-informed call. Iterate on what we learned and start the validation cycle again. Productionize it into a full rollout with hardened code and incremental expansion. Or kill it, deprecate the feature, and redirect the team's energy to what's next.

All three are valid outcomes. The only bad outcome is letting an inconclusive experiment linger on the roadmap because nobody wanted to make the call.

Outcome: A clear decision on each bet — with the rationale documented, the next action defined, and an owner assigned.
07
Sustained Execution

Do it again.

The process is designed to repeat. Once the machinery is in place — the reporting, the cadences, the prioritization framework — it runs as a quarterly exercise. The heavy lifting of initial setup happens only once. After that, the work is communication, data analysis, and staying honest about what's actually working.

Organizations that compound don't reinvent their process every year. They refine it. This framework is built to get faster, more accurate, and more aligned the longer it runs.

Outcome: A self-sustaining process that compounds over time — generating better decisions, faster, with less friction every cycle.
Want to run this process in your organization?

Every engagement starts the same way: 30 minutes, no deck, just an honest conversation about where you are.

Work with me