Strategy theater is easy to recognize once you've seen it. It arrives as a large deck. It contains a lot of frameworks — 2x2s, maturity models, competitive landscapes built from public information. It includes a set of recommendations that are technically correct and practically ambiguous. And it ends with a slide that says something like "next steps: align stakeholders and begin phase two."
Nobody in the room disagrees with it, because there's nothing specific enough to disagree with. Everyone nods. The consultant invoices. The deck gets filed. Three months later the organization is having the same conversation it was having before the engagement.
I know this pattern because I've watched it play out in organizations I've worked in. And I've been intentional about building a different approach — one that starts from a simple premise: every engagement should end with something real in the world. Not a recommendation that might become a thing. A thing.
What strategy theater actually costs
The invoice is the obvious cost. But the real cost of a theater engagement is what it displaces. The six to twelve weeks spent on a strategy process is six to twelve weeks not spent shipping. The executive attention given to reviewing decks is executive attention not given to making decisions. The organizational alignment built around a framework dissolves the moment the engagement ends and the framework isn't being actively maintained by someone with authority to enforce it.
There's also a subtler cost: learned helplessness. Organizations that go through multiple strategy engagements without seeing real change grow cynical about strategy work in general. When the next genuine strategic need arises, the organization has been trained to expect theater — and that expectation poisons the engagement before it starts.
The measure of a strategy engagement isn't whether the deck is compelling. It's whether the organization behaves differently six months later — and whether that difference is traceable to a specific decision made during the engagement.
What a real engagement looks like
It starts with diagnosis, not frameworks. I want to understand what's actually broken before I propose how to fix it. That means talking to the people doing the work — not just the leadership team. Reading the user feedback. Walking the guest journey myself. Looking at the data before forming an opinion about what it means.
Diagnosis takes time. It's also where most of the value gets created, because the presenting problem is almost never the actual problem. A product with a low App Store rating isn't a rating problem — it's a specific set of failed user jobs that can be identified, prioritized, and fixed. A digital channel that isn't growing isn't a marketing problem — it's usually a conversion problem that lives somewhere in the experience between acquisition and checkout. Naming the right problem is half the work.
Why I work the way I do
I've held the seat. Not as a consultant looking in from outside, but as the person who had to execute against the strategy, manage the cross-functional relationships, and be accountable for the outcome when the feature shipped or didn't. That experience makes me permanently skeptical of recommendations that aren't accompanied by a clear path to execution.
It also means I know which parts of strategy work are genuinely hard and which parts are elaborate avoidance of the hard parts. Naming a priority is easy. Sequencing it against competing priorities, getting engineering alignment, and shipping it inside a franchise system with 2,200 locations — that's the actual work. I've done the actual work. I know what it requires. And I build my engagements around getting to that work as fast as possible.
This approach has a cost. It's less comfortable than strategy theater, because the recommendations are specific and the accountability is real. If I tell you the conversion problem lives in your offer redemption flow and we fix it together, we'll know in two release cycles whether I was right. That's a sharper edge than "consider optimizing the loyalty experience as a Q3 initiative."
I prefer the sharper edge. It's the only way the work means anything.
Every engagement ends with something real in the world. A shipped feature. A fixed conversion rate. A decision made that wasn't made before. If it doesn't, the engagement isn't finished.
If you're looking for someone to validate a direction you've already chosen and produce a deck that makes it look rigorous — I'm probably not the right fit. If you need someone who will tell you what's actually broken, help you make the hard decisions, and stay in the room until the thing ships: let's talk.