A field guide to pretending responsibly

The Seduction of Certainty

Forecasts are guesses with formatting; we just agree, collectively, to believe otherwise.

During the pandemic, I built a company forecast that felt bulletproof. Revenue ramp, hiring plan, committed spend, even a billing model you could toggle between invoice-level detail and pooled AR. In a time when control was an obvious illusion, we convinced ourselves this spreadsheet might be different. But it unraveled as customers froze budgets and stopped working normally, like everyone else.

Time marched forward, and reasonable assumptions met an unreasonable world. Customers paused, timelines slipped, and the tidy logic (each cell feeding the next) became suggestion rather than signal.

Forecasts are polite fictions. Not malicious lies, but carefully structured guesses we agree to believe temporarily. They break gently, almost apologetically, as reality intrudes. Yet they give us language: a way to coordinate effort, signal priorities, and act instead of freezing. The illusion of control isn't useless. It's how we move forward.

What Forecasts Actually Do

A forecast is a story about the future that won't happen. That's not a flaw: forecasts are alignment tools, not prediction tools.

They excel at two things: coordinating beliefs about what's likely, possible, and important; and creating permission structures for decisions, the kind that determine what gets funded, who gets hired, and where bets are placed. This is their real value. Not accuracy, but alignment.

An obsession with precision misses what forecasts actually do. Teams agonize over getting the numbers exactly right (47% growth or 52%?) when they're optimizing for the wrong thing entirely. What matters is whether the exercise moved people in the same direction and whether the decisions it enabled made sense when the future arrived differently than planned.

The Theater of Numbers

Forecasts don't predict the future. They perform it.

Decimals signal confidence, trendlines suggest inevitability, charts pretend the world holds still long enough to measure. But the creeping doubt lingers anyway, the sense that something fundamental shifted since you started building this thing. The precision is theater. Everyone knows the foundation is shaky yet acts like it's solid ground.

Behind every spreadsheet lurks something messier: narrative instinct, gut feeling, politics. The curve's shape often reflects what someone wants to be true more than what's likely to happen. But we participate in the performance because forecasts are useful, not because they're true.

Inside companies, they justify resource allocation and strategic bets. Outside, they reassure boards, investors, and partners who need uncertainty translated into cold numbers. The CEO presents with conviction, finance defends with logic, and teams plan with purpose.

The real work happens in the negotiation: which version of the future can everyone tolerate enough to plan around, even while disagreeing? Forecasts give teams just enough shared ground to make real decisions, even when everyone suspects the numbers won't hold.

The Danger of Drinking Your Own Kool-Aid

The real risk isn't deceiving others. It's believing the fiction yourself.

It begins innocently enough: you present the model, explain assumptions, acknowledge risks with all the right caveats. But repetition calcifies fiction into fact. You've walked through the numbers so many times that the spreadsheet becomes reality, and the turning point only comes when someone asks "What if we're wrong?" and the room goes quiet. The danger arrives when "the model says" becomes more important than "the customer says."

Then hiring expands, roadmaps stretch, and bets multiply based on the model rather than what you’re actually seeing. You miss the moment when course correction was still cheap because admitting the forecast was wrong would mean questioning every subsequent decision. Inertia takes over.

The forecast hardens into dogma. Acknowledging drift feels like failure, so you keep walking toward the cliff while consulting the chart that promised flat ground.

That's how teams walk off cliffs while reading maps. Smart teams set the model aside when it stops matching the terrain and recheck the ground. Those who don't risk catastrophe while still citing supporting data.

Forecasting Responsibly

A forecast is an orientation tool. Nothing more. Not a prediction or promise, just a structured way to say: "If the world looks roughly like this, here's what we'll do."

Useful forecasts acknowledge uncertainty explicitly. They're built as ranges, not points: not 47% growth, but 40-55% depending on retention, hiring, and conversion. Each number carries conditions, not confidence. Levers, not certainties. Small moves in the inputs separate base hits from home runs, the same way inches on the bat do.
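
A minimal sketch of what that looks like in practice, with invented levers, bands, and a toy growth formula (none of it taken from the essay's actual model): evaluate the same formula at the corners of each lever's band and report the spread instead of a point.

    # Illustrative only: hypothetical levers, each carrying a pessimistic/optimistic
    # band instead of a single point estimate.
    from itertools import product

    LEVERS = {
        "net_retention": (1.10, 1.20),  # revenue kept and expanded from existing customers
        "new_business":  (0.25, 0.40),  # potential growth from new customers
        "conversion":    (0.80, 0.95),  # fraction of that pipeline that actually closes
    }

    def growth(net_retention, new_business, conversion):
        # Toy formula: retained base plus the new business that actually converts.
        return net_retention + new_business * conversion - 1.0

    # Evaluate every corner of the lever space to get a band, not a number.
    outcomes = [growth(r, n, c) for r, n, c in product(*LEVERS.values())]
    print(f"Growth somewhere between {min(outcomes):.0%} and {max(outcomes):.0%}")
    # -> Growth somewhere between 30% and 58%

The arithmetic is trivial on purpose; the point is that the output is a band whose width traces back to named levers, so a number can be challenged by challenging the lever behind it.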

Strong teams version their plans: stretch case, base case, worst case, and the one quietly labeled "Oh No." This isn't pessimism. It's preparation. The goal is knowing what triggers each path and naming it before you're forced to choose.
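
A sketch of what naming the trigger can look like, with made-up scenario names, metrics, and thresholds: attach an observable condition to each version of the plan so the switch is decided in advance rather than argued about mid-crisis.

    # Illustrative only: hypothetical scenarios, metrics, and trigger thresholds.
    from dataclasses import dataclass
    from typing import Callable, Dict

    Metrics = Dict[str, float]  # e.g. {"pipeline_coverage": 2.1, "churn": 0.03}

    @dataclass
    class Scenario:
        name: str
        planned_growth: float               # the growth this version of the plan assumes
        trigger: Callable[[Metrics], bool]  # observable condition that activates it

    # Ordered worst-first: the first trigger that fires decides which plan you're on.
    SCENARIOS = [
        Scenario("Oh No",   -0.10, lambda m: m["churn"] > 0.08),
        Scenario("Worst",    0.10, lambda m: m["pipeline_coverage"] < 1.5),
        Scenario("Base",     0.30, lambda m: m["pipeline_coverage"] < 3.0),
        Scenario("Stretch",  0.50, lambda m: True),  # everything better is upside
    ]

    def current_scenario(metrics: Metrics) -> Scenario:
        # Pick the plan whose trigger matches what you're actually seeing.
        return next(s for s in SCENARIOS if s.trigger(metrics))

    print(current_scenario({"pipeline_coverage": 2.1, "churn": 0.03}).name)  # -> Base

Writing the trigger down is the point: when the metric crosses the line, the argument has already been had.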

Forecasts go stale without feedback. Regular check-ins matter as much as initial assumptions. Mid-quarter, something always changes: market shifts, customer losses, failed hires. A steady update cadence keeps forecasts alive; neglect turns them into stories nobody bothered to revise.

The key is disciplined disbelief. Trusting just enough to move, never enough to sleep.

The Right Kind of Lie

Forecasts coordinate teams, create permission structures, and help tell stories about what's possible. Those stories don't need perfection, just shared understanding.

Building forecasts isn't hard. What matters is what follows: how often you check, how honestly you revise, how quickly you respond when cracks appear. Too many teams mistake motion for progress, building the plan once and then running without looking up.

A forecast's value isn't accuracy. It's knowing when you're wrong and adjusting before the cost becomes unbearable.

The tension never resolves cleanly. It’s permanent. You'll always be building models you don't fully believe, presenting numbers with more confidence than they deserve, and making real decisions based on careful guesses. Discomfort doesn’t go away. The work is learning to walk through it, unbothered.

The best forecasting teams develop a kind of double vision: serious enough about the model to act on it, skeptical enough to question it constantly. They plan as if the forecast is right and prepare as if it's wrong. It’s the only honest way to navigate uncertainty without breaking under contradiction.

You already know the forecast is wrong. So what are you using it for — and is it helping you?

Accuracy is optional. Adjustment is not.

Pig Island, Exuma, Bahamas