The Spreadsheet Error That Almost Ended a Startup (And What It Teaches About Data)
Adam Biddlecombe, Founder of Mindstream / Head of Brand at HubSpot
It was February 2024. Mindstream had just acquired AutoGPT, a rival newsletter with 170,000 subscribers. The deal was audacious—they traded 30% equity when Mindstream was still under 10,000 subscribers. Co-founder Matt had taken a huge leap to join the newly merged company. Everything felt precarious and urgent.
So they built the spreadsheet.
This wasn’t a dashboard. It was the master brain of the business—revenue, subscribers, churn rates, acquisition costs, all modeled out month by month. They spent hours and hours on it, running scenarios, trying to understand if the merged entity could work.
The number at the end was catastrophic.
“The number at the end was just telling us that basically the business was eating itself. It was just destined to die,” Adam recalls. “It was really dark.”
They’d just persuaded Matt to leave his company and join them. They’d just bought his newsletter. And now the data—their own careful analysis—was telling them they’d made a terrible mistake. The math didn’t work. The business wasn’t viable. Everything was broken.
The Darkest Day
What happened next was human. They got food. They had a few too many drinks. They sat with the weight of the data they’d created, the data that felt like destiny.
The next day, with a hangover and fresh eyes, they pulled the spreadsheet up again.
The formulas were wrong.
Not the data. Not the business model. Not the strategy. The formulas. They’d built the entire narrative of failure on faulty Excel logic. When they fixed the equations, “everything that was red went green.”
“It was just the biggest sense of euphoria I’ve ever felt. It was absolutely wild how good we felt,” Adam said. Two months after that dark moment of believing everything was over, they started talking to HubSpot about an acquisition.
What the Spreadsheet Error Actually Teaches
The obvious lesson is “check your formulas.” But the real lesson is more subtle, and it’s about how founders fall in love with data narratives.
The moment that spreadsheet showed the business was failing, everyone accepted it. Not because they verified the logic—they accepted it because it confirmed their anxiety. They’d made a bold move (trading 30% equity). They’d asked someone to take a risk. The data was permission to panic.
Data has authority. When the numbers show red, it feels like truth. When the spreadsheet says you’re doomed, it feels like destiny, not calculation.
But data is only as good as the assumptions feeding it. And when you’re running a formula through a complex sheet with dozens of cells and dependencies, the chance of a formula error compounds. One bad formula early can make the entire output meaningless.
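A toy projection makes the compounding concrete. All numbers here are hypothetical, not Mindstream’s: one wrong formula early in a dependent chain corrupts every downstream cell, and a business with healthy unit economics can look like it is dying.

```python
def project(subs, growth_rate, churn_rate, months, buggy=False):
    """Project subscriber counts month by month."""
    history = [subs]
    for _ in range(months):
        gained = subs * growth_rate
        lost = subs * churn_rate
        if buggy:
            # The "bad formula": churn is double-counted, applied once to the
            # base and again to the grown total. Every later month inherits it.
            subs = subs + gained - lost - (subs + gained) * churn_rate
        else:
            subs = subs + gained - lost
        history.append(round(subs))
    return history

# Same inputs, one formula changed.
correct = project(10_000, growth_rate=0.10, churn_rate=0.06, months=12)
buggy = project(10_000, growth_rate=0.10, churn_rate=0.06, months=12, buggy=True)

print(correct[-1])  # grows month over month (~+4% net per month)
print(buggy[-1])    # the same business appears to be shrinking
```

With a 10% growth rate and 6% churn, the honest formula compounds upward; the buggy one compounds downward. Nothing about the business changed, only one line of spreadsheet logic.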
The risk is that you make a major decision (shutting down, pivoting, giving up) based on bad data—and by the time you realize the data was wrong, you’ve already destroyed something real.
The Emotional Math of Bad Data
What’s interesting is that the team didn’t immediately spring into action. They didn’t panic-sell or shut down the company that night. They got drunk. They sat with it. They lived with the dark story for a full night.
By the next morning, something had shifted. Maybe it was perspective. Maybe it was the break from the spreadsheet. But they had the presence of mind to check the formulas instead of immediately believing the narrative.
Adam mentions another principle he learned from Shaan Puri: when things feel like disasters, they rarely are. And when things feel like breakthroughs, they often aren’t.
“When you think everything’s screwed, it’s not as bad as you think it is. And when you think everything’s great, it’s not as good as you think it is.”
Shaan keeps a Slack channel at his company where people log extreme moments—disasters and wins. At the end of every month, they review them. Almost every crisis from the beginning of the month looks overstated. Almost every major win from week one looks less significant by month-end. Reality has a way of averaging back to the middle.
How to Build Better Data Habits
This doesn’t mean ignoring data. It means understanding that a spreadsheet is not the same as reality. It’s a model of reality, built on assumptions. If the assumptions are wrong, the output is fiction.
For Mindstream, the lesson became: don’t rely on a single model to tell you if the business works. Use the spreadsheet to ask questions, not to answer them.
Are we growing? Yes (separate, validated metric).
Are we profitable? No, but growing (design choice, acceptable).
Is churn sustainable? Unknown (requires more data).
Do customers value us? Yes (proxy: sponsor inbound, team stability).
Each of these is backed by different data sources. No single spreadsheet decides the fate of the company.
The other lesson is psychological: if your data is telling you something shocking, don’t decide anything that day. Let it sit. Check the formulas. Ask a co-founder to review it. Sleep on it. The spreadsheet will still be there, and so will the real business underneath it.
The Broader Pattern: Data as Anxiety Management
Founders love spreadsheets because they create the illusion of control. If you can model the business, you understand it. If you understand it, you can predict it. If you can predict it, you can control it.
But the spreadsheet is a tool, not an oracle. Adam’s experience is common: founders build complex models, the model shows something scary, and they immediately believe it. But the scary thing is often just a formula error or a faulty assumption (like churn modeling that doesn’t account for network effects, or acquisition costs that include one-time expenses).
The best founder instinct isn’t to trust the spreadsheet. It’s to understand the spreadsheet as a question-asking tool, not a truth-telling tool. Ask: “What would have to be true for this model to be correct?” and “Do those assumptions match reality?”
FAQ
How do you know if a spreadsheet error is a red flag for other problems?
Sometimes it is. If your formulas are sloppy, maybe your thinking is sloppy too. But more often, a formula error is just a formula error—a misplaced parenthesis, a copy-paste mistake, or a misaligned row. Review the logic rigorously, but don’t let one error make you doubt your entire strategy. Mindstream’s core model (email + sponsorships + owned audience) was sound; the implementation had a bug.
Should you rebuild complex spreadsheets regularly to catch errors?
Yes, periodically. But more importantly, build models so a friend can understand them. If you can’t explain your spreadsheet’s logic to someone else, it’s too complex. Adam’s Mindstream model tracked clear metrics: subscriber counts, churn, acquisition costs. Those are straightforward. Complex nested formulas are where errors hide.
What metrics should founders actually track instead of relying on spreadsheets?
Track the inputs you control, not the outputs you don’t. Track: content quality (measured by engagement), audience growth rate (measured by signups), sponsor quality (measured by repeat sponsors), and cash position (actual bank account). Don’t track “eventual valuation” or “predicted profitability in month 18”—those require too many assumptions.
How do you validate a spreadsheet model before making a decision?
Back-test it against historical data. If your model predicts what actually happened last month, you have some confidence. If the model can’t explain the past, it can’t predict the future. Second, have someone else review the formulas independently. They’ll spot errors faster because they don’t have the same blind spots you have.
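The back-test described above can be sketched in a few lines. The history and the 9% growth assumption here are hypothetical placeholders: run the model’s formula over past inputs and measure how far its predictions land from what actually happened.

```python
# Hypothetical subscriber history for the last five months.
actual_subs = [10_000, 10_900, 11_700, 12_800, 13_900]

def model_next_month(subs, growth_rate=0.09):
    # The model's core assumption: ~9% net monthly growth.
    return subs * (1 + growth_rate)

# Score each month's prediction against the month that actually followed.
errors = []
for prev, actual in zip(actual_subs, actual_subs[1:]):
    predicted = model_next_month(prev)
    errors.append(abs(predicted - actual) / actual)

mean_error = sum(errors) / len(errors)
print(f"mean back-test error: {mean_error:.1%}")
```

If the model can’t reproduce last quarter to within a few percent, its month-18 forecast deserves no weight in a major decision.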
Should you make major decisions when you’re anxious about data?
No. The moment a spreadsheet makes you want to panic-decide, that’s a signal to pause. Talk to a co-founder or advisor. Sleep on it. Come back with fresh eyes. Most of Adam’s worst instincts came from reacting to data immediately. The best insights came after time and perspective.
What’s the difference between a red flag in your data and a sign you need better data?
A red flag is a pattern that’s consistent and repeatable. A sign you need better data is when something looks wrong but doesn’t match reality. Mindstream’s spreadsheet looked wrong (business is failing), but the reality was thriving (sponsors were inbound, subscribers were growing). The issue wasn’t the business—it was the model.
How do you avoid falling in love with a narrative that the data supports?
Make predictions and test them. If your spreadsheet predicts churn will be 15% next month, check. If it’s actually 8%, something in your model is wrong. Keep testing predictions against reality. The spreadsheet that’s right most of the time is worth keeping. The spreadsheet that’s often wrong gives false confidence.
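The habit above can be as simple as a running log. This is a sketch with made-up figures (including the 15%-predicted vs. 8%-actual churn example): record each prediction, score it once the real number arrives, and flag the model when the miss is large.

```python
predictions = []  # each entry: (metric, predicted, actual)

def log_prediction(metric, predicted, actual):
    """Record a model prediction alongside the observed outcome."""
    predictions.append((metric, predicted, actual))

# Hypothetical month-end review entries.
log_prediction("churn", predicted=0.15, actual=0.08)
log_prediction("signup growth", predicted=0.10, actual=0.11)

# Flag any metric where the model missed reality by more than 25%.
for metric, predicted, actual in predictions:
    miss = abs(predicted - actual) / actual
    verdict = "REVISIT MODEL" if miss > 0.25 else "ok"
    print(f"{metric}: predicted {predicted:.0%}, actual {actual:.0%} -> {verdict}")
```

The 25% threshold is arbitrary; the point is that the comparison happens on a schedule, not only when the spreadsheet shows something scary.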
When should you trust your gut over the spreadsheet?
When the spreadsheet contradicts what you’re observing in reality. Mindstream’s sponsors were inbound (reality). The spreadsheet said the business was dying (model). Trust reality. Use the spreadsheet to understand why reality differs from the model. Usually, the model is wrong, not the reality.
What’s the role of stress and emotion in misreading data?
Huge. Under stress, you look for certainty. Data feels certain. So you believe it more. Adam’s team believed the spreadsheet was destiny because they were anxious about the 30% equity trade. In a calmer moment, they might have reviewed the formulas before panicking. Recognize that stress makes you trust data more and question it less—which is backwards.
Should founders build their own spreadsheets or hire analysts?
Build your own early. You’ll understand the logic better and catch errors faster. As the company scales, hire someone, but make sure you still understand the model conceptually. The worst situation is outsourcing analysis to someone who also doesn’t understand the business well enough to catch formula errors.
Full episode coming soon
This conversation with Adam Biddlecombe is on its way. Check out other episodes in the meantime.