The biggest satisficing trap AI breaks isn't what you think
Matt Shumer is right that AI is about to change everything. But the real transformation isn't about job loss. It's about discovering that most of the limits you've been working within were never real.
Matt Shumer recently published an article called “Something big is happening in AI — and most people will be blindsided.” If you haven’t read it, you should. He’s a startup founder and investor, and his description of what current AI models can do is accurate. When he says the ground is shifting under knowledge work, he’s not exaggerating.
But his article misses something important. Not the facts — he has those right. The most important thing happening with AI isn’t what most people are focused on.
The Conversation Everyone Is Having
Shumer’s article, like most AI commentary right now, centers on a question: What will AI take away? Jobs, roles, career paths, entire industries. The tone is urgent and the evidence is real. He describes telling AI what to build, walking away for four hours, and coming back to finished work. I’ve had the same experience. It’s not hypothetical.
The natural response to that story is anxiety. If AI can do what I do, what happens to me?
It’s the wrong question.
The Question Almost Nobody Is Asking
Here’s what’s actually happening.
For my entire career, I’ve carried around a set of assumptions about what one person can do. What’s realistic for a small business. What requires hiring someone. What requires buying a SaaS subscription. What requires a team. These assumptions felt like facts. They were built from decades of experience, and they were mostly right — given the tools that existed at the time.
AI coding agents didn’t just give me a faster way to work. They dissolved the assumptions themselves.
I’ll give you a concrete example. For years, I paid for a SaaS email service. It never occurred to me that I could build and run my own email archive. That wasn’t a decision I made after careful analysis. It was an assumption I never examined. Of course you don’t build your own email infrastructure. That’s what services are for.
Then one morning I asked Claude Code to build me an email archive system. It took a few hours. It works. It runs. I now have a capability I never thought I could own. And the moment I had it, I realized the limitation had never been technical. It had been in my thinking.
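To make that concrete: the sketch below is not my actual system, just a minimal illustration of what "an email archive you own" can look like, assuming IMAP access to your mailbox and a local SQLite file for storage. The server address, credentials, and table layout are placeholders, not a prescription.

```python
# Minimal email archive sketch: pull messages over IMAP, store them in a local SQLite file.
# The host, login, and mailbox name are placeholders; adjust them for your own account.
import email
import imaplib
import sqlite3
from email.header import decode_header, make_header

DB_PATH = "email_archive.db"
IMAP_HOST = "imap.example.com"    # placeholder
IMAP_USER = "you@example.com"     # placeholder
IMAP_PASS = "app-password-here"   # placeholder


def init_db(path: str) -> sqlite3.Connection:
    """Create the archive database if it doesn't exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               uid TEXT PRIMARY KEY,
               sender TEXT,
               subject TEXT,
               date TEXT,
               raw BLOB
           )"""
    )
    return conn


def archive_mailbox(conn: sqlite3.Connection, mailbox: str = "INBOX") -> int:
    """Fetch every message in the mailbox and store any we haven't archived yet."""
    imap = imaplib.IMAP4_SSL(IMAP_HOST)
    imap.login(IMAP_USER, IMAP_PASS)
    imap.select(mailbox, readonly=True)

    _, data = imap.uid("search", None, "ALL")
    new_count = 0
    for uid in data[0].split():
        uid = uid.decode()
        if conn.execute("SELECT 1 FROM messages WHERE uid = ?", (uid,)).fetchone():
            continue  # already archived on a previous run
        _, msg_data = imap.uid("fetch", uid, "(RFC822)")
        raw = msg_data[0][1]
        msg = email.message_from_bytes(raw)
        conn.execute(
            "INSERT INTO messages (uid, sender, subject, date, raw) VALUES (?, ?, ?, ?, ?)",
            (
                uid,
                str(make_header(decode_header(msg.get("From", "")))),
                str(make_header(decode_header(msg.get("Subject", "")))),
                msg.get("Date", ""),
                raw,
            ),
        )
        new_count += 1
    conn.commit()
    imap.logout()
    return new_count


if __name__ == "__main__":
    conn = init_db(DB_PATH)
    print(f"Archived {archive_mailbox(conn)} new messages.")
```

Run something like this on a schedule you already use, and you have a complete, queryable copy of your mail sitting on hardware you control.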
That same pattern has repeated itself over and over. Quarterly tax filings I assumed required tedious manual work — now automated with playbooks. Website analytics I assumed required an SEO consultant — now a daily brief generated automatically. Data trapped in SaaS exports I assumed was inaccessible — now imported with tools built in fifteen minutes. Each time, the real barrier wasn’t skill or resources. It was an unexamined belief about what was possible.
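The SaaS-export case is a good example of how small these tools can be. Here's a rough sketch of the kind of fifteen-minute importer I mean, assuming the export is a CSV file and SQLite is the destination; the file name, table name, and column handling are illustrative, not taken from any particular vendor.

```python
# Load a SaaS CSV export into a local SQLite table so the data is queryable again.
# The default database and table names are illustrative; match them to your own export.
import csv
import sqlite3
import sys


def import_export(csv_path: str, db_path: str = "exports.db", table: str = "export") -> int:
    """Read every row of the CSV and insert it into a table whose columns match the header."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        columns = reader.fieldnames or []
        rows = list(reader)

    conn = sqlite3.connect(db_path)
    col_defs = ", ".join(f'"{c}" TEXT' for c in columns)
    conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({col_defs})')
    placeholders = ", ".join("?" for _ in columns)
    conn.executemany(
        f'INSERT INTO "{table}" VALUES ({placeholders})',
        ([row[c] for c in columns] for row in rows),
    )
    conn.commit()
    conn.close()
    return len(rows)


if __name__ == "__main__":
    count = import_export(sys.argv[1])
    print(f"Imported {count} rows.")
```

Once the data is in SQLite, it stops being trapped: you can query it, join it against other exports, and feed it into whatever reporting you want.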
Why This Matters More Than Job Loss
Shumer frames AI primarily as a threat to jobs, and secondarily as an opportunity. That framing is backwards.
The threat is real. But the much bigger story is this: most of the constraints that professionals and businesses operate under are self-imposed. They’re assumptions inherited from a world where building things was expensive, slow, and required specialized teams. In that world, it made perfect sense to buy rather than build, to outsource rather than own, to accept limitations rather than challenge them.
That world ended. Most people just haven’t noticed yet.
When Shumer tells you to “spend one hour a day experimenting with AI,” he’s right. But he doesn’t go far enough. An hour a day isn’t going to cut it. Don’t just experiment. Pick a real problem in your business or your work — something you’ve always assumed requires a vendor, a specialist, or a team — and try to solve it yourself with an AI coding agent. Not as an exercise. For real.
You will probably succeed. And when you do, pay attention to the feeling. It won’t just be satisfaction at having solved the problem. It will be the disorienting realization that you could have done this all along — if you’d known to question the assumption.
If you do this and get stuck, reach out. I’ll pick up the phone and talk with you.
Experience Is an Asset, Not a Liability
There’s an undercurrent in Shumer’s piece, and in a lot of AI commentary, that seniority is a disadvantage. The young and the adaptable will thrive. The experienced will be left behind.
This is wrong.
I’m 66. I’ve been writing software for over thirty years. And I’m more capable today than I was five years ago. Not because I’m smarter. Because AI coding agents multiply the value of something I’ve spent decades developing: the ability to think in systems, to see how pieces connect, to describe precisely what needs to happen and evaluate whether it worked.
The ability to clearly describe a problem and evaluate a solution isn’t something AI replaces. It’s the skill AI amplifies most. People with deep experience — people who understand how systems actually work, who’ve seen what happens when things fail, who know what good looks like — are the people best positioned to direct these tools effectively.
The young Stanford grad and the 66-year-old systems engineer are both going to use AI. But the person who’s been solving complex problems for decades brings context, judgment, and pattern recognition that take years to develop. AI doesn’t make that less valuable. It makes it more valuable, because now that judgment can be applied at a speed and scale that wasn’t previously possible.
Urgency, Not Panic
Shumer opens his article with a Covid analogy. He wants people to take this seriously, and they should. But the analogy does real harm.
Covid was something that happened to people. You couldn’t outrun it. You couldn’t build your way through it. All you could do was hunker down and wait.
AI is the opposite. It’s something you can work with. The people who engage with it actively, who use it to build capabilities and solve real problems, aren’t just surviving the disruption. They’re using it to become more capable, more independent, and more resilient.
The right posture isn’t anxiety. It’s agency.
Help Me Check My Thinking
I’m one person running a small business. My experience won’t generalize to every situation. Shumer is sounding an alarm for hundreds of millions of knowledge workers, and for many of them, the disruption will be painful and disorienting.
But the single most important thing AI does is not replace human work. It reveals that many of the limits we’ve been living within were assumptions, not facts. Once you see that, the world looks very different. Not scarier. Bigger.
I’d like to know: does this match your experience? Am I seeing something real, or am I too close to my own situation to see clearly? If you have a reaction — agreement, disagreement, a story of your own — I’d welcome the conversation.
You can reach me at common-sense.com/contact.