Sometimes in software, it feels like the hardest part is separating signal from noise. Headlines claim “AI is the new electricity,” while others warn of a crash that will rival the dot-com bust. At Dellecod, we’ve been watching this space closely — not with a trader’s mindset, but with an operator’s eye. And conversations like the recent one between Gavin Baker and David George offer a much-needed anchor in an otherwise dizzying market.
The core question they tackle — are we in an AI bubble? — is worth asking, especially for those of us building in this space. But Baker’s answer is refreshingly grounded: no, this isn’t like 2000. And more importantly, here’s why.
Back then, the vast majority of the fiber laid during the telecom boom went unused. Companies overbuilt with no clear path to utility or monetization. “Dark fiber” became the symbol of overhyped ambition.
Today’s AI infrastructure tells a different story. Demand isn’t speculative — it’s screamingly real. GPUs are running day and night, often to the edge of thermal limits. The phrase “no dark GPUs” might sound like a quip, but it’s revealing. It captures how this wave is grounded in usage, not hype. Token throughput is scaling up at rates that most of us couldn’t have imagined even a year ago — Google reported a 150x jump in token processing over just 17 months. That’s not a science project. That’s a live user base.
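For a sense of what that trajectory implies, here’s a quick back-of-envelope sketch in Python. The 150x and 17-month figures are the ones cited above; the assumption of smooth, steady compounding is ours, purely for illustration.

```python
import math

# Figures cited above: token processing grew roughly 150x over about 17 months.
growth_factor = 150
months = 17

# Implied compound month-over-month growth rate, assuming smooth exponential growth
# (a simplification: real usage growth is almost certainly lumpier than this).
monthly_growth = growth_factor ** (1 / months) - 1
doubling_time_months = math.log(2) / math.log(1 + monthly_growth)

print(f"Implied month-over-month growth: {monthly_growth:.0%}")      # ~34%
print(f"Implied doubling time: {doubling_time_months:.1f} months")   # ~2.4 months
```

Roughly a doubling every couple of months, sustained for well over a year. Whatever you think of the valuations, that is not the usage pattern of idle capacity.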
What’s perhaps most striking about this cycle is the discipline of the capital behind it. Take the estimated $3–4 trillion coming into data center infrastructure over the next five years. That’s not venture froth — it’s strategic reinvestment from companies generating hundreds of billions in free cash flow annually. These are well-capitalized incumbents like Google, Amazon, and Meta, who aren’t betting for bragging rights. They’re shoring up technical moats that they’ll defend with massive war chests.
Yes, there are pockets of questionable behavior. Some AI startups are effectively funded just to buy chips from their investors — the so-called round-tripping problem. But scale, not scheme, seems to be the defining trend. And unlike in 2000, the tech giants funding these AI plays are already sustaining double-digit ROIC on those investments, a far cry from the “clicks and eyeballs” logic of the dot-com era.
There’s also an important shift in how we understand software margins. AI SaaS products tend to have lower gross margins than traditional SaaS, largely due to compute intensity. At first glance, that looks like weakness. But as Baker and George point out, lower margins in this context often signal real product usage. Models are working hard. APIs are being called millions of times per second. We shouldn’t shy away from that.
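To make that concrete, here’s a toy comparison. None of these numbers come from the conversation; the per-dollar cost splits are hypothetical, chosen only to show how inference compute drags on gross margin.

```python
# Toy gross-margin comparison; every number below is hypothetical, for illustration only.
def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - cogs) / revenue

# Traditional SaaS: hosting and support take a small slice of each revenue dollar.
classic_saas = gross_margin(revenue=1.00, cogs=0.20)    # ~80% gross margin

# AI SaaS: every request burns GPU time, so compute takes a much bigger slice.
ai_saas = gross_margin(revenue=1.00, cogs=0.45)         # ~55% gross margin

print(f"Traditional SaaS gross margin: {classic_saas:.0%}")
print(f"AI SaaS gross margin:          {ai_saas:.0%}")
# The lower AI margin isn't necessarily waste: the extra cost of goods sold is
# inference compute, which is the product actually being used.
```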
It also fits with what we see at Dellecod. There's a certain honesty in usage-based wear. Not all bits are elegant. Sometimes the highest value comes from the code that gets hit the most, not the code that wins design awards. And in AI, performance often rides the curve of infrastructure — compute, storage, utilization — not just elegance of abstraction.
Hardware players like NVIDIA are seeing enormous returns, not only because they made the best chips, but because they built a sticky, vertically integrated stack all the way through to software. Google’s TPUs are a real threat here, especially now that they’re being folded into an increasingly polished product ecosystem. AWS and Microsoft aren’t sitting still either, each leveraging decades of infrastructure learnings.
But this isn’t just an infrastructure or capex story. The consumer layer is quietly transforming. It’s easy to project that personal AIs — think xAI’s Grok or Google’s Gemini — could soon act as intermediaries for everything from search to scheduling. The old affiliate model may find new life as these agents monetize through outcomes, not impressions. If a bot books your trip, its cut will come from the completion, not the click. That changes how we think about user experience, and it upends long-standing advertising assumptions.
We were particularly struck by the emphasis on robotics near the end of the conversation. For years, the path from virtual AI to physical automation felt distant — always “a few years out.” Now, Tesla’s humanoid concept and the rise of general-purpose robotics seem less like moonshots and more like inevitabilities. If humanoid form factors can meaningfully capture labor tasks — and the economics start to make sense — whole categories of physical work may get rewritten.
Of course, timelines are slippery. Prediction is always part data, part belief. But when someone like Andrej Karpathy calls AGI within ten years a conservative estimate, it’s worth listening. And even if we fall short of AGI, the compound effect of scaling, reinforcement learning, and personalization will bring major shifts sooner than most people are prepared for.
What we appreciated most in Baker and George’s take is the clear sense that risk and opportunity are in healthy balance. Valuations aren’t cheap, but they’re far from the madness of 2000 — especially when you factor in earning power and operational efficiency. There’s real heat under the hood here.
The investments being made — trillions into energy, compute, and infrastructure — are long-dated bets. You don’t build like this unless you think the future needs it. And right now, all signs point to an AI-powered world that’s not only plausible, but already landing.
For us, that’s both exciting and sobering. The promise is real. But so is the responsibility — to design and deploy thoughtfully, to ignore hype cycles, and to stay focused on what’s actually useful to people.
Because if the last AI cycle was about lab demos, this one’s about delivery. And in delivery, fundamentals matter. Compute matters. Cash flows matter. Users matter.