Our team at Dellecod has been thinking a lot lately about how the best consumer products are built — not just by clever engineering, but by aligning deeply with compounding forces that shape the evolution of technology.
Listening to Chris Dixon recently talk about exponential forces and their centrality in startups resonated with that ongoing theme. In his framing, Moore’s Law, composability, and network effects aren’t buzzwords; they’re the gravitational pull behind most meaningful modern breakthroughs. The difference between a one-off tool and a generational platform often comes down to whether you surfed one of these waves or paddled endlessly in still water.
Moore’s Law is, of course, the engine. It has been quietly working in the background for decades, roughly doubling transistor density every couple of years. AI only feels magical because there are 50 years of accumulated progress under the hood. But Moore’s Law is just the raw horsepower. For that power to translate into something durable or socially valuable, you need the two other elements Dixon outlines: reusable building blocks and flywheel-style growth loops.
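To make the compounding concrete, here is a back-of-the-envelope sketch. It assumes an idealized clean doubling every two years, which real hardware progress only approximates; the function name and parameters are illustrative, not from any source:

```python
# Rough illustration of Moore's-Law-style compounding.
# Assumes an idealized doubling every 2 years; real progress is bumpier.
def capability_multiple(years: float, doubling_period: float = 2.0) -> float:
    """Return the compounded capability multiple after `years`."""
    return 2 ** (years / doubling_period)

for years in (10, 20, 50):
    print(f"{years} years -> ~{capability_multiple(years):,.0f}x")
```

Under that assumption, 50 years of doubling works out to 2^25, roughly a 33-million-fold gain, which is why hardware progress that feels slow year to year reads as magic across decades.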
Composability is underrated here. Open source software — from Linux to PyTorch — doesn’t just lower costs. It rewires incentives. If you can remix or build on others’ work, your product cycle accelerates. Innovation becomes more of a group sport. The future starts to look modular, like Lego bricks instead of marble statues. As Dixon points out, small teams aren’t just surviving, they’re thriving — building $100M businesses with fewer than ten people. Almost every exciting new tool we see today, especially in AI, is standing on a scaffold of prior knowledge and open protocols.
Then there are network effects, which are harder to engineer but massively powerful when they click. The shape of success in consumer products hasn’t changed much since the early days of Facebook — grow the network, align incentives, build stickiness. Still, we’ve only scratched the surface of applying those patterns to today’s tools. As Dixon says, most AI products right now are still in single-player mode.
Take image generators or AI writing assistants. They’re useful, sure. But they don’t yet have the compounding advantages of a scaled social network. You don’t stick around because your friends are there, or because your history creates increasing personalization, or because the community enriches the output. You come for the tool — and leave after the task.
We keep asking ourselves: what does the AI-native network look like? Not a copy of Twitter or Facebook with an LLM bolted on — but something designed around collaboration with AI from the start. Maybe it involves persistent agents. Maybe it’s shared memory. Maybe it’s interfaces that feel more ambient than prompted. But the point is: skeuomorphism won’t get us there. Like Jobs applying leather stitching to early iOS — these metaphors are bridges, not destinations.
The idea of skeuomorphic vs. native experiences is quietly profound. In every technological shift, we seem to start by making new things look like old things. Netflix mailed DVDs. The first websites looked like magazines. The earliest mobile apps had tiny skeuomorphic buttons or page-curl flourishes in their corners. Eventually, people get comfortable enough for native forms to emerge. That’s what’s missing in AI today. We’re typing at supercomputers like they’re command lines. That will change.
Related to this is the importance of timing. One thing the conversation made clear is how tightly timing is tied to platform shifts. Instagram took off not just because it was photogenic, but because mobile camera quality and bandwidth had reached a tipping point. Figma thrived because browser compute became viable for collaborative design. What looks like a clever UI or a novel business model is often just being in the right place when an enabling tech quietly becomes good enough.
It’s also a reminder of the barbell model we’re starting to see. On one end are companies raising billions to train frontier models. That will likely persist. But on the other end, small teams are crafting remarkable things using off-the-shelf open-source models, or building on rich APIs. Sometimes there’s a single founder running a software business with a $100M run rate. These aren’t flukes. They’re signals.
We’re just beginning to untangle the economics too. If people are already paying $250–$300 per month for personal or business-grade AI tools, software may start to rank just behind rent and food in the average consumer’s monthly spend. That opens up more space for high-trust, value-dense, premium offerings — where pricing isn’t just supported, it’s expected.
Which leads into movements and communities. Almost every big wave — from crypto to VR to vibe coding — starts with a strange, passionate group. These are often places with dense context, terse language, maybe some memes or in-jokes. But the activity is real. They build, launch, spin up GitHub orgs, self-organize. When we see such communities forming, we try to pay close attention. These are often the early chapters of what becomes the next breakout market. Stack Overflow started that way. So did Substack and Pinterest. Watch the edges.
Finally, there’s open source. Dixon makes a strong case for why it matters not just technically or politically, but economically. Open source can keep the floor of innovation accessible. If regulations go too far and push liability onto open projects, we risk entrenching centralized control and weakening one of the few counterweights to capital concentration. In our own work building software, we’ve learned that coupling a strong open source backbone with considered tooling helps us move faster — and invites contributions from people we otherwise couldn’t afford to hire.
So, if you zoom out, the outline is surprisingly clear. The biggest leverage comes from picking the right exponential forces — Moore’s Law, composability, networks — and designing for their dynamics. It means resisting the urge to simply port old metaphors into new domains. It means treating niche communities not as margins but as previews. And it means remembering that product alone doesn’t win — timing, story, and structure matter just as much.
In an age of rapid acceleration, it’s easy to chase the next shiny app or viral demo. But it’s worth pausing to ask: which forces is this product actually riding? What is compounding here? What’s defensible in the long run?
At Dellecod, we don’t have all the answers. But these are the right questions. And in a field shaped by constant reinvention, just asking them — thoughtfully and often — makes all the difference.