Disclosure: The views and opinions expressed herein belong solely to the authors and do not represent the views and opinions of crypto.news editorials.
The Internet expanded because IP created a universal fabric for data. Web3 never had that luxury. Instead, it inherits a patchwork of 1980s-era networks and ad hoc protocols, onto which we now pile billions of AI agents, a global payment layer, and globally distributed physical infrastructure sensor meshes. That patchwork slows down and congests the moment real transactions happen at scale. We are long past the stage where faster chains and larger blocks are useful.
Summary
Web3 cannot scale on fragmented and outdated networks. Achieving trustless global throughput requires a universal distributed data protocol, Web3's own TCP/IP. Mathematical breakthroughs like RLNC show that decentralized networks can match centralized performance if data movement is redesigned from first principles. A universally coded data layer will unlock real scale: it can fix chain fragmentation, enable multi-trillion-dollar DeFi, support global DePIN networks, and power decentralized AI.
Web3 requires its own TCP/IP moment: decentralized internet protocols built on the principles that made the original Internet unstoppable, yet designed to preserve blockchain's core values of trustlessness, censorship resistance, and permissionless participation at scale.
What the industry continues to lack
Before IP, computers could not communicate across networks. IP created a universal standard for routing data between any two points on Earth, turning isolated systems into the Internet. It became one of the three pillars of Internet infrastructure, alongside compute and storage. Every Web2 application runs over TCP/IP; it is the protocol that made planet-wide communication possible.
Web3 is repeating the same early mistakes. Every blockchain has invented its own network layer: gossip protocols, Turbine, Snow, Narwhal, mempools, DA sampling, and so on. None of them is universal, and all are unnecessarily restrictive. Everyone is chasing speed with bigger blocks, more rollups, and more parallelism, but all of it runs on a fundamentally broken network model.
If you are serious about scaling Web3, you need an Internet protocol that is reliably fast, trustless, fault-tolerant, and, most importantly, modular.
20 years at MIT solving decentralization’s toughest problems
For more than 20 years, my research at MIT has focused on one question: can a decentralized system move information as quickly and reliably as a centralized one, and can you prove it mathematically?
To answer that, we combined two fields that have rarely intersected: network coding theory, which mathematically optimizes the movement of data, and distributed algorithms, guided by Nancy Lynch’s seminal work on consensus and Byzantine fault tolerance.
What we discovered was clear: distributed systems can reach centralized levels of performance, but only if data movement is redesigned from first principles. After years of proofs and experiments, Random Linear Network Coding (RLNC) emerged as the mathematically optimal way to move data across distributed networks.
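To make the core idea concrete, here is a minimal sketch of RLNC over GF(2), the simplest binary field, where linear combination is just XOR (production systems typically use GF(2^8); all function names here are illustrative, not taken from any particular library). A sender emits random linear combinations of source packets, and a receiver recovers the originals from any full-rank set of coded packets, regardless of arrival order or which path delivered them.

```python
import random

def rlnc_encode(packets, extra, seed=None):
    """Systematic RLNC over GF(2): emit the k source packets unchanged,
    then `extra` random XOR combinations. Each coded packet carries the
    coefficient vector that produced it."""
    rng = random.Random(seed)
    k = len(packets)
    coded = [([int(j == i) for j in range(k)], packets[i]) for i in range(k)]
    for _ in range(extra):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[rng.randrange(k)] = 1  # skip the useless all-zero combination
        payload = bytearray(len(packets[0]))
        for pkt, c in zip(packets, coeffs):
            if c:  # XOR this source packet into the combination
                for i, byte in enumerate(pkt):
                    payload[i] ^= byte
        coded.append((coeffs, bytes(payload)))
    return coded

def rlnc_decode(received, k):
    """Gauss-Jordan elimination over GF(2): any k linearly independent
    coded packets recover the k originals, in whatever order they arrive."""
    rows = [(list(c), bytearray(p)) for c, p in received]
    for col in range(k):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            raise ValueError("received packets do not span the source space")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:  # eliminate this column elsewhere
                rows[r] = ([a ^ b for a, b in zip(rows[r][0], rows[col][0])],
                           bytearray(x ^ y for x, y in zip(rows[r][1], rows[col][1])))
    return [bytes(rows[i][1]) for i in range(k)]

source = [b"block-a!", b"block-b!", b"block-c!"]
coded = rlnc_encode(source, extra=3, seed=7)
random.Random(1).shuffle(coded)  # packets may arrive in any order
assert rlnc_decode(coded, k=3) == source
```

The key property is that no specific packet is irreplaceable: a receiver only needs *enough* independent combinations, which is what lets coded traffic flow over many paths at once.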
Once blockchain emerged, the applications became clear. Today's internet is built around trusted intermediaries; the decentralized web needs protocols designed to withstand failures and attacks while scaling globally. The architectural changes are simple to state: performance comes from math, not hardware; coordination comes from code, not servers; and the more decentralized the network, the more powerful it becomes.
Like the original Internet protocols, such a layer is not meant to replace what already exists, but to enable what comes next.
Use cases that disrupt today’s infrastructure
Decentralized systems are hitting their limits at the exact moment the world needs them to scale. Four macro trends have emerged, and each exposes the same bottleneck: Web3 still runs on network assumptions inherited from centralized systems.
1. L1 and L2 fragmentation: blockchains scale locally but fail globally
There are currently over 100 blockchains, each able to optimize its own local execution. But the moment these networks need to coordinate globally, they all hit the same wall: data movement that is limited, inefficient, and fundamentally suboptimal.
What blockchain lacks is something comparable to a power grid: a shared layer that routes bandwidth to where it is needed. Decentralized internet protocols let every chain tap the same coded data fabric, speeding up block propagation, DA retrieval, and state access without touching consensus. And like any good grid, it keeps congestion to a minimum.
2. Tokenization and DeFi at trillion-dollar scale
DeFi cannot settle trillions of dollars on networks that propagate slowly, collapse under load, or concentrate access behind RPC bottlenecks. When multiple chains share a coded network, a propagation spike no longer overwhelms a single chain; it is absorbed and redistributed across the network.
Traditional systems build large data centers to absorb peak loads. These are expensive and lead to single points of failure. In a decentralized system, you cannot rely on megacenters. You have to rely on coded distribution.
3. DePIN on a global scale
A global network of millions of devices and autonomous machines cannot function if each node is waiting for slow, single-path communication. These devices must behave like a single coherent organism.
In an energy system, a flexible grid absorbs both a commercial mining operation and a single hair dryer. In networking, distributed protocols must do the same for data: absorb every source and deliver it where it is needed most. That requires coded storage, coded retrieval, and the ability to use every available path rather than a few predetermined ones.
4. Decentralized AI
Decentralized AI relies on high-throughput, fault-tolerant data movement, whether training on encrypted fragments or coordinating fleets of AI agents. Today, distributed storage and compute are separated, access is slow, and retrieval depends on central gateways. AI needs data logistics, not mere storage: data encoded in motion, stored as coded fragments, retrieved from whichever location is fastest at that moment, and instantly recombined without depending on any single place.
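A toy illustration of "recombine from whichever fragments arrive first": below is a minimal 2-of-3 erasure code, two data halves plus one XOR parity fragment, so the original can be rebuilt from any two of the three stored fragments and a reader never waits on the slowest or failed node. The node names and functions are hypothetical, chosen only for this sketch; real coded-storage systems use far richer codes.

```python
import itertools

def make_fragments(half_a, half_b):
    """Store two equal-length data halves plus one XOR parity fragment
    (a 2-of-3 erasure code) across three hypothetical nodes."""
    parity = bytes(x ^ y for x, y in zip(half_a, half_b))
    return {"node-a": half_a, "node-b": half_b, "node-p": parity}

def recover(available):
    """Rebuild both halves from ANY two fragments; a missing half is the
    XOR of the parity with the surviving half."""
    if "node-a" in available and "node-b" in available:
        return available["node-a"], available["node-b"]
    if "node-a" in available:
        a = available["node-a"]
        return a, bytes(x ^ y for x, y in zip(a, available["node-p"]))
    b = available["node-b"]
    return bytes(x ^ y for x, y in zip(b, available["node-p"])), b

fragments = make_fragments(b"weights-", b"shard-01")
# Whichever two nodes answer first are enough; the third can be slow or down.
for pair in itertools.combinations(fragments, 2):
    fastest = {name: fragments[name] for name in pair}
    assert recover(fastest) == (b"weights-", b"shard-01")
```

The same principle generalizes: with a k-of-n code, a retriever races all n locations and stops after the first k responses, which is what turns storage access into the "fastest location at the time" logistics described above.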
The next leap forward for Web3
Every great leap forward in the evolution of the Internet began with a breakthrough in how data moves. IP delivered global connectivity. Broadband enabled Netflix and cloud computing. 4G and 5G made Uber, TikTok, and real-time social possible. GPUs sparked the deep learning revolution. Smart contracts enabled programmable finance.
A universal coded data layer will do for blockchain what IP did for the early internet, creating conditions for applications we cannot yet imagine. This is the foundation that turns Web3 from experimental to inevitable.
