1. The Energy Tax

Modern AI data centers waste the majority of their electricity on data movement, not computation. Because processors and memory are physically separate, data must be shuttled constantly between them, burning power and generating immense heat at every transfer.
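The scale of this tax can be sketched with a back-of-envelope comparison. The per-operation figures below are approximate, drawn from Horowitz's widely cited ISSCC 2014 estimates for a 45 nm process, and should be read as order-of-magnitude only:

```python
# Approximate energy costs (picojoules), per Horowitz, ISSCC 2014 (45 nm).
# These are illustrative assumptions, not measurements of any specific chip.
FP32_MULT_PJ = 3.7       # one 32-bit floating-point multiply
DRAM_READ_32BIT_PJ = 640  # fetching 32 bits from off-chip DRAM

def movement_to_compute_ratio(num_ops: int) -> float:
    """Energy spent moving operands vs. computing on them, in the
    worst case where every operand is fetched from off-chip DRAM."""
    compute_pj = num_ops * FP32_MULT_PJ
    movement_pj = num_ops * DRAM_READ_32BIT_PJ
    return movement_pj / compute_pj

print(f"movement / compute: {movement_to_compute_ratio(1_000_000):.0f}x")
```

Under these assumptions, each off-chip fetch costs roughly two orders of magnitude more energy than the arithmetic it feeds, which is why moving memory onto or next to the processor is such a large efficiency lever.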
The technology to fuse them, such as processing-in-memory and similar compute-near-data designs, already exists. But because there is no open "handshake", no shared interconnect standard, for this new architecture, efficiency has become a structural advantage that only the largest companies can build, and that others cannot buy their way into.
2. The Physical Moat

Lock-in is moving from the software layer to the physical substrate. Firms building the largest AI systems, such as NVIDIA with its GB200 NVLink ecosystem or Google with its proprietary TPU cooling manifolds, are designing "vertical stacks" in which the chip, cooling system, and power delivery are co-dependent.
If a data center is built to these proprietary specifications, it becomes prohibitively expensive to "plug in" a competitor's chip without rebuilding the facility. This is lock-in at the level of physics.