The numbers in the Meta–Broadcom announcement have an odd weight. Computing capacity is slated to grow from one gigawatt to several gigawatts into 2027 and beyond. The first phase alone calls for enough electricity to power about 750,000 American homes. And this is the internal AI plan of a single company. Not a national grid. Not a city. A set of data centers serving Instagram, WhatsApp, Threads, and whatever Mark Zuckerberg means by “personal superintelligence.”
On the surface, the deal itself is fairly simple. Meta and Broadcom are extending their collaboration through 2029 and will build several generations of custom MTIA accelerators, the first of which will use a 2-nanometer process. Hock Tan is stepping off Meta’s board and moving to an advisory role. In after-hours trading, Broadcom’s stock rose about 3% while Meta’s shares barely moved. In other words, the market shrugged, because another multibillion-dollar AI compute deal hardly counts as news at this point.
| Detail | Information |
|---|---|
| Deal Announced | April 14, 2026 |
| Companies Involved | Meta Platforms and Broadcom |
| Deal Duration | Extended through 2029 |
| Initial Compute Commitment | 1 gigawatt (enough for ~750,000 US homes) |
| Long-term Plan | Multi-gigawatt rollout into 2027 and beyond |
| Chip Name | MTIA (Meta Training and Inference Accelerator) |
| Manufacturing Process | Industry’s first 2-nanometer AI compute accelerator |
| Meta CEO | Mark Zuckerberg |
| Broadcom CEO | Hock Tan (stepping off Meta board, moving to advisory role) |
| Meta’s 2025 AI Infrastructure Spend | $135 billion |
| Broader Industry Context | Hyperscalers reducing reliance on Nvidia GPUs |
| Related Deal | Broadcom–Google TPU agreement; Anthropic to access 3.5 GW of Google chips |
The energy figure is what gives this one a different texture. Tech companies used to discuss chips in terms of floating-point operations and nanometers. Now they speak in gigawatts. The unit itself has changed, and that change carries a message. It is hard not to wonder whether anyone in the room has fully considered the physical meaning of Zuckerberg’s quote, “the massive computing foundation we need to deliver personal superintelligence to billions of people.” Substations. Cooling towers. Water rights. Land deals in places like rural Iowa or Mesa, Arizona, where data center construction has already reshaped local politics.
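As a rough sanity check on the homes comparison, here is a back-of-envelope sketch. The 8,760 hours per year and the implied per-home consumption are my own working assumptions, not numbers from the announcement:

```python
# Back-of-envelope: what average household consumption does
# "1 GW powers ~750,000 US homes" imply?

GIGAWATT_W = 1e9          # 1 gigawatt, in watts
HOMES = 750_000           # homes cited for the first phase
HOURS_PER_YEAR = 8_760    # 24 hours * 365 days

# Continuous power available per home, in kilowatts
kw_per_home = GIGAWATT_W / HOMES / 1_000

# Implied annual consumption per home, in kilowatt-hours
kwh_per_year = kw_per_home * HOURS_PER_YEAR

print(f"{kw_per_home:.2f} kW per home, ~{kwh_per_year:,.0f} kWh/year")
```

That works out to roughly 1.33 kW of continuous draw, or about 11,700 kWh per home per year, in the same ballpark as commonly cited US residential averages, so the 750,000-homes comparison is at least internally plausible.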
There is also a very familiar industry pattern here. Google entered the custom silicon market in 2015 with its TPUs, at a time when most people assumed Nvidia would own the field outright. Amazon followed in 2018. Meta arrived late, announcing its own MTIA chips in 2023, but it came with a plan and a checkbook. Now, in 2026, every major hyperscaler is quietly trying to wean itself off Nvidia while continuing to write Nvidia enormous checks. The entire narrative is a contradiction.

Broadcom, meanwhile, is having the kind of decade businesses dream about. Two weeks before the Meta announcement, it secured a separate agreement to supply Google with 3.5 gigawatts of TPU chips destined for Anthropic. It now sits beside practically every significant AI lab and platform as an indispensable partner. Even Hock Tan’s departure from Meta’s board reads less like a retreat and more like clearing potential conflicts of interest before the real money starts flowing, given how well he has positioned the company.
Beneath all of this, however, is a quieter question that no one seems eager to voice: is any of this profitable yet? In 2025 alone, Meta spent about $135 billion on AI infrastructure. The revenue payoff is uncertain. The user-facing AI products, though intriguing, are not yet transformative enough to justify a power plant’s worth of silicon. There is a sense that the industry is building cathedrals before it knows anyone will come to pray, a sense that could prove mistaken or prophetic.
Perhaps demand catches up. Perhaps billions of people using personal AI assistants will fill these gigawatts. Or perhaps, in a few years, half-finished data centers will sit cooling in the desert, monuments to a hunger that proved greater than the meal.