Tech Cache

Vertiv: The Data Center Infrastructure Play With A Catch

Bloom Energy powers the AI data center from the outside. Vertiv powers and cools everything inside it, and the business case has never been stronger. But the chart is telling a more cautious story.

Joe Albano
Apr 10, 2026
∙ Paid

The Other Side Of The Data Center Trade

At the beginning of this year, I said I would branch out into the data center side of the technology world while the mainstream tech names continue through the long-running news cycle around them. I introduced Bloom Energy (BE) as my first new coverage in the sector. It made sense: over the last year or two, the conversation around data center infrastructure has been almost entirely about power - getting enough electricity into the building to feed the AI factories being built. Bloom Energy sits on the supply side of that problem, generating power at the point of need. But once the electricity gets inside the building, a separate and equally urgent problem takes over: someone has to manage it.

That’s where Vertiv (VRT) comes in.

The company makes, installs, and maintains the power distribution, thermal management, and cooling infrastructure sitting between the building’s power supply (particularly the uninterruptible power supplies, or UPSs) and the racks themselves. For years, this was a relatively unglamorous business. After all, it wasn’t difficult to manage power and keep equipment cool in data centers running conventional workloads. AI, however, quickly sent data centers into a race to keep up with intense compute demand, from extreme power draws to completely new infrastructure. And one of the largest problems AI data centers face is heat.

A standard server rack running conventional compute draws roughly 10-15 kilowatts. A rack optimized for AI inference today draws 120-150 kilowatts. Air cooling, the standard approach for decades, simply cannot dissipate heat at such power densities. The transition to liquid cooling isn’t a nice-to-have option anymore; it’s a physical necessity, and Vertiv is the dominant supplier of the infrastructure making it work.
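To make the physics concrete, here is a rough back-of-the-envelope sketch (my own illustrative assumptions, not Vertiv’s or Nvidia’s engineering figures) using the basic heat-transport relation Q = ṁ · cp · ΔT, with an assumed 15°C air temperature rise and a 10°C water temperature rise:

```python
# Back-of-the-envelope: flow needed to carry away rack heat, Q = m_dot * cp * dT.
# Every figure below is an illustrative assumption, not a vendor specification.

AIR_CP = 1005.0         # J/(kg*K), specific heat of air
AIR_DENSITY = 1.2       # kg/m^3, near room temperature
WATER_CP = 4186.0       # J/(kg*K)
WATER_DENSITY = 1000.0  # kg/m^3
M3S_TO_CFM = 2118.88    # cubic meters per second -> cubic feet per minute

def airflow_cfm(rack_kw: float, delta_t_c: float = 15.0) -> float:
    """Airflow (CFM) needed to remove rack_kw of heat at a delta_t_c air temperature rise."""
    mass_flow = rack_kw * 1000.0 / (AIR_CP * delta_t_c)    # kg/s
    return mass_flow / AIR_DENSITY * M3S_TO_CFM

def water_flow_lpm(rack_kw: float, delta_t_c: float = 10.0) -> float:
    """Water flow (liters per minute) needed to remove the same heat load."""
    mass_flow = rack_kw * 1000.0 / (WATER_CP * delta_t_c)  # kg/s
    return mass_flow / WATER_DENSITY * 1000.0 * 60.0

for kw in (12, 140):
    print(f"{kw:>4} kW rack: ~{airflow_cfm(kw):,.0f} CFM of air vs ~{water_flow_lpm(kw):,.0f} L/min of water")
```

Under those assumptions, a conventional 12 kW rack needs on the order of 1,400 CFM of air, which is routine. A 140 kW rack would need well over 15,000 CFM forced through a single cabinet, while a water loop moves the same heat with roughly 200 liters per minute. That gap is the whole argument.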

Understanding This Market

While power distribution is an important part of the business, it’s not nearly as new and novel as liquid cooling. Liquid cooling has been around for some time, even in home desktop settings, but we’ve now reached the point where it becomes a requirement for building an AI data center.

Now, it’s important to understand this isn’t the enthusiast cooling loop inside a gaming PC. The loop at home is far less complicated, even with CPU, chipset, and GPU cooling blocks in use. And when a gaming PC’s cooling system fails, a processor is lost. When a liquid-cooled AI data center rack fails at 150 kilowatts of density, millions of dollars of hardware, an active training run, and potentially tightly coupled multi-rack clusters are damaged, destroyed, or knocked offline. The stakes are categorically different, and so is the complexity required to manage them.

A liquid-cooled data center at hyperscale involves a secondary fluid network running through hundreds of connection points, each requiring precise pressure management, valve control, temperature balancing, and chemical treatment to maintain fluid integrity. A miscalibrated system doesn’t just get throttled; it can destroy hardware - hardware now scaling into the millions of dollars with systems like Nvidia’s (NVDA) GB300 and, soon, the Vera Rubin NVL72, which will push densities toward 190-230 kilowatts.
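For a sense of what managing that loop actually involves, here is a deliberately simplified sketch of the kind of interlock checks a coolant distribution unit has to run continuously. Every threshold, field name, and limit below is a hypothetical illustration, not Vertiv’s actual control logic:

```python
from dataclasses import dataclass

@dataclass
class LoopReading:
    supply_temp_c: float       # secondary-loop supply temperature to the racks
    return_temp_c: float       # temperature of coolant coming back off the racks
    pressure_delta_kpa: float  # pressure drop across the loop
    flow_lpm: float            # measured flow, liters per minute

def check_loop(r: LoopReading) -> list[str]:
    """Return a list of faults; an empty list means the loop looks healthy."""
    faults = []
    if r.flow_lpm < 150:                          # starved loop -> hotspots in seconds
        faults.append("LOW_FLOW: throttle racks, check pumps and valves")
    if r.supply_temp_c > 45:                      # warmer than the inlet the racks expect
        faults.append("SUPPLY_TOO_WARM: heat rejection falling behind")
    if (r.return_temp_c - r.supply_temp_c) > 15:  # more heat than this flow can carry
        faults.append("HIGH_DELTA_T: load exceeds loop capacity")
    if not (80 <= r.pressure_delta_kpa <= 250):   # leak, blockage, or air in the loop
        faults.append("PRESSURE_OUT_OF_RANGE: inspect loop integrity")
    return faults

print(check_loop(LoopReading(supply_temp_c=46.0, return_temp_c=60.0,
                             pressure_delta_kpa=60.0, flow_lpm=120.0)))
```

Multiply checks like these across hundreds of connection points, add fluid chemistry, filtration, and commissioning, and it becomes clearer why this is experience a new entrant can’t simply buy off the shelf.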

Vertiv’s PurgeRite acquisition addressed this end of the business, deepening its fluid management capability from initial commissioning through the full lifecycle of the facility. As CTO Scott Armul said on the Q4 earnings call, deploying liquid cooling at this scale is not for the faint of heart, and it requires experience new entrants simply haven’t built yet.

Air cooling, the industry standard for decades, can’t dissipate heat at those densities. The physics simply don’t support it, and liquid cooling has become the only viable option. Nothing drives this home more than the fact that the Vera Rubin rack-scale systems will require 45-degree Celsius water cooling inlets.

So, the question becomes: how does this work as a business, what does this data center market translate to financially, and are there competitive concerns?

This post is for paid subscribers
