Intel says it expects to see greater adoption of liquid cooling, and is working with the Open Compute Project (OCP) and cooling vendors to put forward standards to make the technology more accessible.

At the Open Compute Summit last week in San Jose, Intel’s Zane Ball discussed the chipmaker’s growing focus on immersion cooling.

“Air cooling has been with us for a long time, and people have talked about liquid cooling for a long time,” said Ball, Intel’s Corporate VP and General Manager of Datacenter Engineering and Architecture. “It’s always that thing we’re going to do in the future. We believe we’ve reached a time where liquid cooling has to play a much bigger role in the data center.


“I believe there’s been more energy around immersion cooling in the last year than I’ve heard in a very long time,” Ball continued. “I think the time is now.”

Density Prompts Greater Focus on Cooling

Immersion cooling submerges servers in a tank filled with liquid coolant, rather than using cold air. This approach offers potential economic benefits by allowing data centers to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers.

Interest in liquid cooling has been boosted by the growth of artificial intelligence, which relies upon powerful hardware that packs more computing power into each piece of equipment, raising the power density – the amount of electricity used by servers and storage in a rack or cabinet – and the accompanying heat. These rising power densities are challenging traditional practices in data center cooling, and prompting data center operators to adopt new strategies to support high-density racks.
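To make the power-density arithmetic concrete, here is a minimal sketch. The wattage figures are illustrative assumptions, not numbers from Intel or this article:

```python
# Hypothetical illustration of rack power density: the electricity drawn
# by the IT equipment housed in a single rack or cabinet.
# All server wattages below are made-up example figures.

def rack_density_kw(servers_per_rack: int, watts_per_server: float) -> float:
    """Rack power density in kilowatts = servers per rack x draw per server."""
    return servers_per_rack * watts_per_server / 1000.0

# A conventional 1U server might draw roughly 400 W; a full rack of 40:
print(rack_density_kw(40, 400))   # 16.0 kW per rack

# An AI training server with multiple accelerators can draw 6 kW or more;
# just five of them push a rack to 30 kW, a level that strains air cooling.
print(rack_density_kw(5, 6000))   # 30.0 kW per rack
```

The second case is the AI-driven scenario described above: far fewer boxes per rack, yet roughly double the heat to remove.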

But interest in advanced cooling is no longer just about density and “hot hardware,” according to a recent Data Center Frontier roundtable of data center experts, who cite a confluence of factors that includes sustainability and edge deployments.

Liquid coolant being added to an immersion tank housing servers. (Credit: Submer)

Intel’s support, along with the growing focus on immersion within the Open Compute community, could build momentum for the technology. (EDITOR’S NOTE: Watch DCF for more coverage ahead on the liquid cooling initiatives from OCP and Open19). 

One of the challenges is that although many processors and servers have been tested in immersion environments, manufacturers will not warranty their products when they are immersed. Intel is positioned to help, and says it is working on this.

“One of the problems that we need to solve is the liquids themselves,” said Ball. “The liquids touch every component, so every component somehow needs to be qualified to work with the liquid, and it’s a problem. So we need to standardize the liquid so that systems know what to expect from their liquid.

“Intel is leading here,” he continued. “We’re bringing out a specification with partners through OCP. And we are eager to engage with the rest of the industry to make liquid standardization happen so that we can scale immersion cooling.


“There’s other problems we need to solve too, but this is foundational,” said Ball. “Because of our ability to get a good liquid ecosystem going, Intel’s going to be able to now offer warranties on our components with immersion cooling, which is a big step forward. So I’m really excited about it.”

Intel Steps Up Investment, Engagement

Over the past year, Intel has announced collaborations with several immersion vendors, as well as investments to advance the technology. Some examples:

  • Intel has collaborated with Submer to use Xeon-based immersion-optimized server boards and Submer’s precision cooling technique to demonstrate the reuse of high-grade waste heat generated by servers.
  • Intel has teamed with GRC (Green Revolution Cooling) to create a white paper detailing how an integrated use of Intel processors and GRC’s immersion cooling products can make data centers more sustainable.
  • In June, Iceotope demonstrated its Ku:l Data Center chassis-level Precision Immersion Cooling system at the Intel Booth at HPE Discover 2022. The collaboration between Iceotope, Intel and HPE promises “a faster path to net zero operations by reducing edge and data center energy use by nearly a third.”
  • In May Intel rolled out a proof-of-concept immersion cooling facility in Taiwan, saying Intel “aims to simplify and accelerate the implementation of immersion liquid cooling solutions throughout the ecosystem globally.”
  • Intel also unveiled plans to invest more than $700 million for a 200,000-square-foot, state-of-the-art research and development mega lab focused on innovative data center technologies, including liquid cooling.

Some vendors see coolant as a place to compete, rather than collaborate. Several players in the field have developed proprietary cooling fluids that they market as differentiators.

A number of hyperscale operators have explored the use of immersion cooling in their platforms, most notably Microsoft, which is running a small immersion installation in production. But Google and Meta have opted for alternative designs that bring liquid directly to the chip, with Meta embracing an approach called air-assisted liquid cooling (AALC) that has also been advanced through the OCP.

Reducing Use of Fans, Water

Ball sees several use cases where immersion can offer advantages over traditional air cooling, including operating in warm climates.

“If you’re in a humid climate, evaporative cooling is really challenged in some of those climates,” Ball said, adding that achieving a lower PUE “doesn’t even begin to scratch the surface of the value.

“Once we move into immersion cooling, we can take the fans completely out of the system,” Ball said. “Fan power is a major contributor to overall data center energy consumption.

“And let’s not forget about the water consumption. In North America alone, data centers consume hundreds of millions of gallons of water per year. And all of that can go away. It’s a much more sustainable capability.”
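A rough sketch of the PUE math behind Ball’s point follows. PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power; every figure in this example is a hypothetical round number chosen for illustration, not Intel or facility data:

```python
# Illustrative (made-up) numbers showing how eliminating server fans and
# chilled-air infrastructure can lower PUE.
# PUE = total facility power / IT equipment power; 1.0 is the ideal.

def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Power Usage Effectiveness for a facility, given power draws in kW."""
    total = it_power_kw + cooling_kw + other_overhead_kw
    return total / it_power_kw

# Air-cooled example: 1,000 kW of IT load, 400 kW for CRAC units and
# chillers, 100 kW of other overhead (power distribution losses, lighting).
air_cooled = pue(1000, 400, 100)   # 1.5

# Immersion example: fans removed, so IT load drops slightly, and cooling
# overhead shrinks because heat is rejected via warm liquid, not chilled air.
immersion = pue(950, 50, 100)      # ~1.16

print(f"Air-cooled PUE: {air_cooled:.2f}")
print(f"Immersion PUE:  {immersion:.2f}")
```

Note that, as Ball argues, the PUE improvement understates the total gain: the fan power removed from the IT load and the water no longer evaporated do not show up in the ratio at all.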

Many of these potential advantages for immersion cooling have been well known for many years, but adoption has remained limited to extreme density applications like HPC, bitcoin mining and seismic imaging for the energy industry.

“So why haven’t we already been there?” said Ball. “Well, because it’s kind of hard. There’s a lot of problems to solve, and the costs of entry are too high. And though a bunch of companies, from very large ones to innovative startups, are joining this ecosystem to solve those problems, we still have work to do.”