What is the role of liquid cooling connectors in AI data centers?

February 26, 2025 By Aharon Etengoff

Artificial intelligence (AI) and machine learning (ML) applications consume significant power and generate considerable heat in data centers. High-performance AI accelerators — such as graphics processing units (GPUs), tensor processing units (TPUs), and application-specific integrated circuits (ASICs) — increasingly require more efficient cooling methods to maintain safe and optimal thermal operating levels.

This article discusses the growing energy demands of AI and ML and explores the rise of liquid cooling for these high-performance workloads. It also reviews key design requirements for liquid-cooling connectors and highlights evolving industry standards formulated by the Open Compute Project (OCP).

The increasing energy demands of AI and ML

Accounting for 10% to 20% of all energy consumed in US data centers (Figure 1), AI-driven applications are considerably more power-intensive than many conventional workloads. For example, a ChatGPT query consumes roughly ten times more energy than a standard Google search. With computational power requirements for AI model training doubling every nine months, data centers may soon consume as much energy as some entire countries.

Figure 1. An inside look at a data center showcasing extensive server infrastructure. (Image: DataCenterKnowledge)
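To put the nine-month doubling figure in perspective, the short Python sketch below compounds it over one, three, and five years. The time horizons are illustrative assumptions; only the doubling cadence comes from the article.

```python
# Illustrative sketch: compound growth with a nine-month doubling period.
# Only the doubling cadence comes from the article; the horizons are arbitrary.

DOUBLING_PERIOD_MONTHS = 9

def growth_factor(months: float) -> float:
    """Multiple by which training compute demand grows after `months`."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

if __name__ == "__main__":
    for years in (1, 3, 5):
        print(f"{years} year(s): ~{growth_factor(years * 12):.0f}x today's compute demand")
```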

With thermal design power (TDP) requirements reaching 1,500 W and average rack power increasing from 8.5 kW to 12 kW, effective cooling systems are critical to maintaining optimal data center temperatures of 70 to 75°F (21 to 24°C). Cooling infrastructure now accounts for approximately 40% of total energy consumption in some facilities, prompting organizations such as The Green Grid to develop a Liquid Cooling Total Cost of Ownership Calculation Tool (tggTCO).
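As a rough sanity check on those rack-level figures, the sketch below estimates the cooling load implied by a 40% cooling share for the average rack powers cited above. Treating everything other than IT load and cooling as negligible is a simplifying assumption made purely for illustration.

```python
# Rough per-rack estimate: assume cooling is ~40% of total facility energy and
# the remaining ~60% is dominated by the IT load (other overheads ignored).

COOLING_FRACTION = 0.40  # approximate share cited in the article

def facility_power_per_rack(it_load_kw: float) -> tuple[float, float]:
    """Return (total_kw, cooling_kw) implied by an IT rack load."""
    total_kw = it_load_kw / (1.0 - COOLING_FRACTION)
    return total_kw, total_kw * COOLING_FRACTION

if __name__ == "__main__":
    for rack_kw in (8.5, 12.0):  # average rack power figures from the article
        total, cooling = facility_power_per_rack(rack_kw)
        print(f"IT load {rack_kw:4.1f} kW -> total ~{total:.1f} kW, cooling ~{cooling:.1f} kW")
```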

The rise of liquid cooling for AI and ML workloads

Many liquid cooling systems circulate dielectric fluids or water-based solutions through pipes or channels placed near or directly on components like GPUs. This process effectively dissipates thermal buildup in data centers running a wide range of high-performance AI and ML applications, large language models (LLMs), and training workloads. These mixtures offer superior thermal conductivity and greater heat transfer capacity than traditional air cooling, fan-based systems, or passive heat sinks.
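One way to see where that heat-capacity advantage comes from is to compare how much heat a liter of each fluid carries per degree of temperature rise. The sketch below uses approximate room-temperature property values for plain water and air; it illustrates the general principle rather than any specific coolant blend.

```python
# Heat carried by one liter of fluid per 1 degC temperature rise.
# Property values are approximate room-temperature figures.

FLUIDS = {
    # name:  (density kg/m^3, specific heat J/(kg*K))
    "water": (998.0, 4186.0),
    "air":   (1.2, 1005.0),
}

def heat_per_liter_per_degc(density: float, specific_heat: float) -> float:
    """Joules absorbed by one liter of fluid for a 1 K temperature rise."""
    return density * specific_heat * 1e-3  # 1 liter = 1e-3 m^3

if __name__ == "__main__":
    results = {name: heat_per_liter_per_degc(*props) for name, props in FLUIDS.items()}
    for name, joules in results.items():
        print(f"{name:5s}: ~{joules:,.0f} J per liter per degC")
    print(f"water carries ~{results['water'] / results['air']:.0f}x more heat per liter than air")
```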

Figure 2. Data center immersion-cooling system with rack-mounted service rails for easy maintenance and hot swaps. (Image: GreenRevolutionCooling)

Data centers typically implement liquid cooling using two primary methods: cold plate and immersion cooling (Figure 2). Cold plate cooling circulates dielectric coolant over or near the hottest components, delivering high performance at the chip level yet still relying on supplemental air cooling to dissipate residual heat. As rack densities increase, cold plate liquid cooling scales more efficiently than stand-alone air-cooling systems, which often struggle to dissipate heat from densely packed equipment.

Immersion cooling significantly reduces the need for auxiliary fans and further improves energy efficiency by capturing, and enabling the reuse of, nearly 100% of the generated heat. This cooling method, however, often requires new facility designs, structural modifications, and upgraded or new power distribution systems.

Precision liquid cooling, which occupies a middle ground between cold plate and immersion cooling, uses minute amounts of dielectric coolant to target the hottest components and effectively cool the entire system. This hybrid method, which eliminates water use for cooling, can reduce energy consumption by up to 40%.

Key performance requirements for liquid cooling connectors

When designing liquid-cooled AI systems, data center architects select connectors that meet key performance requirements, such as resisting temperatures up to 50°C (122°F), handling coolant flow rates up to 13 liters per minute (LPM), and maintaining pressure drops around 0.25 psi.
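Those flow and pressure figures map directly to heat-removal capacity through Q = ṁ·cp·ΔT. The sketch below assumes a water-like coolant and an illustrative 10°C temperature rise across the cold plates; actual dielectric fluids and allowable temperature rises will differ.

```python
# Heat removed by a coolant loop: Q = mass_flow * specific_heat * delta_T.
# Assumes a water-like coolant and an illustrative 10 degC temperature rise.

DENSITY_KG_M3 = 998.0              # water, approximate
SPECIFIC_HEAT_J_PER_KG_K = 4186.0

def heat_removed_watts(flow_lpm: float, delta_t_c: float) -> float:
    """Heat (W) carried away at the given flow rate and coolant temperature rise."""
    flow_m3_per_s = flow_lpm / 1000.0 / 60.0        # liters/min -> m^3/s
    mass_flow_kg_per_s = flow_m3_per_s * DENSITY_KG_M3
    return mass_flow_kg_per_s * SPECIFIC_HEAT_J_PER_KG_K * delta_t_c

if __name__ == "__main__":
    q = heat_removed_watts(flow_lpm=13.0, delta_t_c=10.0)
    print(f"13 LPM with a 10 degC rise removes ~{q / 1000:.1f} kW")  # roughly 9 kW
```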

Figure 3. Data center infrastructure submerged in a liquid cooling solution. (Image: AKCP)

Additionally, these connectors must ensure easy serviceability and compatibility with water-based or dielectric fluid mixtures (Figure 3) while preventing corrosion and leaks. Liquid cooling connectors must also integrate seamlessly with in-rack manifolds and existing cooling infrastructure.

Additional key liquid cooling connector features include:

  • Quick disconnect: facilitates easy, dripless connection and disconnection for routine maintenance and emergency access in AI and ML data centers.
  • Large diameter: accommodates high flow rates, typically with a 5/8-inch inner diameter for server cooling in AI racks.
  • Thermal resistance: optimizes heat transfer by reducing thermal resistance, which is critical for cooling efficiency.
  • Manifold compatibility: aligns fluid connectors with three-inch square stainless-steel tubing for optimized coolant distribution.
  • Hybrid design: combines high-speed data transfer and liquid cooling channels for AI systems.
  • Rugged design: ensures durability and prevents leaks in challenging conditions, such as fluctuating temperatures, abrupt pressure drops, and strong vibrations.

Many companies, such as CPC (Colder Products Company), Koolance, Parker Hannifin, Danfoss Power Solutions, and CEJN, offer liquid cooling connectors for high-performance AI workloads in the data center. These manufacturers provide various quick disconnect fittings, couplings, and other components designed to manage thermal loads efficiently.

Evolving industry standards for liquid-cooling connectors

Industry organizations like the Open Compute Project (OCP) are developing open standards for liquid cooling connectors in data centers. The evolving OCP Large Quick Connector Specification outlines a universal quick connect, with standardized interface dimensions and performance requirements.

These include a working pressure of 35 psi at 60°C, a maximum operating pressure of 175 psi (12 bar), a flow rate of over 100 liters per minute (LPM), and ergonomic designs limiting mating torque to less than 5 Nm. Connectors must also handle temperatures from -4°F to 140°F (-20°C to 60°C), with shipping ranges of -40°F to 158°F (-40°C to 70°C). Additional criteria specify fluid loss under 0.15 mL per disconnect and a service life of at least 10 years of continuous use.
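One way a design team might use these numbers is as a simple pass/fail screen against a vendor datasheet. The sketch below encodes the figures quoted above as limits and checks a hypothetical connector against them; the sample datasheet values and field names are invented for illustration.

```python
# Screen a hypothetical connector datasheet against the OCP figures quoted above.
# The limits mirror the numbers in this article; the sample datasheet is invented.

OCP_LIMITS = {
    "max_operating_pressure_psi": 175.0,
    "min_flow_lpm": 100.0,
    "max_mating_torque_nm": 5.0,
    "max_fluid_loss_ml_per_disconnect": 0.15,
    "min_service_life_years": 10.0,
}

def check_connector(datasheet: dict) -> list[str]:
    """Return the requirements the datasheet fails to meet (empty list = pass)."""
    failures = []
    if datasheet["rated_pressure_psi"] < OCP_LIMITS["max_operating_pressure_psi"]:
        failures.append("rated pressure below 175 psi")
    if datasheet["flow_lpm"] < OCP_LIMITS["min_flow_lpm"]:
        failures.append("flow rate below 100 LPM")
    if datasheet["mating_torque_nm"] > OCP_LIMITS["max_mating_torque_nm"]:
        failures.append("mating torque above 5 Nm")
    if datasheet["fluid_loss_ml"] > OCP_LIMITS["max_fluid_loss_ml_per_disconnect"]:
        failures.append("fluid loss above 0.15 mL per disconnect")
    if datasheet["service_life_years"] < OCP_LIMITS["min_service_life_years"]:
        failures.append("service life under 10 years")
    return failures

if __name__ == "__main__":
    sample = {  # invented values for illustration only
        "rated_pressure_psi": 180.0,
        "flow_lpm": 110.0,
        "mating_torque_nm": 4.0,
        "fluid_loss_ml": 0.10,
        "service_life_years": 10.0,
    }
    problems = check_connector(sample)
    print("PASS" if not problems else "FAIL: " + "; ".join(problems))
```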

Summary

High-performance AI accelerators increasingly require efficient cooling to maintain safe, optimal thermal levels in data centers. Liquid cooling systems, which circulate dielectric fluids or water-based solutions near or directly on GPUs and TPUs, provide superior thermal conductivity and capacity compared to traditional air cooling, fan systems, or passive heat sinks. Liquid cooling connectors, designed for demanding environments, must resist temperatures up to 50°C (122°F), handle flow rates up to 13 LPM, and maintain pressure drops around 0.25 psi.

Related EE World Online content

How Are High-Speed Board-to-Board Connectors Used in ML and AI Systems?
Driving Standards: The Latest Liquid-Cooling Cables and Connectors for Data Centers
Where Are Liquid Cooled Connectors and Connectors for Liquid Cooling Used in EVs?
Where Are Liquid-Cooled Industrial Connectors Used?
Liquid Cooling For High-Performance Thermal Management

References

The Basics of Liquid Cooling in AI Data Centers, FlexPower Modules
High-Power Liquid Cooling Design: Direct-to-Chip Solution Requirements for 500-kW Racks, ChillDyne
Six Things to Consider When Introducing Liquid Cooling Into Your Data Center, Data Center Dynamics
Harnessing Liquid Cooling in AI Data Centers, Power Electronics News
How AI Is Fueling a Boom in Data Centers and Energy Demand, Time
Cooling the AI Revolution in Data Centers, DataCenterFrontier
Data Center Cooling: The Unexpected Challenge to AI, Spectra
Supporting AI Workloads: The Future of Data Center Cooling, DataCenterPost
The Advantages of Liquid Cooling, Data Center Frontier
Answering the Top FAQs on AI and Liquid Cooling, Schneider Electric Blog
Large Quick Connector Specification, Open Compute Project
How Immersion Cooling Helps Reduce Operational Costs in Data Centers, GRC

