Up to 40% of the total energy consumed by a modern data centre goes to cooling. Data centres are typically maintained at 22°C or lower to ensure that their IT equipment, particularly servers, operates efficiently and does not overheat. Singapore's warm, tropical climate poses additional challenges for cooling data centres.
Air cooling is the most common and traditional method of cooling data centres. It uses air conditioning, fans and vents to circulate ambient air and expel the hot air produced by computing equipment.
“If you set the data centre’s temperature too low, the compressor (which removes heat from the air and expels it outside) will have to work very hard and consume more energy. So, raising the data centre operating temperature can translate to additional percentage savings on the compressor,” says Professor Lee Poh Seng, executive director of the Energy Studies Institute (ESI) at the National University of Singapore (NUS). He is also the programme director of the Sustainable Tropical Data Centre Testbed.
Heat rejection, he adds, is also more difficult in humid environments. In this process, a data centre’s cooling tower transfers excess heat from a cooling system to the environment through evaporation. “Just like how the human body struggles to remove heat in humid conditions as the sweat doesn’t evaporate easily, the cooling tower needs to consume more energy for heat rejection in humid environments. This environment will drive up the power usage effectiveness (PUE),” he tells DigitalEdge.
PUE refers to the ratio of a facility's total electricity usage to the power consumed by the core IT equipment. The closer the PUE is to 1, the more efficient the data centre is, as less energy goes towards cooling and other overheads. Singapore currently requires data centres to have a PUE of 1.3 or lower.
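The metric can be illustrated with a minimal sketch; the facility figures below are hypothetical, chosen to land exactly on Singapore's 1.3 ceiling:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return total_facility_kw / it_kw

# Hypothetical facility: 1,300 kW total draw, of which 1,000 kW
# powers the IT equipment. PUE = 1300 / 1000 = 1.3, so every watt
# of compute carries 0.3 W of cooling and other overhead.
print(pue(1300, 1000))  # → 1.3
```

A facility that trims cooling load so the same 1,000 kW of IT draws only 1,200 kW in total would reach a PUE of 1.2.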
Cooling innovations for the tropics
Since cooling technologies can significantly impact PUE or energy efficiency, Singapore launched the Sustainable Tropical Data Centre Testbed (STDCT) last year to accelerate the development and adoption of cooling technologies for tropical data centres.
The flexible, full-scale facility enables NUS and Nanyang Technological University (NTU) researchers to work with industry partners to experiment and validate innovative cooling ideas. It also serves as a de-risking platform for companies to test and optimise new technologies in a realistic, tropical setting.
The STDCT, says Lee, is an extension of a project between NTU, Dell Technologies and Keppel Data Centres a few years ago to test IT hardware reliability under elevated data centre operating temperatures. He adds: “The STDCT [brings together academia and data centre ecosystem players] to conduct a holistic assessment for sustainable data centre technologies. Beyond assessing whether or not IT hardware can operate at a higher data centre temperature, we will also look at the potential of power penalty (or the increase in electricity needed to operate IT systems at higher temperatures), the energy savings from cooling systems, and more.”
According to the STDCT, its projects aim to help data centres reduce energy consumption and carbon dioxide emissions by up to 40%, decrease water usage by 30% to 40%, and achieve a PUE of less than 1.2 with a combination of air and liquid cooling.
Dell Technologies is among the partners that are actively engaged in technology co-development. “If IT equipment can run at room temperature [of between 26°C and 30°C], data centres won’t need [as much] air conditioning. This is why we’re working with STDCT to ensure direct-to-chip cooling and immersion cooling solutions work well on our IT and computing systems. We will also provide technical findings as STDCT develops cooling solutions,” says Andy Sim, Dell Technologies’ vice president and managing director for Singapore.
He adds: “So far, we’ve tested our computing systems to 32°C and haven’t seen performance degradation. Our systems also have temperature sensors or built-in thermostats, so they automatically switch off upon reaching the danger point, ensuring safety.”
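The shutdown behaviour Sim describes amounts to a simple guard loop; the sketch below is purely illustrative (the threshold value and sensor hooks are hypothetical, not Dell's actual firmware):

```python
CRITICAL_C = 85.0  # hypothetical critical temperature threshold

def check_thermal(read_temp_c, shutdown):
    """Shut the system down if the sensor reading crosses the threshold.

    read_temp_c: callable returning the current temperature in °C
    shutdown:    callable that powers the system off
    Returns True if a shutdown was triggered.
    """
    if read_temp_c() >= CRITICAL_C:
        shutdown()
        return True
    return False

# Hypothetical usage with stubbed sensor and shutdown hooks
events = []
check_thermal(lambda: 90.0, lambda: events.append("off"))
print(events)  # → ['off']
```

Real servers implement this in firmware with staged responses (fan ramp-up, throttling, then power-off), but the principle is the same: a sensor reading compared against a safety limit.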
Liquid cooling
Direct-to-chip and immersion cooling fall under the liquid cooling method, which is more effective than air cooling in drawing heat away from high-power density racks in data centres.
In direct-to-chip cooling, cold plates sit atop the heat-generating chips inside a server, drawing heat away via single-phase liquid loops or two-phase evaporation units. This approach can remove most of the heat a rack's equipment generates, leaving only the remainder for air-cooling systems.
Meanwhile, immersion cooling systems submerge servers and other rack components in a thermally conductive dielectric liquid or fluid. This maximises the liquid’s thermal transfer properties and eliminates the need for air cooling.
The STDCT has conducted a side-by-side comparison of air and liquid cooling to show the latter’s effectiveness.
“With liquid cooling, we reduced the PUE from 1.4 to about 1.08. Interestingly, it also reduced IT power consumption by 30%, which is substantial since IT is the biggest line item in a data centre's energy bill. So, liquid cooling can greatly help reduce a data centre's entire energy and carbon footprint. [By working with industry partners like Dell Technologies, we can ensure] liquid cooling delivers energy savings for cooling systems and IT equipment while maintaining optimal IT performance,” says Lee.
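Lee's two figures compound: since total facility energy equals PUE multiplied by IT energy, cutting IT power by 30% and PUE from 1.4 to 1.08 together reduce total energy by roughly 46%. A back-of-envelope check (the baseline load is arbitrary; the percentages are from Lee's figures):

```python
# Back-of-envelope check of the combined effect (illustrative only).
baseline_it_kw = 100.0                 # arbitrary baseline IT load
baseline_total = 1.4 * baseline_it_kw  # air-cooled: PUE of 1.4

cooled_it_kw = baseline_it_kw * 0.7    # 30% lower IT power draw
cooled_total = 1.08 * cooled_it_kw     # liquid-cooled: PUE of 1.08

saving = 1 - cooled_total / baseline_total
print(f"{saving:.0%}")  # → 46%
```

Because the two savings multiply rather than add, the whole-facility reduction exceeds either figure taken alone.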
Brownfield data centres
Lee argues that optimising brownfield data centres is the way to deeply decarbonise mature data centre markets like Singapore. To ease this process, tech companies need to partner to develop integrated packages for data centres.
“For example, a rack-based solution that integrates the appropriate power solution with liquid cooling will enable you to extract the best compute performance while ensuring optimal energy efficiency and carbon emissions. Data centre operators also need to futureproof themselves by moving towards modular data centres, [which are more flexible than traditional data centres as they are formed by assembling individual modules designed and built separately],” adds Lee.
The Dell AI Factory is the company’s answer to enabling modular data centres.
“We’ve brought in ecosystem partners to offer an end-to-end AI portfolio spanning client devices, servers, storage, data protection and networking via the Dell AI Factory. Data centre operators can choose what they need to achieve their desired business outcomes,” says Sim.
Combined with an open ecosystem, this modular approach helps data centre operators keep pace with changing customer demands, such as supporting AI workloads.
Widespread adoption
The STDCT is key to encouraging the widespread adoption of sustainable technologies for tropical data centres, including cooling solutions. “By bringing together the industry, academia and government, the STDCT will help chart the path forward for Singapore to achieve its aspiration of becoming a global AI hub supported by sustainable tropical data centres,” says Sim.
Lee adds: “The STDCT is a visible demo platform that shows how these technologies can be deployed effectively with minimal risks, which helps facilitate commercial deployment. Up to 50 organisations have visited the STDCT, including customers of STDCT’s tech partners, system integrators and foreign dignitaries from neighbouring countries.”
Talent development and capability building will be key focus areas for the next phase of STDCT, which is expected to start in the first quarter of next year. “Most data centre technicians and engineers are well-versed in air cooling but not liquid cooling, which is a relatively new area. We’ll work with Dell Technologies [and other partners] to equip them with relevant skills and knowledge such as checking the quality of the water, knowing when the cooling system needs maintenance, how to handle the IT equipment submerged in the cooling fluid and more,” says Lee.
As part of the holistic assessment, STDCT’s phase two will also look at innovative ways of heat rejection and eliminating water usage, adds Lee. The latter is an increasing concern as data centres tend to consume significant amounts of water for cooling, stressing local water sources. Moreover, STDCT will explore the use of AI for the predictive maintenance of sustainable data centres.