
How AI training will maximise business value for organisations

Chris Sharp • 5 min read
Data centres are like schools for AI as they provide the infrastructure to train AI models and help them reach peak performance. Photo: Unsplash

ChatGPT has put artificial intelligence (AI) at the forefront of public consciousness with its accessibility and relative ease of use. AI was once perceived to be accessible only to experts and large enterprises, but ChatGPT has democratised access to it, making it available to consumers and smaller businesses.

As a result, businesses are looking to incorporate AI across their organisations to benefit from increased productivity and efficiency. Super-app Grab, for example, has augmented its search engine with AI to offer the closest matches and relevant suggestions to queries while taking into account local languages and nuances. The International Data Corp predicts that spending on AI in Asia-Pacific (excluding Japan), including software, services and hardware for AI-centric systems, will grow to US$49.2 billion in 2026.

However, there is no one-size-fits-all approach to using generative AI. The nascent nature of generative AI requires enterprises to train their AI models on massive volumes of diverse data before these models can perform tasks efficiently or provide actionable insights. In essence, these AI models must go to “school” before they can be put to work.

Just as schools provide conducive environments for students to learn, data centres do the same for AI, providing the infrastructure needed to train AI models and help them reach peak performance.

Organisations that wish to accelerate their AI efforts for more business value must pay attention to the “schools” that their AI systems are attending.

The modern AI school


AI places intensive computational demands on infrastructure in order to perform tasks like image recognition, natural language processing and autonomous decision-making.

Top-tier AI “schools” can accelerate AI learning via AI accelerators, a category of specialised hardware that expedites AI and machine learning applications. These high-performance computing systems comprise cutting-edge processors and graphics processing units (GPUs) that deliver more computing power and optimised memory architectures to shorten AI model training time, much like the special facilities or classes that students attend to bring their performance to the next level.

Due to their specific hardware and data processing requirements, these accelerators must run in facilities or “schools” with specialised environments that provide the high-performance cooling, power densities and connectivity required for AI to operate effectively. A key element is access to high-density power that can support the processors, GPUs and specialised hardware accelerators powering these systems. High-density power is essential to ensure that AI deployments can deliver real-time results and handle the enormous workloads involved in tasks like training large models or processing data in real-world applications such as autonomous vehicles and healthcare diagnostics.


This computing intensity generates a lot of heat. As AI deployments grow in complexity and size, effective cooling is vital to prevent the overheating that degrades the performance, reliability and lifespan of AI hardware. Cooling technology is also crucial to managing the density of hardware within data centres, enabling efficient space utilisation without compromising the integrity of the systems.

However, while liquid cooling is a key piece of supporting dense, power-hungry AI systems, it does not replace air cooling in the data centre. For every rack of processing equipment that starts to require liquid cooling, there are supporting racks filled with storage and networking equipment that are far less dense and can still be served economically with air cooling. Hybrid cooling offers the most capital-efficient way of supporting AI systems in both current and new data centres.

Conducive learning environment

Data is the foundation of AI training, and it is essential that AI systems have access to a “library” of up-to-date, accurate and diverse data sets. With real-time interconnectivity facilitating the movement of these datasets to and from an ecosystem of partners, cloud and network providers, AI systems can be trained and fine-tuned on current data to provide accurate and timely predictions or insights.

Further, when AI systems are physically closer to the data and content they need, they can access and process information more quickly. This proximity to content and content providers reduces latency and enhances performance, which is critical for applications like video streaming, online gaming and real-time decision-making, where even slight delays can impact user experience or the effectiveness of AI algorithms. Proximity also reduces the strain on network infrastructure, leading to more efficient data transfer and cost-effective AI operations, making it a key consideration in optimising AI performance.

Creating a foundation for success

As demand for generative AI services grows, organisations must seek data centre operators that provide the necessary infrastructure and low-latency, secure and flexible interconnections in order to access a comprehensive support ecosystem.

Data centre providers that have supported many partners on their AI solutions will have an advantage in experience, especially those that have supported hyperscalers and co-designed data centres to meet their evolving AI needs. We have supported customers such as Castle Global, SURFsara and Graphcore with the infrastructure they needed to power their innovation, and we are using our experience, expertise and key partnerships to help enterprises that are ready to begin their AI journey today.

Chris Sharp is the chief technology officer of Digital Realty
