Can Modern Data Centers Keep up With the Exponential Growth of AI?

Artificial intelligence’s exponential growth has stirred controversy and concern among data center professionals. How will facilities accommodate the fast-approaching high-density kilowatt requirements of AI workloads? As conventional solutions become less feasible, operators must find a viable and affordable alternative.

Data Centers Are Facing the Consequences of AI Demand 

AI’s adoption rate is steadily climbing across numerous industries. It increased to about 72% in 2024, up from 55% the previous year. Most metrics suggest widespread implementation isn’t a fleeting trend, indicating modern data centers will soon need to retrofit to keep up with its exponential growth. 

The recent surge in AI demand has long-term implications for the longevity of data center information technology (IT) infrastructure. Since a typical facility can last 15-20 years, depending on its design and modularization, many operators are ill-prepared for the sudden, drastic change they now face. 

For decades, operators have updated hardware in phases to minimize downtime, so many older data centers are crowded with legacy technology. Despite several massive technological leaps, fundamental IT infrastructure has changed very little. Realistically, while 10-15 kW per rack may be enough for now, 100 kW per rack may soon be the new standard.

What Challenges Are Data Centers Facing Because of AI? 

Current data center capacity standards may become inadequate within a few years. The resource drain will be significant whether operators add dedicated equipment for AI functions or integrate model-focused workloads into existing hardware. Already, these algorithms are driving average rack density higher.

Currently, a standard facility’s typical power density ranges from 4 kW to 6 kW per rack, with some more resource-intensive deployments requiring roughly 15 kW. AI processing workloads consistently draw 20 kW to 40 kW per rack, meaning the previous upper limit has become the bare minimum for algorithm applications.
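To make that gap concrete, the back-of-the-envelope sketch below counts how many racks a fixed block of IT power can support at each density. The 1,200 kW budget is an assumed figure for illustration only; the per-rack values reflect the ranges quoted above.

```python
# Back-of-the-envelope sketch: how many racks a fixed block of IT power
# supports at different densities. The 1,200 kW budget is an assumed
# figure for illustration; the per-rack values reflect the ranges above.

FACILITY_IT_BUDGET_KW = 1_200  # assumed usable IT power for one data hall

rack_profiles_kw = {
    "conventional (5 kW avg)": 5,
    "dense conventional (15 kW)": 15,
    "AI training (30 kW avg)": 30,
    "projected AI (100 kW)": 100,
}

for label, kw_per_rack in rack_profiles_kw.items():
    racks = FACILITY_IT_BUDGET_KW // kw_per_rack
    print(f"{label:<28} -> {racks:>4} racks within {FACILITY_IT_BUDGET_KW} kW")
```

The same power envelope that holds a couple hundred conventional racks supports only a few dozen AI racks, which is why capacity planning changes so abruptly.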

Thanks to AI, data center demand is set to more than double in the United States. One estimate states it will increase to 35 gigawatts (GW) by 2030, up from 17 GW in 2022. Such a significant increase would require extensive reengineering and retrofitting, a commitment many operators may be unprepared to make.

Many operators are concerned about power consumption because training a model or running an AI application requires up-to-date equipment or a larger server count. To accommodate the increased demand for computing resources, replacing central processing unit (CPU) servers with high-density racks of graphics processing units (GPUs) is unavoidable.

However, GPUs are very energy intensive, consuming 10-15 times more power per processing cycle than standard CPUs. A facility’s existing systems likely won’t be prepared to handle the resulting hot spots and uneven power loads, which substantially undermines the efficiency of its power and cooling mechanisms.
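The rough sketch below shows what that multiplier can look like at the rack level. Every wattage here is an illustrative assumption rather than a vendor specification, but the arithmetic explains why legacy power and cooling envelopes break.

```python
# Rough sketch of why GPU racks overwhelm legacy power and cooling envelopes.
# All wattages are illustrative assumptions, not vendor specifications.

CPU_SERVER_W = 500        # assumed dual-socket CPU server under load
GPU_SERVER_W = 6_000      # assumed 8-GPU server under load (GPUs plus host)
SERVERS_PER_RACK = 8

cpu_rack_kw = CPU_SERVER_W * SERVERS_PER_RACK / 1_000
gpu_rack_kw = GPU_SERVER_W * SERVERS_PER_RACK / 1_000

print(f"CPU rack: {cpu_rack_kw:.0f} kW")              # ~4 kW, in line with the 4-6 kW norm
print(f"GPU rack: {gpu_rack_kw:.0f} kW")              # ~48 kW, far past a 20 kW air-cooled comfort zone
print(f"Ratio:    {gpu_rack_kw / cpu_rack_kw:.0f}x")  # ~12x, within the 10-15x range cited above
```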

While conventional air cooling works well enough when racks consume 20 kW or less, IT hardware won’t be able to maintain stability or efficiency when racks begin exceeding 30 kW. Since some estimates suggest higher power densities of 100 kW are possible – and may become more likely as AI advances – this issue’s implications are becoming more pronounced. 

Why Data Centers Must Revisit Their Infrastructure for AI

The pressure on data centers to reengineer their facilities isn’t a fear tactic. Increased computing performance and heavier processing workloads require higher rack densities, making equipment weight an unforeseen issue. If heavier racks must rest directly on solid concrete slabs, simply retrofitting the existing space becomes challenging.

While building up is much easier than building out, it may not be an option. Operators must consider alternatives to optimize their infrastructure and save space if constructing a second floor or housing AI-specific racks on an existing upper level isn’t feasible. 

Although data centers worldwide have steadily increased their IT budgets for years, reports claim AI will prompt a surge in spending. While operators’ spending increased by approximately 4% from 2022 to 2023, estimates forecast AI demand will drive a 10% growth rate in 2024. Smaller facilities may be unprepared to commit to such a significant jump. 

Revitalizing Existing Infrastructure Is the Only Solution

The necessity of revitalizing existing infrastructure to meet AI demands isn’t lost on operators. For many, modularization is the answer to the rising urgency to retrofit. A modular solution like data center cages can not only protect critical systems and servers but also support air circulation to keep hardware cool and make it easier to scale as more servers are added.

Training or running an AI application, while managing its accompanying big data, requires an alternative cooling method. Augmented air may work for high-density racks. However, open-tub immersion in dielectric fluid or direct-to-chip liquid cooling is ideal for delivering coolant directly to hot spots without contributing to uneven power loads.

Operators should also consider increasing cooling efficiency by raising the aisle temperature a few degrees. Most IT equipment can tolerate a slight elevation from 68-72°F to 78-80°F as long as the temperature remains consistent. Minor improvements matter because they contribute to collective optimization.

Alternative power sources and strategies are among the most important infrastructure considerations. Optimizing distribution to minimize electricity losses and improve energy efficiency is critical when AI requires anywhere from 20 kW to 100 kW per rack. Eliminating redundancies and opting for high-efficiency alternatives is necessary. 
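The sketch below illustrates why distribution losses that were easy to ignore at conventional densities become material at AI densities. The 5% loss figure is an assumption for the sake of the example, not a measured value.

```python
# Illustrative only: how a fixed distribution-loss percentage scales with
# rack density. The 5% combined UPS/PDU/cabling loss is an assumed figure.

LOSS_FRACTION = 0.05

for rack_kw in (5, 20, 40, 100):
    wasted_kw = rack_kw * LOSS_FRACTION
    print(f"{rack_kw:>3} kW rack -> ~{wasted_kw:.2f} kW lost before reaching the servers")
```

Per rack, the absolute losses grow twentyfold across that range, so distribution choices that barely mattered before now show up directly in the facility’s power budget.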

Can Data Centers Adapt to AI or Will They Be Left Behind?

Data center operators may well read AI’s surging demand as a sign to overhaul most of their existing systems as soon as possible. Many will likely shift from conventional infrastructure to modern alternatives. However, tech giants running hyperscale facilities will have a much easier time modernizing than most. For others, retrofitting may take years, although the effort will be necessary to stay relevant in the industry.
