The Biggest AI Data Center Stories That Shaped 2024

There has been no shortage of predictions over the past several years about how AI will affect the data center industry and how data centers will help shape the evolution of AI. We've been told, for example, that AI will lead to a surge in energy consumption by data centers and spur major new data center construction efforts.

Over the course of the past year, it has started to become clearer which of these predictions and projections will play out. We now have evidence, for example, of how AI is contributing to data center company profitability, as well as some of the challenges (like chip shortages) the industry faces in deploying AI workloads at scale.

At the same time, however, as-yet-unproven projections and predictions about AI's impact on the data center industry continued to abound in 2024. There remain many opinions about what's coming next – like new "edge AI" architectures – and little in the way of evidence demonstrating that those opinions are correct.

For full details on these and other data center trends involving AI, here's a breakdown of the top Data Center Knowledge stories in this vein from the past year.

A July 2024 report from Moody's about the expansion of data center capacity in response to the AI boom was notable not because it predicted the seemingly obvious – that companies will build more data centers to accommodate AI workloads over the coming years – but because it quantified the projected impact of that expansion on data center operations. Most notably, Moody's predicted that AI will contribute to a 23% overall increase in data center energy consumption between 2023 and 2028, and that energy use tied to AI workloads specifically will grow at an annual rate of 43% over the same period.
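To put that 43% figure in perspective, a quick back-of-the-envelope calculation (not from the Moody's report itself, and assuming the rate simply compounds annually over the five years from 2023 to 2028) shows how dramatic sustained growth at that rate would be:

```python
# Illustrative only: compound a 43% annual growth rate over the
# five-year span 2023-2028 to see the cumulative multiplier.
annual_growth = 0.43
years = 5  # 2023 -> 2028

multiplier = (1 + annual_growth) ** years
print(f"Cumulative multiplier after {years} years: {multiplier:.2f}x")
# prints roughly 5.98x
```

In other words, if the projection holds, energy consumed by AI workloads would be nearly six times higher in 2028 than in 2023.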

Related: 2024's Biggest Data Center Construction Stories: A Year in Review

These numbers are predictions, and they may turn out to be wrong. But if you want to know in quantitative terms exactly how AI is poised to impact data center operations, this source is as good as any.

Another report, released in September by the Dell'Oro Group, offered quantitative insight into what AI means for data center spending. It found that data center capital expenditures surged by 46% in just the second quarter of 2024 – a trend that, if it holds, suggests AI will fuel an enormous increase in data center investment and spending in the near future. The growth reflects not just AI hardware purchases, but also the power and cooling systems necessary to support AI devices in data centers.

Related: AI Data Centers Pose Regulatory Challenge, Jeopardizing Climate Goals – Study

BlackRock also chimed in in 2024 with projections for AI's impact on data center growth. Its numbers were less precise, but it predicted that AI data centers will grow in capacity by between 60% and 80% per year over the next several years.

On balance, it's worth noting that the company didn't define exactly what an "AI data center" is, or the extent to which the expansion of data centers in this category will contribute to overall data center capacity. Still, as the opinion of a company whose business is to predict and capitalize on major economic trends, BlackRock's projections about AI's role in data center expansion are significant.

In another data point that at least indirectly correlates with AI's role in data center growth, Equinix, which operates data centers around the world, attributed its 8% increase in revenue this year largely to AI. Not coincidentally, the company is also in the midst of rapidly expanding its data center footprint.

Equinix didn't provide details about how much of its revenue growth was due to AI workloads specifically, and its CEO cautioned that it will take time for the industry to feel the full weight of AI. Still, if you're willing to go out on a limb and assume that correlation implies causation to some extent, it's a reasonable conclusion that the AI boom – which correlates with financial success this year for Equinix – is at least starting to pay dividends for data center operators.

Related: How LLMs on the Edge Could Help Solve the AI Data Center Problem

Another sign of AI's impact on data center business strategies was a pivot by data center operators from facilities that cater to cryptocurrency mining toward ones focused on AI. That's the move Iris Energy described to Data Center Knowledge this year.

This shift makes good sense given that interest in cryptocurrency has generally waned in recent years, and that the same types of infrastructure and devices – like GPUs – that excel at crypto-mining also work well as AI hardware. But the trend is notable all the same because it suggests that, at least to some extent, expanded data center capacity to support AI will come in the form of crypto-mining facilities repurposed for AI, rather than brand-new data centers. In that sense, the repurposing of cryptocurrency data centers for AI workloads may reduce the amount of new data center investment fueled by the AI boom.

Other notable – albeit not exactly objective – opinions about the role of AI in data centers and beyond came this year from the CEOs of Nvidia and Meta. Speaking at SIGGRAPH this summer, the executives discussed, among other topics, how their companies are using AI internally – including to help manage data center operations at Nvidia, according to comments by Nvidia CEO Jensen Huang.

The discussion offered few technical details, so it's difficult to draw takeaways about what the use of AI inside data centers by companies like Nvidia and Meta actually entails, or what it may portend for AI's impact on the way data centers operate. But it's still interesting to hear what these companies – both of which sell AI products, of course, and therefore have an incentive to advance the narrative about AI's increasingly central role in modern businesses – have to say about their internal use of AI.

It's one thing to expand the capacity of data centers for hosting AI workloads. It's another to deploy the actual server infrastructure that supports those workloads – and because of the shortage of high-bandwidth memory (HBM) chips reported this year, there's a risk that the expansion of AI-friendly data center space will outpace the growth of AI-friendly servers. That's because HBM chips are used to manufacture GPUs, which are frequently used for AI training and inference.

This is an example of one of the challenges the data center industry will need to overcome to sustain continued growth in response to the AI boom.

Keeping AI infrastructure cool is another fundamental challenge that could hinder continued data center expansion. That's especially true given the increased frequency and intensity of heat waves. AI chips produce a lot of heat under any circumstances, but dissipating that heat becomes even harder when the ambient temperature surrounding a data center surges due to a heat wave.

This is one reason why innovative data center cooling technologies, which can dissipate heat in energy-efficient ways, are likely to become a key element of continued data center expansion in the age of AI.

So-called edge AI could contribute to strategies for improving data center sustainability in the age of AI. Edge AI means having AI workloads process data at the network edge instead of in centralized data centers. Doing so could reduce energy consumption and improve performance by reducing the amount of data transmission required to deploy AI.

On balance, it's worth noting that AI processes like training tend to require large amounts of energy no matter where they take place – whether at the edge or in a traditional data center – so edge AI is unlikely to reduce energy consumption dramatically. Still, there could be some tangible sustainability benefits thanks to advantages like reduced heat concentration (and, by extension, reduced energy consumption by cooling systems), since edge AI infrastructure doesn't place large numbers of AI chips in close proximity to one another.

To offer a deeper dive into what edge AI might look like in practice, Data Center Knowledge covered one key type of edge AI use case: large language models (LLMs) deployed at the edge. By running LLMs on edge devices like smartphones, businesses can reduce the energy and compute demands that AI places on their data centers. Today, taking advantage of edge devices for this purpose is challenging because most edge hardware is not optimized for LLMs, but that could change as chip manufacturers design more AI-friendly processors for use in devices like smartphones.
