Never has there been so much speculation about the advancement and permeation of AI into everything we do—work, learn, play, travel and even the decarbonization of our industries. As AI moves from science fiction to reality, this latest, and potentially great, technology wave will bring a significant transformation of systems and processes across all industries.
The difference between AI and the last great technology wave—the Internet—is that the Internet was about communicating. It connected people—to information and to technology that leveraged content. The AI wave is more about machines taking action—creating, predicting, automating and optimizing.
We are embarking on a step change in what’s possible at a previously unthinkable pace. But for all the hype, the reality is that for AI to become more powerful, corresponding computational power and network speed and capacity must be deployed. However, I am seeing many AI pundits claim that the data center architecture needed is already widespread and available. Sorry to disappoint you, but that simply is not the case. For the step change to occur, substantial, dedicated data center capacity will be needed, with much more at the edge—closer to the users and data.
Many trends, both macro and micro, are affecting the data center industry and the ability to support AI as the world evolves and we usher in 2024. Let’s take a look at some of them.
Sustainable operations
In 2024, I see sustainable operations as a core requirement for new capacity. To secure permitting for data centers, many countries and local jurisdictions require that data center designs be as efficient as possible, use water conservatively and operate on renewable utility sources. Restrictive countries like Germany also require that a portion of the heat generated be reused constructively.
More collaboration with local utilities
One of the top concerns about adding data center capacity is a perceived impending shortage of utility power. Data center capacity appears to be growing much faster than utility power is being added, while many renewable projects have been delayed by supply chain, legal, cost and installation-equipment capacity issues. However, adding data center capacity does not have to be a one-to-one relationship with adding utility power: a potential shortage of utility power is only expected during peak use times. I see 2024 as the year when data center operators will collaborate more with their local utilities. Large data centers have multiple utility feeds and long backup power capacity. By cooperating with electric utilities, data center operators can offload a significant portion of demand during periods of high heating or cooling load.
More distributed renewable power
This will also be the year when governments around the world figure out a better way to bring more distributed renewable power onto the grid. A main barrier is that most countries use processes and applications designed for one or two grid connections per year, while requests now number in the hundreds or thousands of distributed renewable connections, resulting in connection queues many years long.
Once we have this renewable power on the grid, most data center operators will have to adjust to the reality that renewable power is intermittent, with capacity tied to when the wind is blowing or the sun is shining. The utilities will try to address this concern with grid storage batteries. However, critical facilities like data centers need electricity service levels that are much more stable. I predict this will be the year when data center operators start adding multiple hours of on-site energy storage. This on-site storage will not only improve resilience but also let operators absorb surplus renewable supply that would otherwise be curtailed. The stored renewable energy can then be used in place of carbon-emitting generators.
Proliferation of edge computing and liquid cooling
Finally, if you have been around the industry for a while, you may have heard of the impending proliferation of edge computing and also liquid cooling. I believe 2024 will be the year when real catalysts exist for both! For liquid cooling, there will be servers with up to 16 GPUs, 2 CPUs and many DPUs for AI applications. Server manufacturers are installing input and output piping for liquid because these servers simply generate too much heat to be efficiently rejected by fans and air. Edge computing will get a boost from the need for AI models close to the user and data, called edge AI. This will be a year of workflow automation, in which AI models at the edge make existing systems and processes more efficient across many industries—transport, manufacturing and medical, for example.
Data centers are the enablers for AI’s promise
The promise of AI is well documented and data center capacity is indeed the enabler of that promise. This will be an exciting year when we will see if data centers rise to meet the AI challenge—I believe they will.
This article was previously published in Forbes.