Is there a magic solution for data centers to meet the energy demands of AI?
Artificial intelligence has become a major driver of data center energy consumption.
The influence of artificial intelligence (AI) on data centers and their energy consumption is undeniable, and left unchecked, the situation could worsen. A recent IDC report indicates that, as AI adoption accelerates, the energy required to support these workloads is set to rise dramatically: electricity consumption in data centers is expected to more than double between 2023 and 2028, with AI-driven workloads projected to grow at an astonishing 44.7% annually through 2027, reaching an energy need of 146.2 TWh. This poses a serious challenge, as data centers already account for 46% of corporate energy expenditure and could become unsustainable if the trend is not reversed.
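To put that growth rate in perspective, a back-of-the-envelope calculation shows what 44.7% annual growth compounds to over four years. (The 146.2 TWh figure is from the IDC projection cited above; the 2023 baseline is derived from it here, not reported, so treat it as illustrative.)

```python
# Back-of-the-envelope: what does 44.7% annual growth compound to?
CAGR = 0.447          # 44.7% annual growth in AI-driven workload energy
YEARS = 4             # 2023 -> 2027
TARGET_TWH = 146.2    # projected AI workload energy need in 2027 (IDC)

multiple = (1 + CAGR) ** YEARS
implied_baseline = TARGET_TWH / multiple  # derived, not a reported figure

print(f"Growth multiple over {YEARS} years: {multiple:.2f}x")   # 4.38x
print(f"Implied 2023 baseline: {implied_baseline:.1f} TWh")     # 33.3 TWh
```

In other words, at that rate the energy demand of AI workloads more than quadruples in just four years.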
With AI workloads rising constantly, it is crucial for data centers to adapt quickly and head off a new energy crisis, especially in light of rising electricity prices driven by geopolitical instability in the Middle East. Demand for AI tools has spread across sectors, from healthcare to financial services. Yet it is alarming that an AI-powered search consumes 100 times more energy than its traditional equivalent, and that building a foundation model requires enough energy to power 20,000 homes for six months.
Against this backdrop, a report from Atlantic Ventures, titled "Enhancing Sustainability in Data Centers 2024," proposes a solution. It suggests that next-generation data center architectures, such as hyperconverged infrastructure (HCI), can reduce energy consumption and carbon emissions while generating cost savings in the EMEA region. Modernizing data centers with HCI could save up to 19 million tons of CO2 in the region in just six years, equivalent to the emissions of roughly 4.1 million cars, and cut expenses by €25 billion by 2030 through improved energy and operational efficiency.
As organizations integrate AI into their operations and confront the scale of its energy consumption, they could turn to HCI to mitigate the risk of rising costs and meet sustainability goals. The key, however, lies not in HCI alone but in how organizations manage and optimize the processing of AI workloads. Inferencing deserves particular attention: while training a foundation model is typically a one-off event, inferencing runs continuously and consumes the majority of the energy.
Furthermore, to address increasing energy demands, more data center providers should draw on renewable sources and rethink their infrastructure. Combining cloud resources, edge computing, and on-premises systems offers an opportunity to balance the energy demands of AI through a more efficient distribution of workloads. For example, processing data closer to its origin with edge computing reduces the energy required to move large volumes of data to and from centralized servers.
Modern infrastructure is key to managing the energy demands of AI, and a unified platform that can handle both CPUs and GPUs becomes necessary for running workloads efficiently. Storage also matters, as AI typically deals with unstructured data: by investing in high-performance storage systems and optimized computing stacks, companies can significantly reduce the energy needed to run AI applications.
Ultimately, it is crucial to have tools for measuring and managing energy consumption. Platforms that provide real-time visibility into energy use enable data centers to optimize every stage of processing, from training to inferencing. Even a 10% improvement in energy efficiency can yield significant savings.
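As a rough illustration of that last point, consider the annual electricity bill of a mid-sized facility. The consumption and price below are hypothetical assumptions chosen for the arithmetic, not figures from the reports cited above:

```python
# Hypothetical illustration of the "10% efficiency improvement" claim.
# Facility consumption and electricity price are assumed values,
# not figures from the cited IDC or Atlantic Ventures reports.
annual_consumption_mwh = 50_000   # assumed facility load: 50 GWh/year
price_eur_per_mwh = 150           # assumed average price: EUR 150/MWh
efficiency_gain = 0.10            # the 10% improvement discussed above

annual_bill = annual_consumption_mwh * price_eur_per_mwh
savings = annual_bill * efficiency_gain

print(f"Annual energy bill: EUR {annual_bill:,.0f}")          # EUR 7,500,000
print(f"Savings at 10% efficiency: EUR {savings:,.0f}")       # EUR 750,000
```

Under these assumptions, a single facility recovers three quarters of a million euros a year; multiplied across a fleet of data centers, the figure becomes material.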
Today, the true cost of AI is not limited to performance and innovation; it also includes the energy required to sustain it. As organizations scale up their AI initiatives, the question is not whether they can afford to invest in these technologies, but whether they can handle the energy consumption those technologies require. With hybrid infrastructure and a focus on efficient inferencing, companies can better manage rising energy consumption. Ignoring this reality could leave data centers exposed to an AI-driven energy crisis.