NVIDIA consensus shows AI requires a lot of energy: Investing.com by Barclays

In a recent thematic investment report, Barclays analysts discussed the energy demands accompanying the rise of artificial intelligence (AI) technologies, with a particular focus on Nvidia's role in this landscape.

According to analysts, the projected energy requirements associated with AI advancements underscore a key aspect of NVIDIA’s market outlook.

Barclays’ analysis shows that data centers could consume more than 9% of current US electricity demand by 2030, driven primarily by AI power requirements. Analysts said that “AI power included in the NVIDIA consensus” is one of the key factors behind this substantial energy forecast.

The report also points out that while AI efficiency continues to improve with each new GPU generation, the size and complexity of AI models are growing rapidly. For example, leading large language models (LLMs) are growing by about 3.5 times per year.

Despite these improvements, overall energy demand is set to increase due to the growing scope of AI applications. Each new generation of GPUs, such as NVIDIA’s Hopper and Blackwell series, is more energy-efficient. Still, larger and more complex AI models require substantial computational power.
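The tension described above can be sketched numerically. The 3.5x-per-year model-growth figure comes from the report; the 2x-per-year efficiency gain below is a purely hypothetical assumption for illustration, not a figure from Barclays or NVIDIA:

```python
# Back-of-the-envelope sketch of the report's argument: even if each GPU
# generation is more efficient, total energy demand can still rise when
# model size grows faster than efficiency improves.

MODEL_GROWTH_PER_YEAR = 3.5      # leading LLM size growth (per the Barclays report)
EFFICIENCY_GAIN_PER_YEAR = 2.0   # hypothetical efficiency improvement (assumption)

def relative_energy_demand(years: int) -> float:
    """Energy demand relative to today, if compute scales with model size."""
    return MODEL_GROWTH_PER_YEAR ** years / EFFICIENCY_GAIN_PER_YEAR ** years

for y in range(1, 4):
    print(f"year {y}: {relative_energy_demand(y):.2f}x today's energy demand")
# year 1: 1.75x today's energy demand
```

Under these assumptions, demand still grows 1.75x per year net of efficiency gains, which is the qualitative point the report makes.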

“Large language models (LLMs) require a lot of computational power to achieve real-time performance,” the report reads. “The computational demands of LLMs also translate into higher energy consumption as more and more memory, accelerators, and servers are needed to fit, train, and extract inference from these models.”

“Organizations aiming to deploy LLMs for real-time inference will need to address these challenges,” Barclays said.

To illustrate the scale of this energy demand, Barclays estimates that running about 8 million GPUs would require about 14.5 gigawatts of power, equivalent to about 110 terawatt-hours (TWh) of energy. This forecast assumes an 85% average load factor.
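The power-to-energy conversion behind this estimate is straightforward to check; the figures below are the report's own (14.5 GW, 85% load factor):

```python
# Reproducing Barclays' estimate: GPUs drawing ~14.5 GW at an 85%
# average load factor, running for one year.

POWER_GW = 14.5          # estimated aggregate GPU power draw (per the report)
LOAD_FACTOR = 0.85       # average load factor assumed in the report
HOURS_PER_YEAR = 8760

energy_twh = POWER_GW * LOAD_FACTOR * HOURS_PER_YEAR / 1000  # GWh -> TWh
print(f"{energy_twh:.0f} TWh per year")  # ~108 TWh, matching the ~110 TWh cited
```

The result (~108 TWh) agrees with the roughly 110 TWh figure after rounding.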

Nearly 70% of these GPUs are expected to be deployed in the US by the end of 2027, equivalent to more than 10 gigawatts and 75 TWh of AI power and energy demand in the US alone over the next three years.
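Applying the ~70% US share to the totals above reproduces these US figures (the report's 75 TWh suggests it rounds slightly differently):

```python
# US share of projected AI power and energy demand (figures from the report).

TOTAL_POWER_GW = 14.5    # global GPU power draw estimate
TOTAL_ENERGY_TWH = 110   # global annual energy estimate
US_SHARE = 0.70          # share of GPUs expected in the US by end of 2027

us_power_gw = TOTAL_POWER_GW * US_SHARE      # ~10.2 GW ("more than 10 gigawatts")
us_energy_twh = TOTAL_ENERGY_TWH * US_SHARE  # ~77 TWh (report cites 75 TWh)
print(f"US: {us_power_gw:.1f} GW, {us_energy_twh:.0f} TWh")
```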

“NVIDIA’s market cap suggests this is just the beginning of AI power demand deployment,” the analysts said. The chipmaker’s ongoing development and deployment of GPUs is expected to significantly increase energy consumption in data centers.

In addition, data centers' reliance on grid power underscores the importance of managing peak demand: they operate continuously and require a steady, balanced power supply.

The report cites a notable statement made by OpenAI CEO Sam Altman at the Davos World Economic Forum: “I think we need more energy in the world than we’ve ever seen before… I think we still don’t understand the energy requirements of this technology.”




Disclaimer: The content in this article is for educational and informational purposes only.
