GPT, known as the "Electricity Eater," consumes up to 500,000 kWh per day. Is Tesla's "electricity shortage" prophecy coming true?

Wallstreetcn
2024.03.11 03:42

The daily power consumption of ChatGPT is equivalent to more than 17,000 times the daily electricity usage of an average American household. Jensen Huang: "The ultimate goal of AI is photovoltaics and energy storage!"

Recently, the public has begun to notice that the booming development of generative AI technology requires a significant amount of electricity.

According to a report by The New Yorker on March 9 local time, Alex de Vries, a data expert at the Dutch central bank (De Nederlandsche Bank), estimated that OpenAI's chatbot ChatGPT consumes over 500,000 kilowatt-hours of electricity per day to process around 200 million user requests, equivalent to more than 17,000 times the daily electricity consumption of an average American household.
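As a quick sanity check, the two headline figures are mutually consistent: 500,000 kWh spread across 17,000 households implies roughly 29 kWh per household per day, in line with the commonly cited U.S. average, and it works out to only a few watt-hours per request. A minimal sketch, using only the numbers quoted in the article:

```python
# Back-of-the-envelope check of the reported figures (all values are
# taken from the article; the request count is de Vries' estimate).
chatgpt_kwh_per_day = 500_000        # ChatGPT's estimated daily usage
household_multiple = 17_000          # the "17,000 households" equivalence

# Implied daily usage of one average American household
implied_household_kwh = chatgpt_kwh_per_day / household_multiple
print(round(implied_household_kwh, 1))   # ~29.4 kWh per day

# Energy per request, given ~200 million requests a day
requests_per_day = 200_000_000
wh_per_request = chatgpt_kwh_per_day * 1_000 / requests_per_day
print(wh_per_request)                    # 2.5 Wh per request
```

The per-request figure of about 2.5 Wh is what makes generative AI stand out: it is roughly an order of magnitude more energy than a conventional web search is usually estimated to use.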

Google's search engine also demands a substantial amount of electricity.

De Vries estimates that if Google used generative AI in every search, its annual electricity consumption would rise to about 29 billion kilowatt-hours, surpassing the annual electricity consumption of many countries, including Kenya, Guatemala, and Croatia.

As AI chips continue to evolve, their power requirements are increasing.

For example, Dell Chief Operating Officer Jeff Clarke recently revealed that NVIDIA will launch the Blackwell-architecture B200 in 2025, with power consumption potentially reaching 1,000 W, more than 40% higher than the H100.

Media exposure to these issues echoes concerns raised by some AI industry leaders.

AI Giants Speak Out: Power Shortage Ahead!

Elon Musk recently predicted in a media interview that the AI industry will face a power shortage as soon as next year, saying there "won't be enough power to run all the chips." He continued:

"It's easy to predict that the next shortage will be step-down transformers. You have to power these chips: if the supply comes in at 100 to 300 kilovolts and has to step down all the way to 0.6 volts, that's an enormous step-down."
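Taking Musk's figures at face value, the voltage reduction he describes spans more than five orders of magnitude. A quick sketch of the implied ratio (quoted values, not measurements):

```python
# Step-down ratio implied by Musk's quoted figures.
supply_volts = 300_000    # upper end of the quoted 100-300 kV supply
chip_volts = 0.6          # chip core voltage he cites
ratio = supply_volts / chip_volts
print(f"{ratio:,.0f}:1")  # 500,000:1
```

In practice this reduction happens in stages, through substation transformers, rack power supplies, and on-board voltage regulators, which is why Musk points to transformers as the next bottleneck.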

OpenAI CEO Sam Altman made a similar point at the World Economic Forum:

The amount of energy we need is much more than we previously imagined.

With the rapid spread of generative AI, chips' demand for power has surged. Data shows that a decade ago, the global data center market consumed about 10 gigawatts (10 billion watts) of power; today, 100 gigawatts is common.

Furthermore, according to the Uptime Institute in the United States, by 2025, the share of artificial intelligence business in global data center electricity consumption will skyrocket from 2% to 10%.

Extrapolating the current trend, de Vries projects that by 2027 the AI industry as a whole will consume 85 to 134 terawatt-hours of electricity annually (1 terawatt-hour = 1 billion kilowatt-hours). De Vries lamented that amid so much machine learning, there has been too little human learning:
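The parenthetical conversion can be checked directly: 1 TWh is 10^9 kWh, so the 85-134 TWh projection dwarfs both ChatGPT's current consumption and the 29-billion-kWh Google scenario quoted earlier. A minimal sketch using only numbers from the article:

```python
TWH_TO_KWH = 1e9  # 1 terawatt-hour = 1 billion kilowatt-hours

# De Vries' 2027 projection for the whole AI industry, in TWh/year
low_twh, high_twh = 85, 134
print(f"{low_twh * TWH_TO_KWH:.2e}")   # 8.50e+10 kWh/year at the low end

# For scale, the other figures quoted in the article, in TWh/year:
google_scenario = 29e9 / TWH_TO_KWH            # Google-with-AI scenario
chatgpt_today = 500_000 * 365 / TWH_TO_KWH     # ChatGPT's current run rate
print(google_scenario)                 # 29.0
print(round(chatgpt_today, 2))         # 0.18
```

In other words, even the low end of the 2027 projection is several hundred times ChatGPT's consumption today.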

"In my opinion, from a policy perspective, the only realistic measure in the short to medium term is to require companies to disclose (electricity usage data). It took quite a long time for the cryptocurrency field to reach this point, and artificial intelligence did not achieve this earlier, which disappoints me."

According to the Bitcoin Energy Consumption Index that de Vries compiled earlier, Bitcoin mining now consumes 145 billion kilowatt-hours of electricity annually, exceeding the entire annual electricity consumption of the Netherlands, and the associated power generation emits 81 million tons of carbon dioxide, surpassing the annual emissions of countries such as Morocco.

As AI power demand soars, is the energy sector ushering in a new round of opportunities?

Some strategists argue that the development of AI technology is a tailwind for energy assets:

"More and more people are realizing that large artificial intelligence server clusters will require a significant amount of energy, which is increasing the interest of some investors, expanding their investment scope to include energy fields such as electricity and oil and gas. Nuclear energy is also beginning to receive attention."

NVIDIA CEO Jensen Huang stated plainly in a recent public speech:

"The end goal of AI is photovoltaics and energy storage! If we only consider computers, we would need to burn the energy of 14 Earths."

Altman also expressed a similar view:

"The future of AI technology depends on energy. We need more photovoltaics and energy storage. What we need is nuclear fusion, or large-scale, extremely low-cost solar energy plus energy storage systems on an unprecedented scale."

Challenges also exist!

Take the United States as an example: growth in industries such as clean energy, AI, data centers, electric vehicles, and mining has caused previously stagnant U.S. electricity demand to "take off" again. Yet even though it is hailed as the world's "largest machine," the U.S. power grid seems unable to cope with this sudden change.

Analysts point out that 70% of the U.S. grid's interconnection and transmission facilities are aging, and some regions lack transmission capacity, so the grid needs a large-scale upgrade. Without action, the U.S. will face an unbridgeable domestic supply gap by 2030.