Quantifying the Electricity Consumption of Generative AI

Last Updated on December 21, 2023 by Editorial Team

Author(s): Prof Donal O'Mahony

Originally published on Towards AI.

Updated: December 11th, 2023. Analysis in the Appendix revised to reflect the doubling of sales estimated by AMD.

Generative AI requires Graphics Processing Units (GPUs), and lots of them. The computations involved in machine learning require manipulating large models with billions of parameters, and the best way to process that load efficiently is to use the thousands of parallel processors that only GPUs can provide.

The last time this surge in demand for GPU compute power happened was in cryptocurrencies. Bitcoin was mined first using GPUs and later with purpose-built chips (ASICs), while Ethereum continued to use vast warehouses of GPUs before eventually ditching its mining process in favor of a method called Proof-of-Stake (PoS).

Although people were aware that Bitcoin mining consumed significant amounts of power, it was not until a team of researchers from Maynooth University in Ireland¹ quantified the energy usage in terms that were easy to understand that the magnitude of the problem became apparent. At the time of the study, Bitcoin mining consumed roughly the same amount of electricity as Ireland did in a year. Since then, its energy consumption has risen to be on par with that of Poland.

In cryptocurrency, it was possible to quantify power consumption because the rate of currency generation is governed by a publicly known 'difficulty' level. In the latest generative AI surge, estimation is harder, since no such measure is available.

Generative AI Infrastructure is Growing Rapidly

There is plenty of anecdotal evidence of the rapid growth in the purchase of AI infrastructure, including reports of Elon Musk's X.AI company buying 10,000 Nvidia GPUs and of Inflection AI building a cluster of 22,000 Nvidia H100s.

There is academic analysis, too, with Alex de Vries' paper² estimating possible power consumption using a bottom-up approach that pieces together news stories and company releases on the topic.

The dominant vendor of GPU hardware is undoubtedly Nvidia, with competition from AMD. Both companies are publicly traded, and each quarter they are required to file detailed reports telling their shareholders and the general public about current sales and projections for the future.

Less visible are the activities of companies whose main product is not GPUs. Google makes its own AI chip, and Microsoft has announced its intention to do the same. ARM has announced moves in this direction, and a slew of startup companies expect to compete in the space. We can only guess at the scale of their activities, though, and have excluded them from this analysis.

Following the Sales Projections

Nvidia, as the dominant producer of chips, is perhaps best placed to forecast future demand, and its sales estimates are worthy of scrutiny.

In Nvidia's presentation of its first-quarter results in May 2023, the company reported total revenues of US$7.192bn, of which US$4.284bn was in the Data Center category. This reflected a very successful quarter, which the company largely attributed to generative AI and the recent release of ChatGPT by OpenAI. Nvidia astonished the stock market by predicting that revenue attributed to the Data Center category for the full year of 2023 would reach US$15bn.

These spectacular results were amplified in quarter 2, when sales in the same category came to $10.3bn, and in quarter 3, when they rose to $14.5bn. AMD's results over the same time frame were much more muted, with just over a billion dollars in sales in each of the three quarters, but the company expects this to change with the launch of its MI300 in quarter 4.

The table below summarizes these results and, with a little bit of guessing on quarter 4, suggests that approximately $50bn worth of GPUs will have been sold in the full year of 2023, the first year after the release of ChatGPT.
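As a rough illustration of how that ~$50bn figure can be assembled from the reported quarters, here is a minimal sketch; the two quarter 4 values are hypothetical placeholders standing in for the "little bit of guessing", not reported results.

```python
# Rough sketch of the 2023 GPU sales estimate (figures in US$ billions).
# Q1-Q3 values are the reported data-center revenues cited above; the Q4
# entries are hypothetical guesses used only to illustrate the ~$50bn total.
nvidia_data_center = [4.284, 10.3, 14.5, 16.0]  # Q4 figure is an assumption
amd_data_center = [1.0, 1.0, 1.0, 2.0]          # "just over a billion" per quarter; Q4 assumed

total_2023 = sum(nvidia_data_center) + sum(amd_data_center)
print(f"Estimated 2023 GPU sales: ~${total_2023:.0f}bn")  # ~$50bn
```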

Looking to the future, Nvidia's quarter 1 report presentation (slide 42) estimated that the total future market for AI Enterprise software is $150bn per annum (10x its 2023 estimates). This estimate was echoed in the second-quarter earnings call from competitor AMD, which stated that it expects "the market for AI accelerators to reach over $150 billion by 2027".

We have added rows to the above table showing linear growth (in dollar terms) from $50bn in 2023 up to the predicted $150bn in 2027. If we assume that the hardware purchased in late 2023 is still operational in 2027, then by that time the installed base of GPUs working on generative AI from these two leading vendors will represent a total spend of $500bn.
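A minimal sketch of that interpolation, assuming equal annual increments of $25bn between the $50bn and $150bn endpoints:

```python
# Linear growth in annual GPU spend (US$ billions) from 2023 to 2027,
# plus the cumulative installed base if all hardware stays in service.
years = range(2023, 2028)
annual_spend = [50 + 25 * i for i in range(5)]  # 50, 75, 100, 125, 150

for year, spend in zip(years, annual_spend):
    print(year, spend)
print("Cumulative spend 2023-2027:", sum(annual_spend), "bn")  # $500bn
```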

What Does That Mean for Energy Usage?

Over the five years, the GPUs sold will be a mix of different generations of hardware. Nvidia began 2023 by shipping H100 cards and was sampling the newer Grace Hopper (GH200) chips in May 2023, with B100s coming sometime in 2024. Likewise, AMD will ship a whole range of Instinct processors (MI100, MI200, MI300) over the same timeline.

For our energy estimation, we assume that all of the GPU spend over the five-year period goes on cards with properties similar to the Nvidia H100. This is simplistic, but while processing power per dollar spent and per kWh expended is likely to rise, the same money will be spent and similar power will be consumed.

H100 cards are priced at US$33,000, although discounting or, indeed, shortages could push the price somewhat above or below that amount.

An H100 has a peak power consumption of 700 W. Operators of small clusters of these devices may see peaks and troughs as loads move between training and inference. The majority of sales, though, will go to hyperscalers and operators of large facilities, and studies⁶ show that, with good scheduling, these are often operated at more than 80% of capacity (and therefore near maximum power consumption) all of the time.

If we now do the following simple calculations:

$1bn of spending on H100s at a cost price of US$33,000 => 30,300 units

30,300 units * 0.7 kW * (365 * 24) hours => 0.185 terawatt-hours (TWh) per year

We can conclude that US$1bn buys 30,300 GPUs, which consume 0.185 TWh per year.
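The arithmetic behind that per-billion figure can be checked in a few lines, using the unit price and peak power assumed above:

```python
# Back-of-the-envelope energy cost of $1bn of H100-class spend.
unit_price_usd = 33_000      # assumed H100 price
peak_power_kw = 0.7          # 700 W per card
hours_per_year = 365 * 24    # run continuously, per the utilisation argument above

gpus_per_billion = 1_000_000_000 / unit_price_usd  # ~30,300 cards
kwh_per_year = gpus_per_billion * peak_power_kw * hours_per_year
print(f"{gpus_per_billion:,.0f} GPUs -> {kwh_per_year / 1e9:.3f} TWh/year")  # ~0.186 TWh
```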

The following table combines these power consumption figures with the sales estimates we outlined earlier.

Generative AI Power Consumption Estimate 2023–2027

We can see that at the end of 2023, the GPUs sold by Nvidia and AMD are consuming 9 TWh per year, which is equivalent to the annual electricity consumption of Kenya according to the most recent (2021) figures from the U.S. Energy Information Administration³. By 2027, the figure has risen by roughly 10x to be nearly on par with the consumption of Kazakhstan, a country of 20 million people.
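For readers who want to reproduce the table, here is a short sketch under the same assumptions (linear sales growth, H100-equivalent hardware, everything sold remaining in service):

```python
# Annual electricity consumption implied by the cumulative installed base,
# at ~0.185 TWh per $1bn of GPU spend.
annual_spend_bn = [50, 75, 100, 125, 150]  # 2023-2027, linear growth assumption
twh_per_billion = 0.185

installed_base = 0
for year, spend in zip(range(2023, 2028), annual_spend_bn):
    installed_base += spend
    print(f"{year}: ${installed_base}bn installed -> {installed_base * twh_per_billion:.1f} TWh/yr")
# 2023: ~9 TWh (comparable to Kenya); 2027: ~93 TWh (comparable to Kazakhstan)
```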

Comparisons to Bitcoin Mining

For comparison, the Cambridge Bitcoin Electricity Consumption Index website⁴ shows that Bitcoin had reached this level of consumption (93 TWh) by early January 2021. At the time of writing (December 2023), Bitcoin's consumption had risen to 156 TWh, a little less than double this figure. Since Bitcoin miners have a very strong incentive to use cheap power, they often resort to locations where energy production is 'stranded', and Bitcoin's proponents argue that this very often means the power production is less environmentally impactful. This is not the case for the energy used by generative AI.

Many argue that the use of such large amounts of power for Bitcoin mining is somehow pointless and that there is very little benefit to mankind from this activity. In contrast, some studies⁵ have indicated that generative AI can boost workers' performance by as much as 40%. If this is indeed the case, then many might consider this energy consumption to be very worthwhile.

Conclusions

We have adopted a fairly simplistic analysis of the present and future power consumption attributable to generative AI. We found that in 2023, the sales of GPU chips for generative AI are likely responsible for 9 TWh of electricity usage per year, equivalent to the consumption of Kenya. If the best sales predictions of industry leaders Nvidia and AMD are correct, this will rise to 93 TWh by 2027, equivalent to the consumption of Kazakhstan. That is still only about 60% of the consumption of Bitcoin mining today. Although the benefits of generative AI will most likely justify such consumption, it is important to keep an eye on this large and growing energy demand to ensure that this is indeed the case.

Appendix: Revised to Include AMD's Updated Forecast of December 6th, 2023

Things don't stand still in AI for very long. On December 6th, AMD held an event entitled Advancing AI⁷, at which it launched its MI300 product. CEO Dr. Lisa Su took to the stage, stating that the company's earlier forecasts of the Total Addressable Market (TAM) for AI accelerators had been too timid: AMD now believes the market will grow at more than 70% per annum, reaching $400bn in sales by 2027, more than double its previous estimate. We have repeated the same analysis using these new figures, and the results are presented below.

Cumulative GPU Sales to 2027, revised for the new AMD forecast

The cumulative sales of GPUs and other AI accelerators are now predicted to rise to over $1 trillion over the five-year period.

GPU Power Consumption to 2027, revised for the new AMD forecast

Translating this into power consumed, we see that the predicted consumption rises to 208 TWh of electricity per year. This is close to the power consumption of a major country like Spain, which has one of the top 20 economies in the world.
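As a sketch of that translation, we can apply the same ~0.185 TWh per $1bn conversion to the larger installed base implied by the revised forecast; the cumulative-spend figure below is an approximation consistent with the revised chart, not a reported number.

```python
# Revised estimate: same conversion factor, larger installed base.
twh_per_billion = 0.185
revised_installed_base_bn = 1_125  # assumption: cumulative spend implied by the revised forecast

print(f"~{revised_installed_base_bn * twh_per_billion:.0f} TWh per year by 2027")  # ~208 TWh
```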

AMD does pride itself on power efficiency, and the product it launched is claimed to deliver performance similar to Nvidia's H100 with twice the power efficiency. This may help the situation, but it is still likely that the chips deployed to serve this total addressable market will be a mix from Nvidia, AMD, Google, Microsoft, Amazon, Intel and others, so the original basis for the estimate still stands.

References

  1. O'Dwyer, Karl J., and David Malone. "Bitcoin mining and its energy footprint." In 25th IET Irish Signals & Systems Conference 2014 and 2014 China-Ireland International Conference on Information and Communications Technologies (ISSC 2014/CIICT 2014), 2014, pp. 280–285.
  2. de Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191–2194.
  3. EIA, U.S. Energy Information Administration, International Energy Dashboard 2017–2021, https://www.eia.gov/international/data/world/electricity/electricity-consumption?pd=2&p=0000002&u=0&f=A&v=mapbubble&a=-&i=none&vo=value&t=C&g=00000000000000000000000000000000000000000000000001&l=249-ruvvvvvfvtvnvv1vrvvvvfvvvvvvfvvvou20evvvvvvvvvvnvvvs0008&s=1609459200000&e=1609459200000&
  4. University of Cambridge, Judge Business School, Cambridge Bitcoin Electricity Consumption Index, https://ccaf.io/cbnsi/cbeci
  5. Somers, M., How generative AI can boost highly skilled workers' productivity, MIT Sloan School of Management, https://mitsloan.mit.edu/ideas-made-to-matter/how-generative-ai-can-boost-highly-skilled-workers-productivity
  6. Hu, Q., Sun, P., Yan, S., Wen, Y. and Zhang, T., 2021, November. Characterization and prediction of deep learning workloads in large-scale GPU datacenters. In Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis (pp. 1–15).
  7. Tom's Hardware, AMD Advancing AI Event Live Blog: Instinct MI300 Launch, Ryzen 8000 "Hawk Point" Expected, December 6th, 2023, https://www.tomshardware.com/live/news/amd-advancing-ai


Published via Towards AI
