- October 11, 2023
- allix
- Research
Artificial intelligence (AI) could speed up coding, make driving safer, and streamline daily tasks. But a recent commentary in the journal Joule, written by the creator of Digiconomist, argues that with widespread adoption, AI could develop a substantial energy footprint, one that may eventually exceed the power demands of entire countries.
According to Alex de Vries, the surging demand for AI services makes a substantial increase in AI-related energy consumption increasingly likely in the years ahead.
Since 2022, generative AI, including tools such as OpenAI’s ChatGPT, has grown exponentially. Training these models consumes substantial energy because they must process vast amounts of data. For instance, Hugging Face, a New York-based AI company, reported that its multilingual text-generating AI tool consumed approximately 433 megawatt-hours (MWh) during training, enough to power 40 typical American households for a year.
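The household comparison is simple arithmetic. A minimal sketch, assuming an average annual US household consumption of roughly 10.8 MWh (an assumed figure, not stated in the article, but in line with commonly cited US averages):

```python
# Illustrative check: how many US households could 433 MWh power for a year?
training_energy_mwh = 433    # reported training energy for Hugging Face's model
household_annual_mwh = 10.8  # assumed average annual US household consumption

households = training_energy_mwh / household_annual_mwh
print(round(households))  # prints 40
```

With that assumed household figure, the reported 433 MWh works out to about 40 households, matching the article's comparison.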
AI’s energy consumption isn’t confined to the training phase. De Vries’ analysis reveals that when these tools are in operation, generating content in response to prompts, each generated text or image requires a substantial amount of computational power and, consequently, energy. For instance, the daily energy consumption of ChatGPT could be as high as 564 MWh.
While companies worldwide are working to make AI hardware and software more efficient and thus less energy-intensive, de Vries notes that gains in efficiency often lead to greater overall demand, a phenomenon known as Jevons’ paradox.
For instance, Google has started integrating generative AI into its email service and is experimenting with powering its search engine using AI. With the company conducting up to 9 billion searches daily, de Vries estimates that if every Google search were to involve AI, it would necessitate approximately 29.2 terawatt-hours (TWh) of power annually, equivalent to Ireland’s yearly electricity consumption.
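The Google-search scenario can be reproduced with back-of-the-envelope arithmetic. A minimal sketch, assuming roughly 8.9 Wh per AI-assisted search (an assumed per-search figure, chosen so the numbers reproduce de Vries’ estimate; the article does not state it):

```python
# Back-of-the-envelope check of the Google-search scenario.
searches_per_day = 9e9    # up to 9 billion searches daily
wh_per_ai_search = 8.9    # assumed energy per AI-assisted search, in Wh

daily_wh = searches_per_day * wh_per_ai_search
annual_twh = daily_wh * 365 / 1e12  # 1 TWh = 1e12 Wh
print(f"{annual_twh:.1f} TWh/year")  # prints 29.2 TWh/year
```

At that assumed per-search cost, the annual total comes to about 29.2 TWh, on par with Ireland’s yearly electricity consumption as the article notes.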
While this extreme scenario may not materialize in the short term due to the high costs associated with additional AI servers and supply chain bottlenecks, de Vries expects the production of AI servers to surge in the near future. By 2027, global AI-related electricity usage could rise by 85 to 134 TWh annually, based on AI server production projections.
This amount is comparable to the yearly electricity consumption of countries like the Netherlands, Argentina, and Sweden. Furthermore, improvements in AI efficiency might enable developers to repurpose computer processing chips for AI applications, potentially further boosting AI-related electricity consumption.
De Vries emphasizes, “The potential growth underscores the importance of using AI judiciously. It’s energy-intensive, and we shouldn’t deploy it in contexts where it’s unnecessary.”