- October 11, 2023
Artificial intelligence (AI) holds the potential to expedite coding, enhance driving safety, and streamline daily tasks. But a recent commentary in the journal Joule, written by the creator of Digiconomist, argues that with widespread adoption, AI could develop a substantial energy footprint, one that may eventually rival the electricity demands of entire countries.
According to Alex de Vries, the author of the commentary, the surging demand for AI services makes a substantial rise in AI-related energy consumption increasingly likely in the years ahead.
Since 2022, generative AI, including the likes of OpenAI’s ChatGPT, has grown exponentially. Training these models is energy-intensive because they must process vast amounts of data. For instance, Hugging Face, a New York-based AI company, reported that its multilingual text-generating model consumed approximately 433 megawatt-hours (MWh) during training, enough to power 40 typical American households for a year.
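As a quick sanity check, the training figure and the household comparison above imply a per-household consumption that can be back-derived from the article's own numbers (the comparison to the U.S. residential average is an added observation, not from the commentary):

```python
# Back-of-envelope check using only figures stated in the article.
training_mwh = 433   # reported energy to train the multilingual model (MWh)
households = 40      # households it could reportedly power for a year

mwh_per_household_year = training_mwh / households
print(f"Implied household usage: {mwh_per_household_year:.2f} MWh/year")
# ~10.8 MWh/year, consistent with the commonly cited U.S. residential
# average of roughly 10,000-11,000 kWh per household per year.
```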
AI’s energy consumption isn’t confined to the training phase. De Vries’ analysis shows that when these tools are in operation, each text or image generated in response to a prompt demands significant computational power and, consequently, energy. The daily energy consumption of ChatGPT, for instance, could be as high as 564 MWh.
While companies worldwide are striving to make AI hardware and software more efficient and less energy-intensive, de Vries notes that gains in efficiency often spur greater demand, a phenomenon known as the Jevons paradox.
For instance, Google has started integrating generative AI into its email service and is experimenting with AI-powered search. With the company handling up to 9 billion searches daily, de Vries estimates that if every Google search involved AI, it would require approximately 29.2 terawatt-hours (TWh) of electricity annually, roughly equivalent to Ireland’s yearly electricity consumption.
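The scale of that scenario becomes clearer by back-deriving the implied energy per AI-assisted search from the two figures given above (the comparison to a conventional search is an added reference point, not from the commentary):

```python
# Back-of-envelope check of de Vries' Google scenario, using only
# the figures stated in the article.
searches_per_day = 9e9   # Google searches per day
annual_twh = 29.2        # estimated annual consumption if all were AI-assisted

wh_per_year = annual_twh * 1e12  # TWh -> Wh
wh_per_search = wh_per_year / (searches_per_day * 365)
print(f"Implied energy per AI-assisted search: {wh_per_search:.1f} Wh")
# ~8.9 Wh per search, versus the roughly 0.3 Wh often cited
# for a conventional (non-AI) Google search.
```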
While this extreme scenario may not materialize in the short term due to the high costs associated with additional AI servers and supply chain bottlenecks, de Vries expects the production of AI servers to surge in the near future. By 2027, global AI-related electricity usage could rise by 85 to 134 TWh annually, based on AI server production projections.
This amount is comparable to the yearly electricity consumption of countries like the Netherlands, Argentina, and Sweden. Furthermore, improvements in AI efficiency might enable developers to repurpose computer processing chips for AI applications, potentially further boosting AI-related electricity consumption.
De Vries emphasizes, “The potential growth underscores the importance of using AI judiciously. It’s energy-intensive, and we shouldn’t deploy it in contexts where it’s unnecessary.”