The Sustainability of the AI Revolution
Artificial intelligence (AI) is radically changing industries and everyday life. As AI models grow more advanced, they place an increasingly heavy strain on the energy supplies they depend on. Data centers are already producing more emissions as a direct result of the expansion of AI (Stackpole, 2025), and data centers' energy consumption is predicted to double between 2022 and 2026. All of this is occurring amid an energy crisis, as much of the Western world shifts to "cleaner" energy sources (Çam et al., 2024). However, not all AI models are equal in efficiency or energy usage, and solutions are being drawn up to prevent and rectify looming problems.
AI currently represents a relatively small share of global energy usage, around 1-2%, but this figure is predicted to rise sharply, with some estimates reaching 21% by 2030. The growth is largely fueled by newer AI models, such as OpenAI's popular Generative Pre-trained Transformer 4 (GPT-4), which uses more energy than earlier, more rudimentary iterations such as GPT-3 (Stackpole, 2025). Many AI models, including GPT-4, are now expanding their image and video generation capabilities. Generating a single image requires roughly as much energy as fully charging a smartphone, whereas earlier generations of AI could only produce text, which is far less energy-intensive: generating "text 1,000 times only uses as much energy as 16% of a full smartphone charge" (Heikkilä, 2023).
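The ratio implied by these two figures can be worked out directly. This back-of-the-envelope sketch uses only the numbers cited above (Heikkilä, 2023), measured in fractions of a full smartphone charge:

```python
# Comparing image generation with text generation, using the cited figures.
image_cost = 1.0        # one generated image ~ one full smartphone charge
text_batch_cost = 0.16  # 1,000 text generations ~ 16% of a full charge

text_cost = text_batch_cost / 1000        # energy for a single text generation
texts_per_image = image_cost / text_cost  # text generations per image's energy

print(f"One text generation ~ {text_cost:.5f} of a phone charge")
print(f"One image ~ {texts_per_image:,.0f} text generations")
```

By this estimate, a single generated image costs about as much energy as roughly 6,250 text generations.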
Additionally, assuming current AI trends continue, Nvidia expects to ship more than 1.5 million AI server units, i.e., servers built around chips specially designed for AI processing, annually by 2027. Made up of multiple computer chips and semiconductors, these 1.5 million servers would use 85.4 TWh of energy annually, exceeding the consumption of entire countries. Search engines have also begun to incorporate generative AI to answer queries. If Google were to use generative AI, with its billions of parameters, to answer all nine billion daily queries, as it has begun to do on a smaller scale with its Gemini model, it would use more energy than the entire country of Ireland (Leffer, 2023).
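The scale of the Nvidia projection becomes clearer when reduced to a per-server figure. This is a rough derivation from the two numbers cited above (Leffer, 2023), assuming only that the servers run year-round:

```python
# Implied per-server energy from the projection: 1.5 million AI servers
# drawing 85.4 TWh per year in total.
servers = 1_500_000
total_twh = 85.4
hours_per_year = 365 * 24  # 8,760 hours

per_server_mwh = total_twh * 1e6 / servers           # TWh -> MWh per server
avg_draw_kw = per_server_mwh * 1e3 / hours_per_year  # MWh -> kWh, then per hour

print(f"~{per_server_mwh:,.1f} MWh per server per year "
      f"(average draw ~{avg_draw_kw:.1f} kW)")
```

That works out to an average continuous draw of roughly 6.5 kW per server, several times the peak draw of a typical household circuit.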
This dramatic increase in energy usage not only forces data centers to spend more on electricity but also to consume vast amounts of increasingly scarce water to prevent overheating. ChatGPT alone handles more than 200 million requests daily, requiring about 500,000 kWh of energy, equivalent to the daily usage of around 180,000 American homes. To keep everything running, servers must be cooled with water, and each kilowatt-hour of energy used requires 12 L (over three gallons) of water (Gordon, 2024). Data center costs are expected to rise by 50% just to pay for cooling; with other compounding costs, some higher estimates indicate that data center costs will increase tenfold in the coming years due to AI (Leffer, 2023).
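The daily water bill implied by these figures is straightforward to estimate. This sketch multiplies the two cited numbers (Gordon, 2024), with a standard US gallon conversion added for scale:

```python
# Rough daily cooling-water estimate: ~500,000 kWh per day for ChatGPT,
# at ~12 L of cooling water per kWh.
daily_energy_kwh = 500_000
water_per_kwh_l = 12
litres_per_gallon = 3.785  # US liquid gallon

daily_water_l = daily_energy_kwh * water_per_kwh_l
print(f"Daily cooling water: {daily_water_l:,} L "
      f"(~{daily_water_l / litres_per_gallon:,.0f} gallons)")
```

Taken at face value, that is about six million liters, or over 1.5 million gallons, of cooling water every day.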
Recently, a plethora of new AI models has emerged, each differing in design and energy usage. One model that has garnered much attention is the Chinese-owned DeepSeek, controversial for its refusal to answer questions that might embarrass the Chinese government, but unique for other reasons as well. Many indications show that DeepSeek is significantly cheaper than most of its competitors because it runs on smaller, older, and less expensive chips than the larger processors used by companies such as OpenAI. Despite the cheaper hardware, DeepSeek has been shown to generate high-quality answers on par with those of its counterparts (Laws, 2025).
Some have suggested that DeepSeek also uses energy more efficiently during training and inference, but recent data collected by researchers at the Massachusetts Institute of Technology (MIT) suggest the truth is more complicated. When researchers prompted it to discuss whether it is ever morally permissible to lie, DeepSeek responded with a thousand-word essay, consuming 17,800 joules, roughly the energy needed to watch a 10-minute YouTube video. The study mainly compared DeepSeek with Meta's Llama 3.1 model, which used 41% less energy on the lying prompt. The gap widened as the researchers worked through 40 separate prompts of varying complexity: on average, Meta's model used only 512 joules per response, while DeepSeek consistently used upwards of 87% more, albeit usually producing longer responses (O'Donnell, 2025).
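The percentages reported in the MIT comparison can be converted back into joules. This sketch derives the implied figures from the numbers cited above (O'Donnell, 2025); the derived values are estimates, not measurements from the study itself:

```python
# Converting the reported percentages into absolute energy figures.
deepseek_lying_j = 17_800                       # DeepSeek, essay on lying
llama_lying_j = deepseek_lying_j * (1 - 0.41)   # Llama 3.1 used 41% less

meta_avg_j = 512                   # Llama 3.1 average joules per response
deepseek_avg_j = meta_avg_j * 1.87 # DeepSeek averaged ~87% more

print(f"Llama 3.1, lying prompt: ~{llama_lying_j:,.0f} J")
print(f"DeepSeek, average response: ~{deepseek_avg_j:,.0f} J")
```

In other words, the 41% figure implies Llama 3.1 spent roughly 10,500 joules on the same prompt, and the 87% figure implies DeepSeek averaged roughly 950 joules per response.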
To save money and resources, AI experts have conducted and begun implementing extensive research on solutions. Dr. Vijay Gadepally, a researcher at MIT's Lincoln Laboratory, has noted that companies will largely be amenable to many of these solutions since "there isn't a huge [capital expenditure] investment you need to cut down on energy emissions" (qtd. in Stackpole, 2025). One option is older analog hardware, which is energy-efficient but too inaccurate for intricate workloads. Another is newer, more efficient hardware, including photonics, which transfers data encoded in light, and 3D-stacked chips, which can store more data more efficiently than their thinner, "planar" counterparts (Bourzac, 2024). Additionally, cutting-edge models built at the University of Virginia incorporate a graph neural network that systematically integrates low-fidelity data, which is cheaper, with high-fidelity data, which is often more accurate; the result is a model that is faster and more energy- and cost-efficient because it does not rely solely on expensive high-fidelity data (University of Virginia Engineering, 2024). This shift from older to cutting-edge technology will require a relatively large upfront investment, but it has been compared to replacing incandescent light bulbs with light-emitting diode (LED) bulbs, which cost more initially but saved on energy (Stackpole, 2025).
Another fix has been in the works at MIT and other universities, where researchers have experimented with capping processor power, i.e., not operating at 100% of the roughly 315-watt power base. They have limited their "training and inferencing" graphics processing units (GPUs) to 150 to 250 watts, "about 60% to 80%" of full capacity, yielding more efficient AI systems. Lowering the power cap also lowers processor temperatures, lessening the need for cooling, a significant cost. An even more involved power-saving method has been applied to "drug discovery" AI models: researchers designed a "speed estimation tool" to run during training, "allowing them to predict end-state accuracy" and decide what redundant and unnecessary data to cut after only 20% of the computation (Stackpole, 2025). According to Gadepally, this makes their models significantly more efficient, potentially "with no impact to the end model" (qtd. in Stackpole, 2025).
Artificial intelligence is a complex, burgeoning industry under constant scrutiny, aimed at improvement, by some of the greatest scientific minds of today. It has posed ethical and economic questions, and it has posed, and will continue to pose, logistical energy challenges. For AI to remain profitable and serve humanity, companies must find and implement solutions to their energy sustainability problems.
References
Bourzac, K. (2024, November 25). Fixing AI's energy crisis. Nature. Retrieved March 3, 2025, from https://www.nature.com/articles/d41586-024-03408-z.
Çam, E., et al. (2024). Electricity 2024: Analysis and forecast to 2026. International Energy Agency. Retrieved March 4, 2025, from https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-bf08-952fdd62118a/Electricity2024-Analysisandforecastto2026.pdf
Gordon, C. (2024, March 12). ChatGPT and generative AI innovations are creating sustainability havoc. Forbes. Retrieved March 3, 2025, from https://www.forbes.com/sites/cindygordon/2024/03/12/chatgpt-and-generative-ai-innovations-are-creating-sustainability-havoc/.
Heikkilä, M. (2023, December 1). Making an image with generative AI uses as much energy as charging your phone. MIT Technology Review. Retrieved March 3, 2025, from https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/.
Laws, L. (2025, February 6). Why DeepSeek could be good news for energy consumption. University of Illinois Urbana-Champaign. Retrieved March 3, 2025, from https://grainger.illinois.edu/news/stories/73489.
Leffer, L. (2023, October 13). The AI boom could use a shocking amount of electricity (S. Bushwick, Ed.). Scientific American. Retrieved March 3, 2025, from https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/.
New AI model could make power grids more reliable amid rising renewable energy use. (2024, October 24). University of Virginia Engineering. Retrieved March 3, 2025, from https://engineering.virginia.edu/news-events/news/new-ai-model-could-make-power-grids-more-reliable-amid-rising-renewable-energy-use.
O'Donnell, J. (2025, January 31). DeepSeek might not be such good news for energy after all. MIT Technology Review. Retrieved March 3, 2025, from https://www.technologyreview.com/2025/01/31/1110776/deepseek-might-not-be-such-good-news-for-energy-after-all/
Stackpole, B. (2025, January 7). AI has high data center energy costs — but there are solutions. MIT Sloan School of Management. https://mitsloan.mit.edu/ideas-made-to-matter/ai-has-high-data-center-energy-costs-there-are-solutions