Unsustainable Growth of Generative AI Puts Environment at Risk
Written by Fiona Lee
As of 2025, it’s no secret that AI and its rapid advancement have stirred controversy and uncertainty in our society. Much of that controversy has centered on moral and legal questions, such as debates over identity theft and the loss of autonomy. Behind these issues, however, lies something that affects humans, animals, and all other organisms alike, and that AI has come to shape profoundly: our environment.
Generative AI platforms such as OpenAI’s ChatGPT have arguably advanced far faster than efforts to moderate them, raising growing concerns for the environment. According to the 2024 MIT study “The Climate and Sustainability Implications of Generative AI,” the unbridled growth in generative AI capability and user engagement is producing larger carbon footprints and more heat trapped in the atmosphere, excessive demand for electricity, and a drastic depletion of water and other natural resources. As a noteworthy example, an additional MIT study provides a compelling model comparing the average CO2 emissions of everyday activities to those of developing certain large AI models.

To put this model into perspective, the emissions from one million messages sent to a Large Language Model (LLM) are roughly equivalent to 11,001 miles driven in a gas-powered vehicle, the carbon absorbed by 4.3 acres of U.S. forest in a year, or about 349,258 smartphones charged, according to a blog post from Arbor.
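For curious readers, the sketch below shows how equivalences like these are typically computed: estimate the emissions per message, multiply by the number of messages, and divide the total by a per-activity conversion factor. Every number in the sketch is an assumption chosen only to roughly reproduce the figures above; none of them are taken from the MIT study or the Arbor blog.

```python
# Back-of-the-envelope sketch of how CO2e "equivalences" are derived.
# All factors below are illustrative assumptions, not published figures.

MESSAGES = 1_000_000          # number of LLM messages being compared
G_PER_MESSAGE = 4.3           # assumed grams of CO2e per message
G_PER_MILE = 400.0            # assumed grams of CO2e per mile in an average gas car
G_PER_PHONE_CHARGE = 12.3     # assumed grams of CO2e per smartphone charge
KG_PER_ACRE_YEAR = 1_000.0    # assumed kg of CO2 absorbed per acre of U.S. forest per year

total_g = MESSAGES * G_PER_MESSAGE  # total emissions in grams of CO2e

print(f"Total: {total_g / 1e6:.2f} metric tons CO2e")
print(f"≈ {total_g / G_PER_MILE:,.0f} miles driven")
print(f"≈ {total_g / G_PER_PHONE_CHARGE:,.0f} smartphone charges")
print(f"≈ {total_g / 1e3 / KG_PER_ACRE_YEAR:,.1f} acres of U.S. forest absorbing carbon for a year")
```

With these assumed factors, one million messages works out to about 4.3 metric tons of CO2e, which is in the same ballpark as the mileage, forest, and smartphone figures quoted above.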
While these figures may be daunting, we are not powerless in this new age of artificial intelligence. Rest assured, some AI companies, such as Google, are working to make their technologies not only safer but also less harmful to the climate. In the meantime, consider educating those around you and spreading awareness of AI’s current environmental effects, and think critically before reaching for generative AI to answer questions you could simply look up yourself.