The hidden energy cost of artificial intelligence

Artificial intelligence has slipped quietly into everyday life. We don’t just use AI anymore; we encounter it constantly. It appears when we search online, shop, navigate traffic, or let software autocomplete our thoughts. Even when we are not consciously engaging with AI, it is already working in the background.
There is no doubt that AI holds enormous promise. It can improve productivity, help solve complex problems, and entertain us. I am not an AI skeptic. But the speed at which AI is being adopted has a near-term consequence that deserves far more attention: electricity demand.
Data centers, the physical backbone of AI, consume vast amounts of power. According to research by the Lawrence Berkeley National Laboratory, US data centers used roughly 176 terawatt-hours of electricity in 2023. That is comparable to the annual electricity use of around 16 million average American households, roughly the combined total of New York, Los Angeles, and Chicago.
What is more concerning is the trajectory. Driven largely by AI, data center electricity use could triple within just a few years. That pace of growth is unprecedented.
India sits at an interesting and often overlooked intersection in this conversation. It operates under many of the same technological, demographic, and governance pressures that advanced economies now face.
India has leapfrogged in digital public infrastructure, from Aadhaar and UPI to large-scale digital service delivery, creating AI-ready datasets and platforms at a scale unmatched by some developed nations. At the same time, its vast workforce and developmental priorities mean that AI adoption is viewed less as a productivity enhancer alone and more as a tool of inclusion and efficiency.
In India, AI expansion is naturally tempered by energy availability, which also incentivises efficiency, renewables, and smarter grid management from the outset. In effect, India is being forced to confront a question that developed economies postponed: how to align digital ambition with energy discipline rather than excess.
Power plants and transmission lines cannot be built fast enough to keep up. Many regions already operate with thin margins during peak demand, leaving them vulnerable to shortages and price spikes. Electricity prices have been rising, and this surge in demand will only add pressure—along with increased emissions where fossil fuels remain part of the power mix. This challenge is not limited to the United States; advanced economies around the world face similar constraints.
Every interaction with an AI system consumes energy. Each prompt triggers computation, and each response requires electricity. Individually, these actions seem trivial, but at scale they add up quickly. Many users, and the platforms themselves, treat AI as effectively costless, encouraging unnecessary back-and-forth, vague queries, and redundant interactions.
This casual overuse reflects a broader blind spot in how we engage with digital tools: we rarely see the infrastructure behind our answers. The issue is not etiquette but efficiency, that is, how to avoid wasted computation by being more deliberate about when, why, and how we use AI. This is not an argument for being rude to machines, nor a plea to abandon politeness.
We have learned this lesson before. Over the past several decades, households adjusted their behavior to use energy more wisely. We replaced inefficient light bulbs, turned off lights in empty rooms, and bought appliances that consumed less power. These changes did not make life worse; they saved money and reduced strain on shared infrastructure. AI now needs a similar mindset.
The tech industry briefly elevated prompt engineering, the art of asking better questions, as a specialized skill. While the role itself may evolve, the principle remains sound: clearer, more precise prompts reduce unnecessary back-and-forth and wasted computation. Most people, however, are never taught how to use AI tools efficiently, only how to use them more often.
Small behavioral changes can help. When an interaction is finished, there is no need for an additional exchange that exists purely for closure. Asking better questions upfront avoids repeated queries. Being thoughtful about when AI is actually useful prevents mindless overuse. No one needs to stop using AI. We simply need to stop using it carelessly.
Responsibility also lies with the companies building and deploying these systems. Efficiency should be a core design goal, not an afterthought. Public commitments to reduce the energy intensity of AI, through more efficient models, better infrastructure, and smarter deployment, can slow the growth of data center demand and ease pressure on electricity grids.
A little restraint, practiced widely, can make a real difference. Turning off a single light seems insignificant, until millions of people do it. AI is here to stay. The question is whether we learn to live with it wisely. Politeness is a virtue. But in an energy-constrained world, thoughtful efficiency may matter even more.
Aashiva Kaul is a third-year law student at Gujarat National Law University and Secretary of its Centre for Women and Child Rights. Sanjay Kaul is Professor of Engineering Technology at Fitchburg State University, Fitchburg, MA; views are personal