
AI consumes a significant amount of electricity: data centers worldwide, which are needed to train and run AI systems, are estimated to account for 1 to 1.5% of global electricity use. For example, training a single AI model can consume as much electricity as 17 US homes use in a year.
The training phase of AI is more energy-intensive than the inference phase, in which the trained model is deployed to serve users. For instance, training a large language model such as the one behind ChatGPT can consume over 1,000,000 kilowatt-hours (kWh) of electricity, roughly equivalent to flying a plane from New York to London and back.
Estimates suggest that by 2027 the AI sector could consume between 85 and 134 terawatt-hours (TWh) per year, comparable to the annual electricity consumption of a small country such as Belgium.
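The comparisons above can be sanity-checked with simple unit arithmetic. In the sketch below, the average US household figure and Belgium's annual consumption are assumed reference values for illustration, not figures taken from this article:

```python
# Back-of-envelope check of the energy comparisons above.
# Assumed reference values (illustrative, not from the article):
#   - average US household electricity use: ~10,500 kWh per year
#   - Belgium's annual electricity consumption: ~80 TWh

avg_us_home_kwh_per_year = 10_500   # assumption
belgium_twh_per_year = 80           # assumption

# A large-model training run reportedly exceeds 1,000,000 kWh.
training_kwh = 1_000_000
homes_equivalent = training_kwh / avg_us_home_kwh_per_year
print(f"One large training run is roughly {homes_equivalent:.0f} US homes for a year")

# Projected AI-sector demand by 2027: 85-134 TWh per year.
for twh in (85, 134):
    ratio = twh / belgium_twh_per_year
    print(f"{twh} TWh is roughly {ratio:.1f}x Belgium's annual electricity use")
```

Under these assumptions, a single million-kWh training run corresponds to the yearly electricity use of nearly a hundred households, and the projected 2027 sector demand is on the same order as Belgium's national consumption, consistent with the article's framing.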
Some experts believe that renewable energy production will not keep pace with the growing electricity demand of AI systems, potentially increasing greenhouse gas emissions and contributing to climate change.
Several strategies can be employed to reduce the carbon footprint of AI, most notably powering data centers with renewable energy.
For example, Google has pledged to power 100% of its data centers with renewable energy by 2030.