
Rethinking Concerns About AI’s Energy Use
By Daniel Castro | January 29, 2024
Concerns about the energy used by digital technologies are not new. Near the peak of the dot-com boom in the 1990s, a Forbes article lamented, “Somewhere in America, a lump of coal is burned every time a book is ordered online.”1 The authors of the article, which became widely cited in subsequent years in debates about energy policy, estimated that “half of the electric grid will be powering the digital-Internet economy within the next decade.”2 However, the estimate was wrong, with errors in both its facts and methodology.3 In hindsight, there is no longer any dispute, as the International Energy Agency (IEA) estimates that today’s data centers and data transmission networks “each account for about 1–1.5% of global electricity use.”4
This mistake was not an isolated event. Numerous headlines have appeared over the years predicting that the digital economy’s energy footprint will balloon out of control.5 For example, as the streaming wars kicked off in 2019—with Apple, Disney, HBO, and others announcing video streaming subscription services to compete with Netflix, Amazon, and YouTube—multiple media outlets repeated claims from a French think tank that “the emissions generated by watching 30 minutes of Netflix is the same as driving almost 4 miles.”6 But again, the estimate was completely wrong (it is more like driving between 10 and 100 yards), resulting from a mix of flawed assumptions and conversion errors, which the think tank eventually corrected a year later.7
With the recent surge in interest in artificial intelligence (AI), people are once again raising questions about the energy use of an emerging technology. In this case, critics speculate that the rapid adoption of AI