The Energy Cost of the Internet: What the Numbers Actually Say
February 26, 2026
How much energy does the internet use? You’ve probably seen headlines claiming it uses more than aviation, that a single Google search takes enough energy to boil a kettle, or that Bitcoin could save the grid. The real picture is messier, and the numbers depend on what you count and how you measure. Here’s what the data actually says.

What We’re Actually Measuring
The “internet” isn’t one thing. It’s data centers, transmission networks, and the devices we use. When people talk about the energy cost of the internet, they usually mean one or more of: (1) the data centers that run the cloud, big apps, and streaming; (2) the networks (fiber, cellular, undersea cables, and the gear that moves data); (3) end-user devices (phones, laptops, routers, and TVs). Studies often focus on data centers because they’re easier to measure and they’re growing fast. But the full footprint includes networks and devices too, and together those can rival the data center share.
Globally, data centers are estimated to use around 1–2% of total electricity. That share has stayed roughly stable for years despite huge growth in traffic, because efficiency gains—better chips, better cooling, consolidation into hyperscale facilities—have kept pace. So “the internet” in the sense of “the cloud” is a meaningful but not runaway slice of global demand. Add networks and the energy to make and run our devices, and the total digital footprint is larger, but still in the single-digit percentage range of global electricity. The exact number depends on boundaries and assumptions—there’s no single agreed “energy cost of the internet.”
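To put that share in absolute terms, here is a minimal back-of-the-envelope sketch, assuming a global total of roughly 30,000 TWh per year (an order-of-magnitude figure, not a precise one):

```python
# Translate "around 1-2% of global electricity" into absolute terms.
# world_twh is an assumed order-of-magnitude figure, not a measured one.
world_twh = 30_000  # assumed annual global electricity generation, TWh

low_share, high_share = 0.01, 0.02
print(f"data centers: ~{low_share * world_twh:,.0f}-{high_share * world_twh:,.0f} TWh/year")
# -> data centers: ~300-600 TWh/year, comparable to a mid-sized country's demand
```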

Data Centers: Where the Growth Is
Data centers are the most visible part of the story. They run search, email, social media, streaming, and enterprise software. Big operators (Amazon, Google, Microsoft, Meta, and others) publish sustainability reports and have committed to renewable energy and efficiency targets. So the “cost” of a search or a video stream isn’t a fixed number; it depends on the region, the time of day, the mix of generation on the grid, and how the provider accounts for its renewable purchases. A “Google search” served from a data center running on solar at noon is not the same, in carbon terms, as one served from a coal-heavy grid at night.
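To make that grid dependence concrete, here is a small sketch. Both the per-query energy and the grid intensities below are assumptions for illustration; the 0.3 Wh figure echoes Google’s old self-reported estimate, and the intensities are typical published ranges:

```python
# Same query energy, very different carbon, depending on the grid it runs on.
WH_PER_QUERY = 0.3  # assumed energy per search (Google's 2009 self-reported figure)

grid_intensity_g_per_kwh = {
    "solar-heavy grid at noon": 50,   # assumed
    "average mixed grid":      400,   # assumed
    "coal-heavy grid at night": 800,  # assumed
}

for grid, g_per_kwh in grid_intensity_g_per_kwh.items():
    grams_co2 = (WH_PER_QUERY / 1000) * g_per_kwh  # Wh -> kWh, then x intensity
    print(f"{grid:25} {grams_co2:.3f} g CO2 per query")
```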
What we do know is that aggregate data center electricity use has grown, but far more slowly than traffic, thanks to the same efficiency gains noted above: more work per watt, better cooling, and consolidation into fewer, bigger facilities. The worry for the future is AI and large-scale machine learning. Training and serving big models draw a lot of power, and that is a new, fast-growing slice of data center demand. So the energy cost of the internet is rising in part because of AI, even as traditional web and streaming work has become more efficient per unit.
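One standard metric behind the “better cooling” part is power usage effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment. A minimal sketch with assumed, illustrative numbers:

```python
# PUE = total facility energy / IT equipment energy.
# 1.0 would mean zero overhead (no cooling, power conversion, or lighting losses).
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Assumed illustrative facilities, not real measurements:
legacy = pue(total_facility_kwh=2_000, it_equipment_kwh=1_000)      # PUE 2.0
hyperscale = pue(total_facility_kwh=1_100, it_equipment_kwh=1_000)  # PUE 1.1

print(f"legacy: {legacy:.2f}, hyperscale: {hyperscale:.2f}")
# The drop from ~2.0 toward ~1.1 is part of why the share stayed stable as traffic grew.
```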

Networks and Devices
Networks—the cables, switches, cell towers, and routers that move data—also consume energy. That’s harder to attribute to “one search” or “one video,” but it’s part of the total. So are the devices we use. Your phone, laptop, and TV use power when you’re online; manufacturing them has a carbon footprint too. So the full “cost” of using the internet includes not just the server that answered your request, but the path the data took and the device you used. Most headline numbers ignore that and focus on data centers alone.
That said, the split varies by activity, as the sketch below illustrates. For a quick search, the data center does most of the direct work; for an hour of video on a big TV, the screen itself often draws more than the servers and the network combined. So “what the numbers actually say” is this: data centers are on the order of 1–2% of global electricity and have been relatively stable as a share; networks and devices add more; and there is no single “energy cost of a search” or “energy cost of the internet” without defining the boundaries. The numbers are real, but they’re easy to misuse or oversimplify.
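Here is that split for one hour of HD streaming, with assumed component figures in the range of published estimates, not measurements:

```python
# One hour of HD streaming, split across the three layers discussed above.
# Every figure is an assumption for illustration.
components_wh = {
    "data center":       6,   # assumed
    "network (fixed)":  18,   # assumed; cellular can be several times higher
    "device (large TV)": 80,  # assumed; a phone screen would be far lower
}

total_wh = sum(components_wh.values())
for layer, wh in components_wh.items():
    print(f"{layer:18} {wh:3d} Wh  ({wh / total_wh:.0%})")
print(f"{'total':18} {total_wh:3d} Wh per hour")
```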

The “One Search” Myth
You may have seen the claim that “a Google search uses as much energy as boiling a kettle.” That number was never right. It traces to a 2009 newspaper story that overstated a researcher’s estimate and was quickly disputed; Google’s own figure at the time was about 0.3 watt-hours per search, roughly a kilojoule, or around 1/300 of the energy needed to boil a litre of water. A single search is a tiny fraction of data center load; the real cost is spread across billions of queries and shared infrastructure. So don’t feel guilty about searching: the meaningful levers are systemic (how data centers are built and powered), not per-query.
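The arithmetic is easy to check. A minimal sketch, assuming Google’s 2009 figure of about 0.3 Wh per search and basic water-heating physics:

```python
# Compare one search against boiling a kettle, from first principles.
WH_PER_SEARCH = 0.3      # Google's 2009 self-reported figure (assumed indicative)
JOULES_PER_WH = 3_600

# Energy to heat water: E = m * c * dT
mass_kg = 1.0            # one litre of water
c_water = 4_186          # specific heat of water, J/(kg*K)
delta_t = 80             # 20 C -> 100 C

kettle_j = mass_kg * c_water * delta_t    # ~335,000 J
search_j = WH_PER_SEARCH * JOULES_PER_WH  # ~1,080 J

print(f"kettle boil: {kettle_j / 1e3:.0f} kJ")
print(f"one search:  {search_j / 1e3:.1f} kJ")
print(f"searches per kettle boil: {kettle_j / search_j:.0f}")  # ~300
```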

Crypto and AI: The New Heavy Users
Two sectors have drawn particular attention: cryptocurrency (especially proof-of-work mining) and AI. Bitcoin and similar networks use a lot of electricity by design; published estimates put global Bitcoin mining at roughly half a percent of world electricity, concentrated, and still growing, in a handful of regions. AI training and inference are becoming a larger share of data center load. So when people say “the internet” or “tech” is using more energy, much of the growth comes from these two areas, not from your email or Netflix alone. That doesn’t mean the rest is free, but it does mean the story is more nuanced than “the internet is eating the grid.”
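The Bitcoin share is simple division, with the caveat that both numerator and denominator are moving estimates:

```python
# Bitcoin mining as a share of world electricity, order of magnitude only.
btc_twh = 150        # assumed mid-range of published mining estimates, TWh/year
world_twh = 30_000   # assumed global electricity generation, TWh/year

print(f"Bitcoin mining: ~{btc_twh / world_twh:.1%} of world electricity")  # ~0.5%
```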

What Matters for You
If you care about your own impact, the levers are: how much you stream (especially in high definition), how often you replace devices, and whether the services you use run on grids with a lot of renewables. But the big picture is systemic: the internet’s energy cost is dominated by a relatively small number of large data centers and a few tech companies. Individual choices matter at the margin; policy and industry decisions matter more for where the numbers go next. The numbers say the internet is a real and growing part of global energy use—but they don’t say one search or one email is “x” joules without a lot of caveats. Understanding that is the first step to talking about it honestly.
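On the device-replacement lever specifically: most of a phone’s lifetime footprint comes from manufacturing, so keeping it longer spreads that fixed cost over more years. A minimal sketch, assuming an embodied footprint in the range manufacturers report:

```python
# Amortize a smartphone's manufacturing footprint over its service life.
embodied_kg_co2 = 60  # assumed manufacturing footprint of one phone, kg CO2e

for years in (2, 3, 5):
    print(f"kept {years} years: {embodied_kg_co2 / years:.0f} kg CO2e/year from manufacturing")
```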