Beyond the Energy Techlash: The Real Climate Impacts of Information Technology
Alarmists claim the tech sector’s carbon footprint is mushrooming out of control. But they wildly misrepresent the facts. Not only is the ICT sector making significant progress in decarbonizing, but ICT is also a powerful technology that enables other sectors to become more energy efficient.
The tech sector is increasingly being implicated in climate change, with critics claiming it consumes massive amounts of energy that are driving up greenhouse gas (GHG) emissions. Wildly inaccurate claims about the energy and carbon footprint of information and communications technologies (ICT) have gained wide circulation. One op-ed in The Guardian went so far as to write that “[to] decarbonize, we need to decomputerize.”
At the same time, energy systems are increasingly going digital, improving efficiencies, and making devices smarter and more connected. Manufacturers are integrating ICT into industrial processes to optimize production and reduce their energy use. Homeowners and commercial building managers use smart technologies and data analytics to ensure energy is consumed when and where it is needed. Intelligent transportation systems use traffic data to reduce congestion, saving commuters time on the road while also reducing fuel use and emissions.
The energy and carbon footprint of the tech sector deserves a clear-eyed discussion. That requires accurate data, both about ICT’s own energy consumption and its impacts on energy use in other sectors. Stories that overstate the impacts of ICT, often by a factor of 10 or more, don’t help. (No, YouTube views of the hit song “Despacito” do not consume more electricity than five African nations.)
Across the entire sector—including the data centers that store and process data, the transmission networks that transfer data through fixed or mobile networks, and the connected devices such as computers and smartphones that exchange information—ICT accounts for about 4 percent of global electricity consumption, and 1.4 percent of global carbon emissions. Like all sectors, ICT faces challenges in its efforts to decarbonize. But it is also a powerful technology that enables other sectors to become more energy efficient. When its impacts on other sectors are included, the use of ICT may yield net reductions in global energy use and carbon emissions, as digital services optimize or replace traditionally non-ICT activities. The question is not whether ICT has energy and environmental costs, but whether those costs can be mitigated, and whether they are offset by the energy and environmental benefits.
This report builds on the Information Technology and Innovation Foundation’s (ITIF’s) 2019 “Techlash” report, which identifies common misconceptions about the tech sector and digital technologies—including their environmental impact. The first section identifies incorrect and misleading statistics about the energy and carbon footprint of ICT, and provides more accurate estimates that have been gleaned from peer-reviewed literature and prominent energy agencies. The next section assesses the energy and carbon footprint of the ICT sector, and steps the sector is taking to decarbonize. The final section discusses the role of digitalization in driving energy efficiency across the buildings, manufacturing, and transportation sectors, and provides a few concrete examples (though not an exhaustive list) of how ICT is being harnessed to improve energy efficiency.
Recent articles have asserted that the tech industry will consume an increasing share of electricity, thereby accelerating climate change. One researcher from the Chinese tech company Huawei warned in 2017 that by 2025, a “tsunami of data” could drive Internet-connected devices to consume up to 20 percent of the world’s electricity, and emit up to 5.5 percent of global carbon emissions. In 2018, an article published in Nature Climate Change had the provocative title “Bitcoin emission alone could push global warming above 2°C.” After “Despacito” became the first song to surpass 5 billion YouTube views in 2018, some news outlets reported that streaming “Despacito” consumed as much energy as 5 African countries. A 2019 report from a French think tank claimed that video streaming alone generated as much GHG emissions as the entire country of Spain.
These sensational claims carry forward a tradition established when the Internet was first coming into widespread use. In 1999, Huber and Mills made a widely cited prediction based on faulty assumptions that “half of the electric grid will be powering the digital-Internet economy within the next decade.” Researchers from the Lawrence Berkeley National Laboratory determined that, as of 2014, the data centers that handle our Internet traffic account for only 1.8 percent of U.S. electricity consumption—a figure that has remained essentially flat since 2008, despite strong growth in data center services.
Getting the numbers right matters because quantitative data informs major business, policy, and personal decisions. Unfortunately, these claims have a lot of shock value, generate sensational headlines, and become widely circulated, often without being critically examined. Inaccurate and misleading numbers draw attention away from legitimate issues regarding sustainability within the ICT sector, and often ignore areas where ICT can be a solution to energy and environmental challenges. They can also generate a level of focus on the ICT sector far out of proportion to its contribution to global energy consumption and GHG emissions, potentially drawing attention and resources away from where they are most needed.
The following section identifies widely cited inaccurate and misleading statistics about ICT’s energy and environmental footprint, and provides more-accurate estimates sourced, where possible, from peer-reviewed literature and prominent energy institutions such as the International Energy Agency (IEA).
Note: The title of this section “Sorry, Wrong Number” is taken from a 2002 study of the same name, which examines widespread but inaccurate energy statistics, including the aforementioned Huber and Mills prediction about the energy consumption of the Internet.
Claim #1: Video Streaming for 30 Minutes Generates the Same Emissions as Driving 4 Miles
- Wrong number: A 2019 report by the Shift Project found that the emissions generated by watching 30 minutes of Netflix (1.6 kg of carbon dioxide (CO2)) are the same as those from driving almost 4 miles. The study found that video streaming alone was responsible for more than 300 million metric tons of CO2 in 2018, equivalent to the entire emissions of Spain.
- Media coverage: New York Post, CBC, Yahoo!, DW, Gizmodo, Phys.org, BigThink, The Guardian, and Thomson Reuters
- Does the claim pass the reasonableness test? No. For this to be true, video streaming (through Netflix alone) would have to consume 370 terawatt hours (TWh) of electricity per year, nearly double the electricity used by all data centers globally (205 TWh).
- A better estimate: IEA and Carbon Brief rebutted the main assumptions of the Shift Project report, finding an average emission rate of 0.04 kg CO2 per 30-minute show, about 50 times less than the Shift Project’s estimate.
When the Shift Project’s report “Climate Crisis: The Unsustainable Use of Online Video” was released in July 2019, its sensational findings quickly garnered media attention. The New York Post ran an article with the headline “Why climate change activists are coming for your binge watch,” and Thomson Reuters published an article titled “How cat videos could cause a ‘climate change nightmare.’” In all, at least eight news outlets cited the Shift Project report, adopting its figures without critical examination. Only the Gizmodo article referenced an earlier peer-reviewed study, which found that Netflix video streaming in 2016 resulted in 11 million metric tons (Mt) of CO2, a far cry from the 300 Mt CO2 reported by the Shift Project. However, the article did not attempt to reconcile the two numbers or explain the disparate estimates.
Had these news outlets consulted the peer-reviewed literature, they could have uncovered a 2014 study authored by researchers from the Lawrence Berkeley National Laboratory and Northwestern University and published in Environmental Research Letters, a top environmental journal. That study found streaming in the United States in 2011 emitted 0.42 kg CO2/hour (or 0.21 kg CO2 for a 30-minute show) on a life cycle basis, including the embodied emissions from the manufacture and disposal of infrastructure and devices. (The Shift Project study included only the electricity-related emissions from viewing, a narrower scope than the 2014 study.) Since 2011, video streaming devices have become more energy efficient, data transmission networks have improved, and the carbon intensity of electricity has declined, so it would be reasonable to expect the emission rate to have declined between 2011 and 2018 as well.
But it appears only one outlet attempted to fact-check the Shift Project’s claim. BigThink, which first reported on the study in October 2019, posted a correction dated January 24, 2020, noting “An earlier version of this article relied on data produced by The Shift Project, as reported by the international news agency AFP. That information remains unverified, and The Shift Project has not answered BigThink’s request for verification or comment.” The article’s headline was also changed to “Online video streaming should go green, say experts.”
In March 2020, IEA analyzed Shift Project’s assumptions and produced its own estimate of the emissions impact of video streaming. According to IEA’s analysis, the Shift Project model implies that 1 hour of Netflix consumes 6.1 kilowatt hours (kWh) of electricity, enough to drive a Tesla more than 30 km. With viewers watching 165 million hours per day, this equates to 370 TWh of electricity per year—800 times larger than the total electricity consumed by Netflix (0.45 TWh in 2019), and nearly double the electricity used by all data centers globally (205 TWh in 2018).
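The arithmetic behind this reasonableness test is easy to reproduce. A back-of-envelope sketch in Python, using the IEA figures just cited:

```python
# Back-of-envelope check of the electricity use implied by the Shift
# Project's model, using the IEA figures cited above.
hours_per_day = 165e6   # global Netflix viewing, hours per day
kwh_per_hour = 6.1      # electricity per viewing hour implied by the model

twh_per_year = hours_per_day * kwh_per_hour * 365 / 1e9  # kWh -> TWh
print(f"Implied electricity use: {twh_per_year:.0f} TWh/year")    # ~370

print(f"{twh_per_year / 0.45:.0f}x Netflix's own use (0.45 TWh)")  # ~820x
print(f"{twh_per_year / 205:.1f}x all data centers (205 TWh)")     # ~1.8x
```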
According to IEA and Carbon Brief, the Shift Project study makes several flawed assumptions.
- Overestimating the bitrate at 24 megabits per second (Mbps). The bitrate for ultra-high-definition 4K video is 16 Mbps, and the global average (weighted by viewing type) is closer to 4.1 Mbps. The error appears to stem from confusing bits with bytes, or possibly uppercase with lowercase: 1 byte equals 8 bits (1 B = 8 b). The Shift Project uses an average bitrate of 3 MBps, that is, 24 Mbps, instead of the more reasonable 3 Mbps (see the sketch after this list).
- Overestimating the energy intensity of data centers by 7- to 18-fold, and data transmission networks by 6- to 17-fold. For example, the Shift Project study assumes an energy intensity of 0.9 kWh/GB, compared with IEA’s estimate of 0.1–0.2 kWh/GB for 4G mobile in 2019.
- Underestimating the energy consumed by viewing devices by 4- to 7-fold. The Shift Project study assumes that viewing occurs only on smartphones and laptops, which are less energy-intensive than televisions.
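The scale of the bits-versus-bytes error is easy to see with a short calculation; the bitrates below are those cited in the list above:

```python
# Effect of the bits-vs-bytes mix-up on the data volume of a 30-minute
# show: 1 byte (B) = 8 bits (b), so 3 MBps is really 24 Mbps.
seconds_per_show = 30 * 60

def gb_per_show(bitrate_mbps):
    """Data volume (GB) of a 30-minute stream at a given bitrate (Mbps)."""
    return bitrate_mbps * seconds_per_show / 8 / 1000  # megabits -> gigabytes

print(f"{gb_per_show(24):.2f} GB")   # 5.40 GB: 3 MBps misread as megabits
print(f"{gb_per_show(3):.2f} GB")    # 0.68 GB: the intended 3 Mbps
print(f"{gb_per_show(4.1):.2f} GB")  # 0.92 GB: global average bitrate
```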
Altogether, IEA analysis found that video streaming consumes 0.12–0.24 kWh of electricity per hour of viewing—25 to 53 times less than the Shift Project estimate—though electricity consumption varies widely by viewing device (a TV consumes 100 times more electricity than a smartphone). Based on this estimate of electricity consumption, IEA found an average emission rate of 0.08 kg CO2/hr, about 50 times less than the Shift Project’s estimate. IEA even created a tool to estimate emissions based on viewing device (smartphone, tablet, laptop, or television), video definition (standard, high, or ultra-high), and the carbon intensity of the electricity mix by country.
In a post dated June 15, 2020, the Shift Project acknowledged the error regarding the bitrate, which is the largest source of discrepancy between its original emission rate and the rate identified by IEA. However, the damage may already be done, and it seems unlikely that the updated estimates will generate as much press coverage or reach as many people as the initial report.
Figure 1: Electricity (left) and emissions (right) per hour of video streaming, according to the Shift Project and IEA
Claim #2: Bitcoin Emissions Alone Could Push Global Warming Above 2°C
- Wrong number: A 2018 study published in the journal Nature Climate Change found that cumulative emissions from Bitcoin mining alone could exhaust the remaining carbon budget to limit warming to 2°C.
- Media coverage: Forbes, ScienceDaily, Phys.org, Digital Information World, Government Technology, Cointelegraph, The Independent (U.K.), CBC, Thomson Reuters, MIT Technology Review, TreeHugger, and The Sociable
- Does the claim pass the reasonableness test? No. The study rests on several flawed assumptions, including holding the energy efficiency of bitcoin mining and the carbon intensity of electricity constant through 2100.
- A better estimate: IEA estimates that Bitcoin mining is responsible for 10–20 Mt CO2 per year, or 0.03–0.06 percent of global energy-related CO2 emissions.
Concerns about the energy consumption of Bitcoin mining date back to before the Nature Climate Change study came out. In December 2017, Newsweek ran an article titled “Bitcoin Mining on Track to Consume All of the World’s Energy by 2020,” and the World Economic Forum published an article titled “In 2020 Bitcoin will consume more power than the world does today.” The energy use of Bitcoin mining was even the subject of a recent Congressional Research Service report.
The Nature Climate Change study thus found fertile ground when it was first released in October 2018. Many outlets may have relied on peer review and the imprimatur of the journal as a substitute for performing their own fact checking. The findings were rebutted in a later article in the same journal, which identified several flawed assumptions. For example, the original study assumed bitcoin transaction growth rates far outside of historical trends. Bitcoin transactions have increased 1.3- to 2.3-fold per year for the past several years. But the original study assumed bitcoin transactions—which totaled 104 million in 2017—would increase to 11 billion by 2020 in the median scenario, a 108-fold increase in just 3 years.
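A quick calculation shows how far outside historical trends that assumption lies:

```python
# Implied compound annual growth in the study's median scenario.
txns_2017 = 104e6   # bitcoin transactions in 2017
txns_2020 = 11e9    # transactions the study assumed for 2020

annual_growth = (txns_2020 / txns_2017) ** (1 / 3)  # compounded over 3 years
print(f"{annual_growth:.1f}x per year")  # ~4.7x, vs. 1.3x-2.3x historically
```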
Additionally, the original study held the electricity efficiency of bitcoin mining equipment and the carbon intensity of electricity constant through 2100. Developing plausible projections is notoriously difficult, but this approach is clearly flawed, as bitcoin mining has evolved rapidly in the last decade. A key metric is the hashrate: the number of hash computations a mining device can perform each second. In a span of just a few years, state-of-the-art bitcoin mining evolved from central processing units (CPUs) in 2009, to more powerful graphics processing units (GPUs) in 2010, to field-programmable gate array (FPGA) hardware in 2011, to application-specific integrated circuits (ASICs) beginning in 2013. The latest ASICs are around 50 million times faster (hashes per second, H/s) and a million times more energy efficient (hashes per joule, H/J) at mining bitcoin than the CPUs used in 2009 (figure 2).
Similarly, the carbon intensity of electricity generation has declined worldwide across all major regions, as the share of electricity from renewables and nuclear has increased. The global average carbon intensity (kg CO2 per kilowatt hour of electricity) declined 9 percent between 2010 and 2018, while the carbon intensity of U.S. electricity fell by 23 percent during the same period. In China, where 60 to 70 percent of bitcoin is mined, the carbon intensity fell by 18 percent.
Recent published estimates of bitcoin’s electricity consumption range from 20 to 80 TWh annually, or about 0.1 to 0.3 percent of global electricity use.
In response to the hype and the headlines, IEA posted a review of the literature and discussed the differences in methodology across the published estimates. IEA concluded that the electricity consumption of bitcoin mining was around 45 TWh in 2018. This finding is consistent with a recent study published in Joule, which estimated annual electricity consumption at 45.8 TWh.
Because energy is such a large share of bitcoin mining costs, many large bitcoin mining centers favor cheap renewable electricity, which typically has low or zero marginal cost, over higher-cost fuel-based electricity. Electricity generation in other bitcoin mining centers is also dominated by renewables. A recent report from CoinShares, an industry consortium, estimates that renewables account for 74 percent of the energy used to mine bitcoins. Based on its own analysis, IEA concluded bitcoin mining is responsible for 10 to 20 Mt CO2, or about 0.03 to 0.06 percent of global energy-related CO2 emissions.
Figure 2: Bitcoin mining efficiency
Claim #3: YouTube Views of “Despacito” Consumed As Much Electricity as Chad, Guinea-Bissau, Somalia, Sierra Leone, and the Central African Republic Combined
- Wrong number: Views of “Despacito,” the first song to surpass 5 billion YouTube views, consumed as much electricity as the countries of Chad, Guinea-Bissau, Somalia, Sierra Leone, and the Central African Republic put together in a single year.
- Media coverage: Financial Times, BBC, Fortune, The Guardian, and Al Jazeera
- Does the claim pass the reasonableness test? No. These countries together consumed about 1 TWh of electricity in 2017. Google’s total electricity consumption—which includes the electricity required to run its YouTube site—was 7.6 TWh in 2017. It strains credulity to assert that a single song was responsible for one-seventh of Google’s total electricity use.
- A better estimate: Ericsson, a telecommunications equipment company, estimated that a typical download of 1 song to a smartphone requires 0.001 kWh, in which case 5 billion downloads would consume 0.005 TWh, a factor of 200 less than the BBC estimate.
It is difficult to fact-check this claim, because it does not appear to be sourced from a published study. The earliest mention of the energy and emissions impacts of “Despacito” appears in a BBC article from October 2018, which references an estimate made by the Eureca project, a European think tank. Despite a lack of transparency in the underlying methodology, the claim was quickly picked up and repeated by multiple outlets. Some of the articles contacted government or university researchers or other experts, but none made an effort to fact-check the “Despacito” claim—all articles accepted the claim as fact and repeated it uncritically.
However, the claim that YouTube views of “Despacito” consumed as much electricity as five African countries is facially absurd, and is easily checked against public sources. The combined electricity consumption of these countries is about 1 TWh in a year. Five billion views of the 4-minute-42-second “Despacito” video equate to 0.4 billion hours, implying that 0.4 billion hours of YouTube viewing consumes 1 TWh of electricity. YouTube reports that its videos are watched one billion hours daily. So for this claim to be true, YouTube viewing across all videos would have to consume 930 TWh in 1 year, more than 4 times the electricity consumed by all data centers globally (205 TWh in 2018), and 123 times more than all of the electricity consumed by Google, YouTube’s parent company (7.6 TWh in 2017). (Google reported that it purchased enough renewable energy in 2017 to match 100 percent of its global consumption, across all data centers and other operations.)
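The full chain of arithmetic can be verified in a few lines:

```python
# Checking the "Despacito" claim against the public figures cited above.
views = 5e9
hours_per_view = (4 * 60 + 42) / 3600        # 4 min 42 s, in hours
despacito_hours = views * hours_per_view      # ~0.39 billion hours
claimed_twh = 1.0                             # five countries' annual use

kwh_per_viewing_hour = claimed_twh * 1e9 / despacito_hours   # implied rate
youtube_hours_per_year = 1e9 * 365            # 1 billion hours watched daily
implied_twh = youtube_hours_per_year * kwh_per_viewing_hour / 1e9

print(f"{implied_twh:.0f} TWh/year for all of YouTube")  # ~930
print(f"{implied_twh / 205:.1f}x all data centers")      # ~4.5x
print(f"{implied_twh / 7.6:.0f}x all of Google")         # ~123x
```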
Ericsson fact-checked the “Despacito” claim in its report “A quick guide to your digital carbon footprint” and published the methodology in a companion white paper. It found that a download of 1 song to a smartphone requires about 0.001 kWh, meaning 5 billion downloads would require about 0.005 TWh, a factor of 200 less than the estimates reported by multiple outlets.
A related claim that “the song’s carbon footprint is roughly the equivalent of the annual emissions of about 100,000 taxis” first appeared unsourced in an opinion piece in the Financial Times in March 2018. That claim was presented without evidence but was later picked up and circulated uncritically by other media. Notably, none of the articles compared YouTube views with the energy and carbon footprint of a physical CD, which is 5 to 8 times more energy and carbon intensive.
Claim #4: Training One Natural Language Processing (NLP) Model Is Equivalent to 300 Round-Trip Flights Between New York and San Francisco
- Misleading number: Using artificial intelligence (AI) to train NLP software—for example, the software that helps Amazon’s Alexa understand what you’re saying and enables machine translation between languages—generates 626,000 pounds of CO2, the equivalent of 315 round-trip flights between New York and San Francisco.
- Media coverage: MIT Technology Review, Vice, S&P Global, Forbes, New Scientist, and AI Now Institute
- Does the claim pass the reasonableness test? Uncertain. The methodology has not been fact-checked by other sources.
- A better approach: Accounting for the benefits and costs of AI in a unified framework.
This estimate of the energy and carbon footprint of AI first appeared on a preprint server on June 5, 2019, and quickly garnered media coverage. The next day, New Scientist ran the headline “Creating an AI can be five times worse for the planet than a car.” In October 2019, S&P Global, a financial services company that provides market analysis, cited the article in a blog titled “AI’s large carbon footprint poses risks for big tech.”
The study estimates the electricity needed to develop and train several off-the-shelf NLP models, and uses the average carbon intensity of electricity in the United States to develop an emissions estimate.
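That method (electricity use multiplied by grid carbon intensity) can be sketched in a few lines. The intensity value below is an illustrative U.S. grid average assumed for this example, not necessarily the study’s exact input:

```python
# Sketch of the study's method: emissions = electricity x carbon intensity.
# The intensity below is an assumed, illustrative U.S. grid average.
us_intensity_kg_per_kwh = 0.45
lbs_per_kg = 2.2046

reported_lbs_co2 = 626_000
tonnes_co2 = reported_lbs_co2 / lbs_per_kg / 1000      # ~284 t CO2
implied_mwh = tonnes_co2 * 1000 / us_intensity_kg_per_kwh / 1000
print(f"~{tonnes_co2:.0f} t CO2 implies ~{implied_mwh:.0f} MWh of training electricity")
```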
The number itself may not be wrong, but it certainly lacks context. It is misleading to present the energy and environmental costs of AI without also identifying the benefit that AI provides. NLP has a wide range of applications: Hitachi is partnering with several U.S. cities to use NLP to analyze hundreds of data points, such as 911 call locations and geotagged social media posts, to create heat maps of areas likely to have elevated levels of criminal activity. Workforce analytics company Kanjoya used NLP to analyze language used in the workplace, such as employee performance evaluations, to identify signs of implicit gender bias, so companies can treat employees more fairly.
Even in the climate and energy context, NLP has multiple applications. Researchers have applied NLP to patent data to understand the solar panel innovation process. NLP has been used to identify climate risks and investment opportunities from public company disclosures. Researchers have also identified potential applications for NLP to inform individual action. For example, apps can extract flight information from a person’s email to predict associated emissions, or analyze data from receipts, empowering customers who wish to reduce their emissions.
Looking at the energy cost of AI in isolation—without addressing the benefits—does not answer the question of whether developing an AI model makes sense.
Nonetheless, the S&P blog article makes some positive contributions, with its authors recommending that AI researchers “should prioritize computationally efficient hardware and algorithms.” A recent Science article makes a similar point, noting that future computing efficiency gains will be driven by new algorithms and software performance engineering. This research and others have launched the field of “green AI,” in which researchers prioritize the development of energy-efficient AI models. Energy-aware algorithms that report their energy consumption could help researchers prioritize algorithmic efficiency going forward.
The examples from the previous section provide snapshots of the energy demand of some digital services. This section looks at the entire ICT sector, including the data centers that store and process data, the transmission networks that transfer data through fixed or mobile networks, and the connected devices such as computers and smartphones that exchange information over a network.
The sector’s energy demand has been relatively stable in recent years, despite rapid growth in data services and connected devices, as equally rapid efficiency improvements have helped moderate the impact of ICT on energy consumption. Current published estimates find that ICT accounts for about 4 percent of global electricity demand and 1.4 percent of global GHG emissions.
But like all sectors, the ICT sector will have to do its part to address global climate change and reduce its own environmental footprint, with the ultimate goal of achieving net-zero emissions. This section examines the energy footprint of data centers, networks, and connected devices; the trends in efficiency that have kept energy growth in check; and steps the sector is taking to decarbonize.
Rapid Growth in Data, Yes. Tsunami, No: The Energy Footprint of the ICT Sector
The demand for data is growing rapidly. IEA uses the phrase “explosion of data” to capture this growth. According to IEA, global annual Internet traffic surpassed the exabyte threshold (10^18 bytes) in 2001, and the zettabyte threshold (10^21 bytes) in 2017. Internet traffic has tripled since 2015, and is expected to double again by 2022. The number of mobile Internet users is expected to increase from 3.6 billion in 2018 to 5 billion by 2025. The number of Internet-connected devices is expected to double from 12 billion in 2019 to 25 billion by 2025. The developing world is leading recent growth in connectivity, as more households and businesses gain access to electricity and the Internet.
This growth has led to concerns that the tech industry will consume an increasing share of electricity, thereby accelerating climate change. One researcher from the Chinese tech company Huawei warned in 2017 that a “tsunami of data” could drive Internet-connected devices to consume up to 20 percent of the world’s electricity, and emit up to 5.5 percent of global carbon emissions, by 2025. The provocative imagery of a tsunami quickly grabbed headlines and was repeated by multiple outlets.
But while there has been and will likely continue to be rapid growth in data services and connected devices, rapid improvements in the efficiency of computing, data centers, and data transmission networks have moderated the impact of the tech sector on energy consumption (figure 3). For example, computing efficiency—the number of computations per kWh of electricity—has doubled about every 1.6 years, a phenomenon known as “Koomey’s law.” The metaphor of a tsunami calls to mind the image of an overwhelming wave crashing through everything in its path. But it fails to convey the full story.
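The compounding implied by Koomey’s law is striking; a minimal sketch (the 2.6-year figure reflects the slower recent pace discussed later in this section):

```python
# Compound effect of Koomey's law: computations per kWh doubling every
# ~1.6 years historically, and every ~2.6 years more recently.
def efficiency_multiple(years, doubling_period):
    """Growth factor in computing efficiency over a span of years."""
    return 2 ** (years / doubling_period)

print(f"{efficiency_multiple(10, 1.6):.0f}x per decade")  # ~76x, historical pace
print(f"{efficiency_multiple(10, 2.6):.0f}x per decade")  # ~14x, recent pace
```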
In 2010, data centers consumed 194 TWh of electricity, about 1 percent of global electricity consumption. Since then, the global installed base of servers has increased by 30 percent; compute instances have increased more than six-fold; data center IP traffic has increased by a factor of 11; and data center storage capacity has experienced a 25-fold increase.
Figure 3: Global trends in Internet traffic, data center workloads, and data center energy use, 2010–2019
However, data center energy consumption has remained roughly flat. Researchers at the Lawrence Berkeley National Laboratory, Northwestern University, and UC Santa Barbara published an updated estimate in the journal Science, finding that data centers consumed 205 TWh in 2018. Since 2010, increased server efficiencies have reduced the power per compute instance. Greater storage-drive efficiencies and densities have reduced storage energy use (kWh per terabyte) by nearly 90 percent. And power usage effectiveness (PUE)—the ratio of energy consumed by a data center across all energy needs, including cooling and power provisioning, to the energy consumed by its servers—declined by 25 percent (figure 4). The trend in declining PUE is driven partly by the shift from traditional data centers to larger and more efficient cloud and hyperscale centers, which take advantage of greater scale to reduce the share of energy going to non-server needs. Overall, the energy intensity of data centers has decreased about 20 percent annually since 2010.
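As a concrete illustration of the PUE metric (the facility energy figures below are hypothetical, chosen only to show the calculation):

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures below are hypothetical, for illustration only.
it_kwh = 1_000_000              # servers, storage, network gear
overhead_kwh = 600_000          # cooling, power provisioning, lighting

print((it_kwh + overhead_kwh) / it_kwh)   # 1.6: typical traditional facility
print((it_kwh + 100_000) / it_kwh)        # 1.1: hyperscale-class overhead
```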
Figure 4: Trends in global data center energy-use drivers
Data transmission networks are also rapidly becoming more efficient. Fixed-line (wired) network energy intensity has halved about every two years, declining from roughly 7 kWh per gigabyte (GB) in 2000 to 0.023–0.06 kWh/GB in 2015. Although less efficient than wired networks, mobile networks have also been making impressive gains, with one study finding the energy intensity of Finnish mobile networks declined from 12.3 kWh/GB in 2010 to 0.3 kWh/GB in 2017, equivalent to a doubling in energy efficiency every 1.3 years. This is driven partly by the transition to newer, more energy-efficient network generations: 4G networks are roughly 50 times more energy efficient than 2G networks. The energy impacts of 5G are still uncertain, though some network infrastructure operators project 5G could be 10 to 20 times more energy efficient per bit transported than 4G within the next decade. One study estimates that, because of its higher efficiencies, a faster rollout of 5G could reduce the cumulative carbon emissions of transmission networks by 500 million metric tons by 2030.
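The 1.3-year doubling period follows directly from the Finnish figures:

```python
import math

# Deriving the ~1.3-year efficiency doubling from the Finnish mobile data.
intensity_2010 = 12.3   # kWh/GB
intensity_2017 = 0.3    # kWh/GB
years = 2017 - 2010

doublings = math.log2(intensity_2010 / intensity_2017)   # ~5.4 halvings
print(f"Efficiency doubled every {years / doublings:.1f} years")  # ~1.3
```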
Across all modes, data transmission networks consumed about 250 TWh in 2019, about 1 percent of global electricity generation. Electricity consumption for data transmission networks is projected to rise to 270 TWh in 2022, as the demand for data increases more rapidly than near-term efficiency improvements.
Past performance is no guarantee of future success, and continued improvements in efficiency may be harder to come by. The doubling in computing efficiency has slowed in recent years, with peak efficiency doubling about every 2.6 years. Advances in parallel computing have continued to drive computing efficiency, but new sources of efficiency will have to be found as Moore’s Law and miniaturization reach physical limits. A June 2020 study published in Science finds plenty of opportunity for continued efficiency gains in software, algorithms, and application-specific hardware, e.g., as software is engineered to minimize compute time. Quantum computing offers the ability to perform some types of calculations orders of magnitude faster than conventional computing.
Data centers have sufficient energy efficiency resources to absorb the next doubling of data services, which is likely to occur in the next three to four years. This finding comports with IEA’s projection that data center energy consumption will remain flat through 2022, even as global gross domestic product (GDP) is projected to increase 14 percent between 2019 and 2022. Efficiency standards such as ENERGY STAR, incentives to shift to cloud computing, and continued reductions in data center PUE can ensure the current efficiency resource is maximized, as new options are being developed. Investment in new technologies—including AI for data center management, ultrahigh density storage, and advanced cooling—can scale up new options so they become available as the current efficiency resource maxes out. More than 50 of the world’s largest tech companies have joined Green Grid, an industry consortium that works to improve Information Technology (IT) and data center energy efficiency by developing energy-use metrics and setting efficiency standards.
“On Track” to Decarbonize: The Carbon Footprint of the ICT Sector
Ongoing improvements in energy efficiency of data centers and networks have led IEA to designate the ICT sector as one of the few sectors that is “On track” for deep decarbonization. In June 2020, IEA released its annual Tracking Clean Energy Progress, which tracks the development and deployment of 46 key energy technologies and end-use sectors that are critical for halting global warming. “Data centers and data transmission networks” represent 1 of only 6 technologies IEA designated as “On track” to meet its Sustainable Development Scenario, based on current efficiency trends. (Within the buildings sector, building envelopes and heating are listed as “Not on track”; heat pumps, cooling, and appliances and equipment are designated as “More efforts needed”; and lighting and data centers and data transmission networks are the only technologies/sectors that are “On track.”)
The carbon footprint of the ICT sector as a whole—including data centers, data transmission networks, and connected devices—was estimated at 730 Mt CO2 in 2015, about 1.4 percent of global emissions. Connected devices such as smartphones accounted for the largest share (395 Mt CO2), followed by data networks (180 Mt CO2) and data centers (160 Mt CO2). Figure 5 compares the GHG emissions of the ICT sector to other industry sectors.
Figure 5: Global Greenhouse Gas Emissions by Industry, 2014
Many ICT companies are decarbonizing their own electricity supply faster than the grid as a whole, and tech companies are often at the forefront of commitments to purchase clean energy. Greenpeace has identified 20 Internet companies—including Facebook, Apple, and Google—that have made 100-percent-renewable-energy commitments. In 2018, Google and Apple purchased or generated enough renewable electricity to match 100 percent of their data center energy consumption. According to IEA, large tech companies have accounted for about half of global procurement of renewables in the last few years (figure 6). The top four corporate off-takers of renewables in 2019 were all ICT companies, led by Google.
Tech companies are also finding novel ways to procure clean energy. In 2017, Microsoft worked out a deal with its local electric utility and Washington state regulators to withdraw from the utility’s service territory so it could purchase cleaner electricity directly from open power markets. And in 2020, Microsoft announced a plan to be carbon negative—across its entire supply chain—by 2030, and to remove its historical emissions across the life of the company by 2050.
Figure 6: Global power purchase agreement volumes by sector, 2009–2019
Many tech companies are meeting their clean energy pledges through power purchase agreements (PPAs), in which they purchase renewable energy credits equal to their energy demand. But matching energy consumption with clean energy in real time is more difficult, as variable generation from wind and solar may not match the electricity demands of data centers. After achieving its 100 percent renewable energy pledge, Google set a new long-term goal for its data centers to be powered by 100 percent clean energy in real time. It is taking steps such as monitoring the real-time carbon intensity of electricity on the grid, so it can shift high computing needs to times when the grid is powered by cleaner electricity. Additionally, Google is supporting clean electricity that can be generated and dispatched on demand to complement intermittent generation from variable renewables such as wind and solar.
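Google has not published its scheduling code, but the core idea of shifting flexible compute to the cleanest hours can be sketched simply. The hourly carbon-intensity values below are hypothetical:

```python
# Hypothetical sketch of carbon-aware load shifting: run deferrable batch
# jobs in the hours when grid carbon intensity (kg CO2/kWh) is lowest.
# The intensity values are invented for illustration.
hourly_intensity = {0: 0.42, 3: 0.38, 6: 0.30, 9: 0.21,
                    12: 0.18, 15: 0.22, 18: 0.40, 21: 0.45}

def cleanest_hours(n):
    """Pick the n lowest-carbon hours for deferrable workloads."""
    return sorted(hourly_intensity, key=hourly_intensity.get)[:n]

print(cleanest_hours(3))  # [12, 9, 15]: midday, when solar output peaks
```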
In February 2020, the ICT industry became the first to develop sectoral targets approved by the Science Based Target Initiative, which helps companies set emissions targets in line with the level of decarbonization required to keep global temperature increase below 2°C. Twenty-nine telecom companies representing 30 percent of mobile connections worldwide have committed to science-based targets, aiming to reduce emissions by 45 percent by 2030, from 2020 levels.
ICT as a Climate Solution: How the Tech Sector Can Enable Smarter Energy Use and Reduce Greenhouse Gas Emissions
ICT is at the heart of many solutions that reduce energy demand and GHG emissions—and a full assessment of the energy and climate impacts of the tech sector should include not just the energy used by ICT but the energy used and saved by ICT use in other sectors. The potential of ICT to reduce GHG emissions is increasingly acknowledged as relevant in intergovernmental policy documents. The European Commission’s recent white paper on AI finds that “AI and digitalisation are critical enablers of Europe’s Green Deal ambitions.”
The use of ICT may yield net reductions in global energy use and environmental impacts, as digital services optimize or replace traditionally non-ICT activities, such as using telework and teleconference to reduce business travel. The Global e-Sustainability Initiative, an industry association focused on meeting sustainable development goals, estimated that the ICT industry currently abates 1.5 times its own carbon footprint—and that could go up to almost 10 times by 2030.
One way ICT can reduce energy use and environmental impacts is through dematerialization—replacing physical goods and in-person activities with equivalent digital services. Email and online news sites have replaced faxes and newspapers; video streaming is replacing DVDs; and smartphone cameras are replacing chemical-based photography. Digital platforms enable telework and e-commerce, both resulting in less commuting and fewer trips to the store, and reducing fuel consumption. ITIF’s report Digital Quality of Life provides a thorough review of how ICT is saving energy and reducing environmental costs through dematerialization.
Another family of solutions involves the use of ICT to drive smarter use of energy—what the American Council for an Energy-Efficient Economy (ACEEE) termed “intelligent efficiency.” Digitalization has the potential to optimize the energy use of many energy-intensive activities—from producing cement to cooling a building—while also improving environmental outcomes.
This section explores the role of digitalization in driving energy efficiency gains across the buildings, manufacturing, and transportation sectors.
Smarter Energy Use in Buildings
Residential and commercial buildings account for one-third of global energy demand and 55 percent of electricity consumption. These numbers are even higher in the United States, where buildings are the largest energy-consuming sector, consuming 71 percent of the nation’s electricity. In U.S. homes and apartments, cooling, heating, and water heating together account for nearly two-thirds of total final energy demand, with lighting, appliances and electronics, refrigeration, and clothes drying rounding out the remainder. Commercial-building energy use is more evenly split between cooling, ventilation, lighting, refrigeration, and office equipment, with each accounting for about 20 percent of the whole.
Digitalization of building energy systems and consumer devices can greatly improve the energy efficiency of buildings by ensuring energy is consumed when and where it is needed, and improving the responsiveness of energy services (e.g., lighting, air conditioning). Sensors enable homeowners and commercial building managers to predict, measure, and monitor the real-time energy performance of buildings, allowing consumers to identify where energy savings can be achieved. Active controls can optimize energy use within a building while also enabling better integration with the power grid. Benefits include lower energy costs for homeowners and building managers, greater consumer choice, improved reliability and resilience, improved integration of distributed energy resources such as rooftop solar, avoided electricity capacity buildouts, and reduced environmental impacts.
Sensors and controllers are platform technologies that can be integrated into all building energy systems to enable greater connectivity and systems-level optimizations. One study found that integrating smart sensors and controls throughout the commercial building stock has the potential to save as much as 29 percent of building energy consumption through high-performance sequencing of operations, optimizing settings based on occupancy patterns, and detecting and diagnosing inadequate equipment operation and installation problems. Smart sensors and controls can enable buildings to reduce their peak electricity load by 10 to 20 percent, e.g., by shifting some energy services to times of day when energy demand is low. The U.S. Department of Energy (DOE) estimated that sensor and control technologies alone could reduce building energy consumption in the United States by 1.7 quads in 2030, generating $18 billion in annual energy savings.
Heating, ventilation, and air conditioning (HVAC) is the largest energy demand in a building, typically consuming 40 percent of a commercial building’s energy. Smart HVAC systems can use sensors and active controls to optimize airflow and conditioning based on data such as occupancy, temperature, humidity, and air pressure. Variable-frequency drives attached to rooftop air conditioning units optimize the supply of airflow, reducing electricity consumption by up to 50 percent. Smart thermostats that help households and building managers monitor and regulate heating and cooling can reduce electricity demand by 15 to 50 percent, depending on the building and control technology.
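A minimal sketch of the occupancy-based control logic behind a smart thermostat, with illustrative setpoints that are not drawn from the studies cited above:

```python
# Illustrative occupancy-based thermostat rule: relax the heating setpoint
# when a space is empty, and pre-condition before occupants return.
# Setpoints are assumed for illustration, not taken from the cited studies.
comfort_c = 21.0
setback_c = 16.0   # deeper setback saves more heating energy

def heating_setpoint(occupied, minutes_until_return):
    """Choose a heating setpoint from occupancy data."""
    if occupied or minutes_until_return < 30:
        return comfort_c   # occupied, or about to be: hold comfort setpoint
    return setback_c       # empty for a while: set back to save energy

print(heating_setpoint(False, 240))  # 16.0: building empty all afternoon
print(heating_setpoint(False, 10))   # 21.0: pre-conditioning before return
```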
These findings accord with a 2017 report from ACEEE, which found U.S. commercial buildings could reduce their annual energy use by 6 to 40 percent, depending on the building type (figure 7). The report identifies a suite of smart technologies, the energy savings for each technology, and the associated capital costs and payback periods. The technologies range from smart thermostats and variable-drive HVAC systems, to smart plugs that turn off devices when they are no longer in use, to advanced and web-based lighting controls. In many cases, the payback time for installing these technologies was on the order of 1 to 3 years.
Figure 7: U.S. commercial building subsector energy savings from smart building technologies
IEA’s 2017 report Digitalization and Energy identifies the global potential energy savings from smarter energy use in buildings. IEA projected electricity use in buildings will nearly double from 11 petawatt hours (PWh) in 2014 to around 20 PWh in 2040. But integration of ICT can moderate that growth, reducing annual electricity use by up to 4.65 PWh, or nearly 25 percent. The active controls needed to make building systems smarter would consume only 275 TWh annually, far less than the energy they save. Cumulative global electricity savings between now and 2040 are roughly 65 PWh, with the United States accounting for 14 PWh. The largest potential savings come from heating, cooling, and lighting, which make up more than 60 percent of total final energy demand in buildings (figure 8).
Figure 8: Cumulative (2017–2040) energy savings in buildings from widespread digitalization, by energy use
In 2019, DOE launched a new research initiative to develop grid-interactive buildings that can provide greater systems-level efficiencies and demand-management services. The initiative is based on the principle that building end uses can be dynamically managed to help meet grid needs and minimize electricity system costs, while meeting occupants’ comfort and productivity requirements. The initiative is also aimed at co-optimizing distributed energy resources such as rooftop solar, battery and thermal energy storage, and combined heat and power with building energy systems. DOE identified four demand-side management services that networked and connected building energy systems can provide to the grid (figure 9); a toy numerical illustration follows the list:
- Efficiency—the ongoing reduction in energy use
- Load Shed—the ability to reduce electricity use for a short period of time, typically during peak demand periods or emergencies
- Load Shift—the ability to change the timing of electricity use, for reasons such as minimizing demand during peak periods, taking advantage of low electricity prices, or reducing the need for renewable curtailment
- Modulate—the ability to balance power supply/demand autonomously on a second to sub-second scale to maintain power quality (e.g., frequency)
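To make the distinction between these services concrete, here is the toy illustration promised above (the demand profile and cap are invented):

```python
# Toy illustration of load shed vs. load shift on a daily demand profile
# (kW per 3-hour block). All values are invented for illustration.
demand = [40, 38, 42, 60, 95, 90, 70, 45]
cap = 80   # peak-period limit requested by the grid operator

# Load shed: curtail demand above the cap; the energy is simply not used.
shed = [min(d, cap) for d in demand]

# Load shift: the same curtailment, but the energy moves to off-peak blocks.
curtailed = sum(d - s for d, s in zip(demand, shed))       # 25 kW-blocks
offpeak = [i for i, s in enumerate(shed) if s < 45]        # blocks 0, 1, 2
shifted = list(shed)
for i in offpeak:
    shifted[i] += curtailed / len(offpeak)

print(max(demand), max(shed), max(shifted))  # 95 80 80: peak demand reduced
```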
Figure 9: Changes in building electricity demand as a result of four demand-side management tools