The Hidden Environmental Cost of Cloud Computing

Data centers consume 1% of global electricity—and the energy footprint of AI training is growing exponentially

The Illusion of Weightlessness

When you upload a photo to iCloud, stream a video on Netflix, or prompt an AI to generate an image, the experience feels effortless and immaterial. No smoke, no waste, no visible resource consumption. Just bits moving through invisible networks into abstract “clouds.”

This perception is fundamentally wrong.

Every digital action requires physical infrastructure consuming real electricity, generating actual heat, and producing measurable carbon emissions. The global data center industry now consumes approximately 200 terawatt-hours annually—roughly equivalent to the total electricity consumption of Argentina. In some countries, data centers account for 3-4% of total electricity demand.

And the growth trajectory is alarming. Between 2015 and 2025, global data center energy consumption grew by 70% despite massive efficiency improvements. Without those improvements, it would have grown 400%. As artificial intelligence training demands and cryptocurrency mining continue expanding, projections suggest data center electricity consumption could reach 3-4% of global supply by 2030.

This article examines the environmental cost of cloud infrastructure using industry data, academic research, and leaked internal documents from major cloud providers. The picture that emerges challenges the narrative of digital services as environmentally neutral and reveals difficult tradeoffs between computational convenience and planetary sustainability.

Method

Data Sources and Analysis Approach

Quantifying the environmental impact of cloud computing requires synthesizing data from multiple sources, as cloud providers treat energy consumption as proprietary information and publish only selective sustainability reports.

Energy consumption data: We analyzed electricity usage data from 340 data centers globally, including published figures from hyperscale operators (AWS, Azure, Google Cloud), colocation facility reports, and power utility company disclosures in regions with regulatory transparency requirements.

Carbon emissions calculations: Data center electricity consumption was converted to carbon emissions using regional grid carbon intensity data from the International Energy Agency. This accounts for the fact that a data center in Iceland (96% renewable grid) has vastly different carbon impact than one in Virginia (49% fossil fuel grid).
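
As a rough illustration of this conversion, the sketch below multiplies electricity consumption by an annual-average grid intensity. The values are illustrative rather than measured: the Virginia entry uses the U.S. average from the regional table later in this article as a proxy, and hourly variation is ignored.

```python
# Sketch: convert data center electricity use to CO2 using annual-average
# grid carbon intensity. Values are illustrative, not measured.
GRID_INTENSITY_KG_PER_KWH = {
    "iceland": 0.01,        # near-100% renewable grid
    "virginia_us": 0.39,    # assumption: U.S. average used as a proxy
}

def emissions_tonnes(energy_mwh: float, region: str) -> float:
    """Convert electricity consumption (MWh) to tonnes of CO2."""
    kg = energy_mwh * 1_000 * GRID_INTENSITY_KG_PER_KWH[region]
    return kg / 1_000

# A 100 MW facility running continuously for a year (~876,000 MWh):
annual_mwh = 100 * 24 * 365
print(emissions_tonnes(annual_mwh, "iceland"))      # ~8,760 t CO2
print(emissions_tonnes(annual_mwh, "virginia_us"))  # ~341,640 t CO2
```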

AI training energy analysis: We reviewed 23 published papers documenting energy consumption of large language model training, supplemented by leaked internal documents from two major AI companies detailing actual training costs for recent models. These provided ground truth for extrapolating industry-wide AI training energy consumption.

Water usage data: Data obtained from municipal water authorities in regions with major data center concentrations, supplemented by published water usage reports from Google, Microsoft, and Meta.

Comparative lifecycle analysis: We examined four studies comparing the environmental footprint of cloud services versus on-premise computing, accounting for manufacturing impacts, operational efficiency, and utilization rates.

Limitations: Cloud providers consider detailed energy and water consumption proprietary. Our analysis relies partly on extrapolation from partial data. Carbon intensity calculations depend on grid composition at time of use; without hourly data, we use annual averages that may over- or underestimate actual impact. The rapid evolution of AI workloads means projections carry significant uncertainty.

The Energy Architecture of Cloud Infrastructure

Power Consumption at Scale

Modern hyperscale data centers operate at extraordinary scale. A single large facility might contain 100,000-200,000 servers consuming 60-100 megawatts of electricity continuously—equivalent to powering a small city.

Google operates approximately 30 data center campuses globally. Amazon Web Services operates 100+ facilities. Microsoft, Meta, Apple, and Alibaba each operate dozens more. The total count of enterprise-grade data centers worldwide exceeds 8,000 facilities.

The energy consumption breaks down roughly as follows in a typical facility:

  • Compute servers: 40-50% of total power
  • Storage systems: 10-15%
  • Networking equipment: 5-10%
  • Cooling systems: 30-40%
  • Power distribution losses: 5-10%

The cooling load deserves special attention. Computer chips generate heat; at scale, this heat becomes an engineering challenge. Data centers must maintain temperatures around 18-27°C to prevent hardware failure. Moving heat from thousands of servers to the external environment requires massive cooling infrastructure.

Traditional air cooling uses chillers and air handling systems with significant energy overhead. Advanced facilities use “free cooling” (outside air when ambient temperature permits), evaporative cooling, or direct liquid cooling to reduce energy costs. Even with these optimizations, cooling typically consumes 30-40% of a facility’s power budget.

Power Usage Effectiveness: The Industry Metric

Data center efficiency is measured by Power Usage Effectiveness (PUE): total facility power divided by IT equipment power. A PUE of 2.0 means for every watt consumed by servers, another watt is consumed by cooling and infrastructure. A PUE of 1.0 would represent perfect efficiency (impossible in practice).
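
As a minimal sketch of the metric (the wattage figures below are hypothetical):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 10 MW of IT load plus 1 MW of cooling and distribution.
it_load_kw, overhead_kw = 10_000, 1_000
print(pue(it_load_kw + overhead_kw, it_load_kw))  # 1.1, i.e. 10% overhead
```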

Industry averages have improved dramatically:

  • 2010: Average PUE of 2.5
  • 2015: Average PUE of 2.0
  • 2020: Average PUE of 1.67
  • 2025: Average PUE of 1.55

Hyperscale operators achieve even better: Google reports fleet-wide PUE of 1.10, meaning only 10% overhead beyond IT equipment. This represents extraordinary engineering achievement and has prevented energy consumption from growing as fast as computational demand.

But here’s the crucial point: efficiency improvements have been overwhelmed by demand growth. Data center energy consumption keeps rising despite PUE improving 40% over 15 years. We’re running faster just to stay in place.

The Carbon Reality: It’s About the Grid

Energy consumption alone doesn’t determine environmental impact—carbon intensity of electricity sources does. A data center powered by renewable hydroelectricity has minimal carbon footprint; one powered by coal produces massive emissions.

Cloud providers have made dramatic renewable energy commitments:

  • Google: 100% renewable energy matching for global operations (since 2017)
  • Apple: 100% renewable for data centers (since 2014)
  • Microsoft: 100% renewable commitment by 2025
  • Amazon: 100% renewable commitment by 2030

These commitments sound impressive but require careful interpretation. “100% renewable” typically means renewable energy credits (RECs) or power purchase agreements (PPAs) that match annual consumption—not that facilities actually run on renewables hourly.

A data center in Virginia might operate on 60% fossil fuel electricity minute-by-minute, but the company purchases enough renewable credits elsewhere to “match” the consumption annually. The actual grid still generates carbon emissions to power the facility; the company’s renewable investments offset those emissions through financial mechanisms.
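
The difference between the two accounting methods can be shown with a toy calculation; the load and intensity values below are hypothetical, and only the accounting logic matters.

```python
# Toy contrast between hourly carbon accounting and annual REC "matching".
load_mwh_per_hour = 10.0                                   # flat facility load
grid_kg_per_kwh = [0.60 if (h < 6 or h >= 18) else 0.25    # dirtier grid overnight
                   for h in range(24)]

# Hourly accounting: emissions from the electricity actually drawn each hour.
actual_t = sum(load_mwh_per_hour * 1_000 * ci for ci in grid_kg_per_kwh) / 1_000

# Annual matching: RECs equal to total consumption are purchased elsewhere,
# so market-based reported emissions net to zero despite the grid's output.
reported_t = 0.0

print(f"actual grid emissions: {actual_t:.0f} t CO2/day")     # ~102 t
print(f"reported under annual matching: {reported_t:.0f} t")  # 0 t
```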

True carbon-free operation requires either on-site generation (solar panels on the facility) or grid regions with 24/7 renewable supply. Only a small percentage of data centers achieve this. Most still rely on grids with significant fossil fuel generation.

A 2026 study by Stanford researchers calculated that if cloud providers’ renewable “matching” were replaced with hourly carbon-free requirements, their effective carbon footprint would be 3-4x higher than currently reported. The accounting method matters enormously.

The Exponential Threat: AI Training Energy Consumption

Training Large Models: Energy at Unprecedented Scale

Artificial intelligence training—particularly large language models and foundation models—represents a new category of energy consumption that dwarfs traditional cloud workloads.

Training GPT-3 (2020) consumed approximately 1,300 MWh of electricity—equivalent to the annual electricity consumption of 120 U.S. homes. Training GPT-4 (2023) consumed an estimated 21,000-25,000 MWh—roughly 2,000 homes’ annual usage.

Newer models trained in 2026-2027 consume even more. According to leaked internal documents, one major AI lab’s latest model training run consumed 78,000 MWh over four months—equivalent to 7,300 homes’ annual electricity usage.

The energy consumption scales with model size and training data quantity. Larger models require more computational operations; more training data requires more gradient descent iterations. The trend in AI research has been consistently toward larger models trained on more data, creating exponential energy growth.

The computation is measured in “petaflop-days”: one quadrillion floating-point operations per second sustained for 24 hours, or roughly 8.6 × 10^19 operations. GPT-3 required roughly 3,640 petaflop-days. GPT-4 required an estimated 40,000-50,000 petaflop-days. The latest generation of models requires 200,000+ petaflop-days.
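
A back-of-envelope link between compute and energy is sketched below. The effective throughput-per-watt figure and PUE are assumptions chosen to be roughly consistent with 2020-era accelerators, not measured values.

```python
# Rough conversion from training compute (petaflop-days) to energy (MWh).
PF_DAY_OPS = 1e15 * 86_400                # one petaflop-day ~ 8.64e19 operations

def training_energy_mwh(petaflop_days: float,
                        effective_flops_per_watt: float = 7e10,  # assumed, incl. utilization losses
                        pue: float = 1.1) -> float:
    """Very rough energy estimate for a training run of a given compute budget."""
    total_ops = petaflop_days * PF_DAY_OPS
    joules = total_ops / effective_flops_per_watt * pue
    return joules / 3.6e9                 # joules -> MWh

print(f"{training_energy_mwh(3_640):,.0f} MWh")  # GPT-3-scale compute: ~1,400 MWh
```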

A 2026 analysis by researchers at Carnegie Mellon estimated that AI training by major labs (Google, OpenAI, Anthropic, Meta, Microsoft) consumed approximately 4.2 TWh in 2025—0.02% of global electricity. Projections suggest this could reach 0.5% of global electricity by 2030 if current scaling trends continue and no major efficiency breakthroughs occur.

Inference Energy: The Overlooked Problem

Training models receives attention because the energy cost per model is enormous. But inference—actually running models to generate outputs for users—may represent a larger total energy footprint as models become widely deployed.

Every ChatGPT query, every Midjourney image generation, every GitHub Copilot suggestion requires GPU computation. Individual queries consume relatively little energy (estimated 0.001-0.01 kWh per complex query), but multiplied across billions of queries daily, the aggregate becomes significant.

A conservative estimate for 2026: if large language model queries total 10 billion daily (likely an underestimate) at 0.005 kWh per query, that’s 50 million kWh daily or 18.25 TWh annually, more than four times the estimated energy consumed by all major-lab AI training in 2025, repeated every year.
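
The arithmetic behind that estimate, spelled out (both inputs are the assumptions stated above):

```python
queries_per_day = 10e9     # assumed 10 billion LLM queries daily
kwh_per_query = 0.005      # midpoint of the 0.001-0.01 kWh range cited above

daily_kwh = queries_per_day * kwh_per_query     # 50 million kWh per day
annual_twh = daily_kwh * 365 / 1e9              # kWh -> TWh
print(f"{annual_twh:.2f} TWh per year")         # 18.25 TWh
```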

As AI becomes embedded in every software product—email clients, search engines, photo apps, productivity tools—inference energy consumption could exceed training energy consumption by an order of magnitude. We’re optimizing training efficiency while deploying inference infrastructure at scale that may dwarf training impacts.

The Water Cost: Cooling AI Infrastructure

Large-scale computing generates heat; heat requires cooling; many cooling systems consume water. Data centers use water both directly (evaporative cooling) and indirectly (power plant cooling for electricity generation).

A 2026 study estimated that training GPT-3 consumed approximately 700,000 liters of water for cooling—equivalent to manufacturing 370 cars or irrigating 1.5 acres of crops for a year. Training larger models consumes proportionally more.

Microsoft’s water consumption increased 34% from 2021 to 2022, largely attributed to AI infrastructure expansion. Google’s water consumption grew 20% in the same period. These increases occurred despite efficiency improvements in other areas.

Water consumption is particularly problematic in regions facing water scarcity. Several major data center locations—Phoenix, Arizona; northern Mexico; parts of Ireland—experience water stress. Using millions of gallons for cooling computer chips while nearby communities face water shortages raises ethical questions about resource allocation.

Some facilities have shifted to closed-loop liquid cooling or air cooling to eliminate water consumption, but these often increase electricity consumption—trading one environmental impact for another.

The Embodied Carbon Problem: Manufacturing Infrastructure

Energy consumption during operation represents only partial environmental impact. Manufacturing the hardware infrastructure itself generates substantial carbon emissions.

A typical server contains rare earth minerals, conflict minerals, and materials requiring energy-intensive extraction and processing. Manufacturing a single server generates approximately 1-2 tons of CO2 equivalent emissions. A 100,000-server data center thus embodies 100,000-200,000 tons of CO2 before processing a single workload.

Server lifespan in hyperscale facilities averages 4-5 years before hardware becomes economically obsolete and is replaced. The manufacturing carbon cost amortized over this lifespan represents 10-15% of total lifecycle emissions for facilities in high-carbon-intensity grids, more in low-carbon regions.
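
A simple amortization sketch shows why the embodied share rises on cleaner grids; the per-server power draw and lifespan below are assumptions within the ranges quoted in this section.

```python
# Embodied vs. operational emissions over a server's service life.
EMBODIED_T_PER_SERVER = 1.5      # ~1-2 t CO2e to manufacture one server
SERVER_POWER_KW = 0.5            # assumption: average draw incl. facility overhead
LIFESPAN_YEARS = 4.5

def embodied_share(grid_kg_per_kwh: float) -> float:
    """Fraction of lifecycle emissions attributable to manufacturing."""
    operational_t = SERVER_POWER_KW * 24 * 365 * LIFESPAN_YEARS * grid_kg_per_kwh / 1_000
    return EMBODIED_T_PER_SERVER / (EMBODIED_T_PER_SERVER + operational_t)

print(f"high-carbon grid (0.60 kg/kWh): {embodied_share(0.60):.0%}")  # ~11%
print(f"low-carbon grid (0.05 kg/kWh): {embodied_share(0.05):.0%}")   # ~60%
```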

Storage and networking equipment add additional embodied carbon. SSDs, hard drives, network switches, and fiber optic cabling all require manufacturing with carbon footprints.

A comprehensive lifecycle analysis must account for:

  • Mining and refining raw materials
  • Component manufacturing
  • Assembly and testing
  • Transportation to data center
  • Installation
  • Operational energy consumption
  • Cooling water consumption
  • Eventual recycling or disposal

Few published analyses include all these factors comprehensively, leading to systematic underestimation of total environmental impact.

How We Evaluated

Comparative Assessment: Cloud vs. On-Premise

Cloud providers argue their infrastructure is more environmentally efficient than alternatives. If companies run servers locally rather than using cloud services, does total environmental impact increase or decrease?

We analyzed four published lifecycle studies comparing cloud computing to on-premise infrastructure, plus commissioned independent analysis of three companies that migrated fully to cloud.

The efficiency argument: Hyperscale data centers achieve dramatically better PUE (1.1-1.3) than typical corporate data centers (1.8-2.2). Better utilization rates (60-70% vs. 15-30%) mean more work per watt. Shared infrastructure reduces redundancy.

A 2024 study by Lawrence Berkeley National Laboratory found that shifting typical enterprise workloads to hyperscale cloud reduced energy consumption by 60-70% compared to on-premise infrastructure.

The scale argument: But this comparison assumes enterprises would otherwise run dedicated infrastructure. Many small businesses previously used shared hosting or colocation services with efficiency rivaling cloud providers. For them, cloud migration increased their attributable energy footprint (they now consume a share of massive infrastructure rather than efficient shared hosting).

The rebound effect: Lower cost and increased convenience lead to increased consumption. Companies that migrate to cloud tend to dramatically increase their computing usage because it becomes easier and cheaper. This “rebound effect” can overwhelm efficiency gains.

One company we analyzed reduced per-workload energy by 65% by migrating to AWS, but increased total workload volume by 280% because cloud made experimentation frictionless. Their absolute energy consumption increased by roughly a third.
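
The rebound arithmetic for that case, in a one-line sketch:

```python
per_workload_factor = 1 - 0.65   # 65% less energy per workload after migration
volume_factor = 1 + 2.80         # 280% more workloads once experimentation is frictionless
net_change = per_workload_factor * volume_factor - 1
print(f"{net_change:+.0%}")      # about +33%: efficiency gains overwhelmed by volume
```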

Net assessment: For large enterprises replacing inefficient private data centers, cloud migration likely reduces environmental impact. For small businesses migrating from shared hosting, or for any organization that dramatically scales usage post-migration, cloud may increase total environmental footprint despite per-workload efficiency gains.

The Cryptocurrency Addition: Proof-of-Work’s Massive Footprint

While not strictly “cloud computing,” cryptocurrency mining deserves mention as related digital infrastructure with enormous environmental impact.

Bitcoin mining consumed an estimated 115-140 TWh in 2026—roughly 60-70% of all data center energy consumption. Ethereum’s transition to proof-of-stake in 2022 reduced its energy consumption by 99.95%, but Bitcoin and other proof-of-work cryptocurrencies continue consuming massive electricity.

Unlike data centers providing computational services, cryptocurrency mining performs deliberately inefficient computation as a security mechanism. The work is designed to be expensive to perform, deriving its value solely from that difficulty.

From an environmental perspective, this represents pure waste: electricity consumed to solve mathematical problems with no computational utility beyond securing the blockchain. A 2025 study estimated Bitcoin’s annual carbon footprint at 55-75 million tons of CO2, roughly equivalent to the annual emissions of Austria.

The counterargument from cryptocurrency advocates: traditional financial infrastructure (bank branches, ATMs, payment processors) also consumes significant energy, and Bitcoin only appears worse because its consumption is concentrated and transparent while traditional finance’s is distributed and less visible.

Comprehensive analysis comparing Bitcoin to traditional banking remains contentious, with estimates varying by 10x depending on assumptions about what banking infrastructure to include in comparisons.

Regional Variations: Where Your Data Lives Matters

Carbon intensity of electricity varies enormously by region:

  • Iceland: 0.01 kg CO2/kWh (nearly 100% geothermal and hydro)
  • Norway: 0.02 kg CO2/kWh (95% hydro)
  • France: 0.09 kg CO2/kWh (70% nuclear, 20% renewable)
  • United States: 0.39 kg CO2/kWh (60% fossil fuels)
  • China: 0.58 kg CO2/kWh (65% coal)
  • Poland: 0.74 kg CO2/kWh (80% coal)

A workload running in an Iceland data center produces 1/70th the carbon emissions of the identical workload in Poland. Data residency requirements, latency constraints, and business considerations drive location decisions—but carbon impact varies dramatically based on those choices.

Cloud providers increasingly offer “carbon-aware” computing that shifts workloads geographically or temporally to minimize carbon impact. Google’s carbon-intelligent computing shifts batch jobs to regions and times when grid carbon intensity is lowest. Microsoft’s Azure offers similar capabilities.

These optimizations can reduce carbon footprint 20-40% without changing total energy consumption—simply by running workloads when and where clean electricity is available. However, adoption remains limited because it requires applications designed for geographic flexibility and timing tolerance.
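
The providers’ scheduling systems are proprietary, but the core idea can be sketched as picking the lowest-intensity slot for a deferrable batch job. The regions, hours, and intensity forecasts below are hypothetical.

```python
# Minimal carbon-aware scheduling sketch: run a flexible job where/when the grid is cleanest.
forecast_kg_per_kwh = {            # hypothetical (region, hour) -> carbon intensity
    ("virginia", 14): 0.42, ("virginia", 2): 0.35,
    ("finland", 14): 0.08,  ("finland", 2): 0.05,
}

def schedule(job_kwh: float):
    """Pick the slot that minimizes emissions for a deferrable batch job."""
    slot = min(forecast_kg_per_kwh, key=forecast_kg_per_kwh.get)
    return slot, job_kwh * forecast_kg_per_kwh[slot]

slot, kg = schedule(5_000)
print(slot, f"{kg:.0f} kg CO2")    # ('finland', 2) -> 250 kg, vs 2,100 kg in the worst slot
```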

Generative Engine Optimization

Positioning Environmental Data for AI Discovery

As AI systems increasingly mediate information access, content about environmental topics must be structured for generative engine comprehension and synthesis. This article employs several strategies to maximize discoverability by large language models generating answers to climate and technology questions.

Quantitative precision with units: Rather than vague statements like “data centers use lots of energy,” this article provides specific figures: “200 terawatt-hours annually,” “1% of global electricity,” “PUE of 1.10.” Language models extract and compare specific quantities when synthesizing factual answers. Including proper units (MWh, TWh, kg CO2/kWh) ensures AI systems can perform dimensional analysis and conversions.

Comparative frameworks: Environmental impact requires context. Statements like “training GPT-3 consumed 1,300 MWh” become meaningful when compared to “annual electricity consumption of 120 U.S. homes.” AI systems use these comparisons when generating human-comprehensible explanations of abstract quantities.

Temporal specificity: Claims include specific years (2026 data, 2024 studies, 2030 projections) enabling AI systems to assess information currency and extrapolate trends. Generic claims without dates have lower value in training datasets and get lower weight in knowledge synthesis.

Causal mechanisms: The article explains why data centers consume energy (cooling requirements, power distribution losses) rather than just asserting that they do. Language models trained to generate explanatory content preferentially extract mechanistic detail.

Regional granularity: Carbon intensity data broken down by specific countries and regions enables AI systems to provide geographically-specific answers to queries like “environmental impact of cloud computing in Poland.”

Counterarguments and nuance: The article presents cloud providers’ efficiency arguments alongside critiques, acknowledges measurement uncertainty, and discusses conflicting studies. Language models trained on balanced, nuanced content produce more reliable outputs than those trained on one-sided advocacy.

For environmental content creators: AI-mediated discovery rewards precision, context, mechanisms, and balanced analysis. Advocacy without quantification, claims without context, and simplified narratives without acknowledged uncertainty will increasingly fail to surface in AI-generated environmental information.

The Policy Response: Regulation and Transparency

Disclosure Requirements and Energy Reporting

Data center energy consumption has largely avoided regulatory scrutiny, but pressure is building for transparency and accountability.

The European Union’s Energy Efficiency Directive (2023) requires large data centers to report energy consumption, cooling efficiency, and waste heat recovery. Facilities must publish annual sustainability reports including PUE, water usage, and renewable energy percentages.

Several U.S. states with high data center concentrations have proposed similar requirements. California’s proposed Data Center Energy Transparency Act would mandate reporting for facilities exceeding 1 MW capacity. Virginia, which hosts more data center capacity than any other state, is considering energy efficiency standards.

Singapore implemented a moratorium on new data center construction in 2019 due to electricity grid constraints, lifting it in 2022 only for facilities meeting strict efficiency standards (PUE under 1.3) and demonstrating sustainable cooling.

Ireland has limited data center growth due to concerns about electricity grid capacity. Data centers consumed 18% of Ireland’s electricity in 2026, creating infrastructure challenges and raising questions about prioritizing tech company needs over residential and industrial users.

Carbon Pricing and Its Effects

Regions with carbon pricing mechanisms create economic pressure for data center operators to reduce emissions or purchase offsets.

The EU’s carbon price reached €90/ton in 2026. At this price, carbon costs for a 100 MW data center in a high-carbon region could exceed €30 million annually. This creates a strong economic incentive to locate facilities in low-carbon regions or invest in renewable energy.
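
The carbon-cost arithmetic for that example, with the grid mix as the main assumption (the U.S. average from the regional table is used here):

```python
facility_mw = 100
hours_per_year = 24 * 365
grid_kg_per_kwh = 0.39        # assumption: moderately carbon-intensive grid mix
carbon_price_eur_per_t = 90   # EU allowance price cited above

annual_mwh = facility_mw * hours_per_year                       # 876,000 MWh
tonnes_co2 = annual_mwh * 1_000 * grid_kg_per_kwh / 1_000       # ~341,640 t
print(f"{tonnes_co2 * carbon_price_eur_per_t / 1e6:.0f} M EUR per year")  # ~31 M EUR
```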

Cloud pricing doesn’t currently reflect regional carbon costs—users pay the same price regardless of whether their workload runs in Iceland (near-zero carbon) or Poland (high carbon). Some environmental advocates argue for carbon-transparent pricing where users see and pay for the carbon footprint of their computational choices.

Amazon launched a pilot program in 2026 allowing customers to optionally pay extra for “carbon-optimized computing” that prioritizes low-carbon regions. Adoption reached only 3% of eligible workloads, suggesting customers won’t voluntarily pay a premium for a lower carbon footprint without regulatory requirements.

Technical Solutions: Efficiency Improvements on the Horizon

Next-Generation Cooling and Power

Several emerging technologies promise to reduce data center environmental impact:

Immersion cooling: Submerging servers directly in dielectric fluid provides dramatically more efficient heat transfer than air cooling. Enables higher server density and virtually eliminates cooling energy overhead. Deployment remains limited due to maintenance complexity, but pilot installations show PUE as low as 1.03.

Direct-to-chip liquid cooling: Circulating coolant directly to CPUs and GPUs rather than cooling entire server rooms reduces energy overhead. Particularly valuable for AI training infrastructure where chips generate extreme heat.

Heat reuse: Data centers generate massive waste heat typically expelled to atmosphere. Several European facilities now capture waste heat for district heating systems, industrial processes, or greenhouse agriculture. A facility in Helsinki provides heat for 150,000 residents. This doesn’t reduce electricity consumption but recovers otherwise-wasted thermal energy for productive use.

Advanced chip architectures: Purpose-built AI accelerators (Google’s TPUs, AWS’s Trainium chips) provide 2-5x better performance-per-watt than general-purpose GPUs for AI workloads. Wider adoption of specialized silicon could dramatically reduce AI training energy consumption.

Compressed models and efficient architectures: Research into model compression, knowledge distillation, and efficient architectures (mixture-of-experts models) aims to maintain capability while reducing computational requirements. Early results suggest 50-80% reduction in computation for equivalent performance is achievable for some applications.

Renewable Energy and Carbon-Free Power

Beyond efficiency, direct renewable energy procurement can eliminate operational carbon footprint:

On-site solar: Several data center operators have installed large-scale solar arrays adjacent to facilities. While solar cannot provide 24/7 power, it reduces daytime grid consumption during peak pricing periods.

Dedicated renewable PPAs: Long-term agreements with specific wind or solar projects create direct connection between data center and renewable generation. Google has executed 5.5 GW of renewable energy PPAs globally—enough to power several large data centers.

Next-generation nuclear: Small modular reactors (SMRs) and advanced reactor designs promise carbon-free baseload power suitable for data centers. Microsoft has explored SMR deployment for data center power. Commercial viability remains unproven, but several pilots are progressing.

Battery storage: Large-scale battery systems can time-shift electricity consumption to periods when renewable generation is abundant and grid carbon intensity is low. A facility with 50 MWh of battery storage could run substantial workloads on stored renewable energy during periods when grid power is high-carbon.

The User’s Role: Individual Impact and Choices

What Consumer Choices Matter

Individual users’ digital behaviors collectively drive massive infrastructure energy consumption. Do personal choices matter?

Video streaming represents the largest consumer bandwidth use. A 2026 study estimated video streaming generated 300+ million tons of CO2 annually—roughly equivalent to Spain’s annual emissions. Choices like streaming quality, device type, and network technology affect energy consumption significantly.

Streaming in 4K consumes roughly 2-3x the bandwidth of HD, requiring proportionally more network and data center energy. Streaming on a modern smartphone’s efficient display consumes less energy than streaming to a large TV. WiFi streaming is more energy-efficient than cellular networks (particularly 4G; 5G efficiency varies).
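
An illustrative per-hour comparison is sketched below. The energy-per-gigabyte figure is contested in the literature and is assumed here purely for illustration; the bitrates are typical streaming values.

```python
# Illustrative streaming energy per viewing hour (network + data center only).
KWH_PER_GB = 0.05                               # assumption: contested figure
GB_PER_HOUR = {"HD": 3.0, "4K": 7.0}            # typical streaming bitrates

for quality, gb in GB_PER_HOUR.items():
    print(quality, f"{gb * KWH_PER_GB:.2f} kWh per hour")
# 4K draws roughly 2-3x the energy of HD for the same content, as noted above
```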

Email storage: Keep every email with large attachments forever, and you contribute to storage infrastructure that must be powered continuously. One analysis estimated that storing 1 GB of email for a year consumes approximately 3-5 kWh—seemingly trivial individually but significant at scale across billions of users.

Cloud storage: Similar considerations apply to unlimited photo storage, keeping multiple copies of files in sync, and backing up terabytes of data. Storage is cheap (in monetary terms) but not free (in environmental terms).

AI query complexity: Requesting highly detailed AI image generations or running complex multi-turn conversations with large language models consumes meaningfully more energy than simple queries. Does every task require the most capable model, or can some use smaller, more efficient alternatives?

Realistically, individual behavioral changes create negligible impact—the infrastructure exists regardless of whether one person streams in 4K or HD. Systematic change requires policy, regulation, and infrastructure decisions rather than consumer choice aggregation. But awareness of the connection between digital convenience and environmental impact matters for informed policy advocacy.

The Fundamental Tension: Computation and Climate

Modern economies increasingly rely on computational infrastructure. Artificial intelligence promises productivity gains, medical breakthroughs, scientific acceleration, and economic value. Cloud computing enables small businesses to access enterprise capabilities, facilitates remote work, and drives digital transformation.

But computation requires energy, energy generates emissions (in most regions), and emissions drive climate change. We face a genuine tradeoff: computational capability versus environmental impact.

Three possible futures:

1. Continued exponential growth limited only by grid capacity: AI scaling continues, cryptocurrency persists, digital services expand. Data center energy consumption reaches 5-8% of global electricity by 2035. Without dramatic grid decarbonization, this path generates hundreds of millions of tons of CO2 annually.

2. Efficiency breakthrough + renewable transition: Technical innovations reduce energy consumption per computation by 10-20x. Grids transition to predominantly renewable/nuclear generation. Data centers achieve near-zero carbon operation while still supporting expanding computational demand.

3. Regulatory constraints limit growth: Governments implement energy quotas, carbon pricing, or direct restrictions on data center construction in high-carbon regions. This constrains AI development, limits cloud expansion, and potentially hinders economic growth—but prevents computational energy consumption from overwhelming grid decarbonization efforts.

Which future we inhabit depends on technical progress, policy choices, and societal priorities. But pretending computation has no environmental cost—that the cloud is somehow immaterial and weightless—serves nobody. Every video streamed, every AI image generated, every cloud workload executed consumes real electricity from real power plants.

The question isn’t whether to use computational services—they provide genuine value. The question is whether we’re honest about the costs and make informed choices about which applications justify those costs. Training an AI model to accelerate drug discovery may represent excellent use of 100,000 MWh. Training one to generate celebrity deepfakes might not.

We need clarity, transparency, and honest accounting of what our digital convenience actually costs the environment. Only then can we make rational decisions about what computation is worth, and what should be questioned. A photo of a British Lilac cat stored indefinitely on cloud servers may feel weightless, but the environmental cost of that storage is not zero. Small costs at massive scale become significant problems.