Cloud Sustainability at Scale: Why Open Source Will Define the Next Era of Green Computing
Data centers already account for 1.5% of global electricity demand, and the International Energy Agency projects that number could nearly double by 2030, driven in large part by AI workloads. The FinOps Foundation’s 2025 report found that 36% of FinOps practices globally, and 53% of those in Europe, now report on carbon. The cloud’s energy footprint is quickly becoming a first-order engineering and business problem.
In this new reality, cloud sustainability must scale not just with infrastructure, but with the communities and open source ecosystems that developers already rely on.
Across the industry, we’re seeing policy pressure (like the EU’s CSRD), FinOps cost-efficiency mandates, and values-driven cultural shifts all converge on a single expectation: software needs to become carbon-aware by default. But achieving this requires rethinking how we observe, model, and optimize energy use across increasingly complex cloud environments.
Why Cloud Carbon Accounting Is So Hard, and So Necessary
Most organizations dramatically underestimate their carbon footprint. That’s because more than 80% of cloud users’ emissions fall under Scope 3: the indirect impacts embedded in cloud supply chains and hardware manufacturing, which are invisible to runtime infrastructure metrics.
To close this visibility gap, cloud users need two complementary measurement approaches. Cloud billing exports provide a top-down view that is helpful but coarse, while infrastructure-level metrics offer bottom-up granularity that is richer but harder to model. The tradeoff? Accounting-level accuracy versus engineering-level granularity. When these two streams come together, teams finally gain a realistic understanding of their energy usage and carbon emissions. Without both perspectives, sustainability decisions risk being incomplete or outright misleading.
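One way to combine the two streams is to treat the billing export as the authoritative total and use infrastructure metrics to attribute it. The sketch below is purely illustrative: the function name, workload names, and all figures are hypothetical, and real exports and metrics pipelines differ by provider.

```python
# Hypothetical sketch: reconciling a top-down billing carbon total with
# bottom-up per-workload energy metrics. All names and numbers are
# illustrative, not drawn from any real provider export.

def allocate_emissions(total_kgco2e: float, workload_joules: dict) -> dict:
    """Distribute an account-level (top-down) emissions total across
    workloads in proportion to their measured (bottom-up) energy use."""
    total_joules = sum(workload_joules.values())
    if total_joules == 0:
        return {name: 0.0 for name in workload_joules}
    return {
        name: total_kgco2e * joules / total_joules
        for name, joules in workload_joules.items()
    }

# A monthly billing export reports 120 kgCO2e for the account (top-down);
# infrastructure metrics attribute energy per workload (bottom-up).
per_workload = allocate_emissions(
    120.0,
    {"api": 3.0e9, "batch": 5.0e9, "ml-training": 4.0e9},
)
```

The accounting-level total stays intact (the allocations sum back to 120 kgCO2e) while each workload gains an engineering-level share it can act on.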
The Emerging Stack: Kepler, SCI, KEIT, and the OSS Engine Behind Green Cloud
Much of the most important work in carbon-aware computing is happening in open source communities.
Kepler, a CNCF project I help maintain, uses eBPF and a machine learning model to estimate workload-level power consumption directly from Kubernetes resources, in both bare metal and public cloud environments. This injects actionable sustainability insights into everyday cluster metrics. Alongside it, the Software Carbon Intensity (SCI) specification is an ISO standard that defines a consistent methodology for quantifying the carbon impact of software. Meanwhile, emerging tools like the Kubernetes Emissions Insights Tool (KEIT) combine Kepler and the SCI to make this data accessible to platform engineers.
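The SCI specification defines its score as SCI = ((E × I) + M) per R, where E is energy consumed, I is grid carbon intensity, M is embodied emissions, and R is a functional unit such as a request. The numbers in the sketch below are illustrative only:

```python
# The SCI formula from the Software Carbon Intensity specification:
#   SCI = ((E * I) + M) per R
# E = energy consumed (kWh), I = grid carbon intensity (gCO2e/kWh),
# M = embodied (Scope 3) emissions (gCO2e), R = functional units served.
# The example values below are illustrative.

def sci(energy_kwh: float, intensity_g_per_kwh: float,
        embodied_g: float, functional_units: float) -> float:
    """Carbon per functional unit, in gCO2e."""
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / functional_units

# e.g. 2 kWh at 400 gCO2e/kWh plus 200 g embodied, over 10,000 requests:
per_request = sci(2.0, 400.0, 200.0, 10_000)  # -> 0.1 gCO2e per request
```

Because M appears explicitly, the formula forces the Scope 3 embodied emissions discussed above into every score rather than letting runtime energy stand in for the whole footprint.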
These projects share a common theme: sustainability requires collaboration. No single team can map the full complexity of power modeling, energy forecasting, grid carbon intensity, and workload behavior. The problems are too large and moving too quickly to be solved behind closed doors. Community-driven standards and open instrumentation will become the backbone of carbon-aware systems.
Efficiency Is the New Cloud Frontier
While long-term change depends on cleaner energy grids and better transparency from cloud providers, engineering teams already have powerful tools for reducing consumption. Scaling workloads more efficiently, through techniques like bin-packing, right-sizing, and intelligent autoscaling, can dramatically lower both energy use and cloud spend.
Kubernetes users adopting next-generation autoscalers such as Karpenter are already demonstrating this impact. Some organizations have cut compute costs by over 40% simply by improving scheduling efficiency. These same optimizations translate directly into lower carbon intensity, because wasted compute is wasted energy.
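The packing effect behind those savings can be shown with a toy first-fit-decreasing allocator. This is a sketch, not how Karpenter schedules; the node capacity and pod requests are hypothetical:

```python
# Illustrative first-fit-decreasing bin-packing: serving the same pod
# requests on fewer nodes means less idle power draw. All capacities and
# requests are hypothetical, and real schedulers weigh far more factors.

def nodes_needed(pod_cpu_requests: list, node_cpu: float) -> int:
    """Pack pods onto nodes with first-fit decreasing; return node count."""
    nodes = []  # remaining CPU capacity per provisioned node
    for req in sorted(pod_cpu_requests, reverse=True):
        for i, free in enumerate(nodes):
            if free >= req:
                nodes[i] = free - req  # place pod on an existing node
                break
        else:
            nodes.append(node_cpu - req)  # provision a new node
    return len(nodes)

pods = [3.0, 2.5, 2.0, 2.0, 1.5, 1.0, 1.0, 0.5]  # CPU requests (cores)
packed = nodes_needed(pods, node_cpu=8.0)  # tight packing: 2 nodes
spread = len(pods)                         # worst case: one pod per node
```

Every node the packer avoids provisioning is idle power, and therefore carbon, that is never spent.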
The AI Paradox: Why Efficiency Alone Won’t Save Us
Efficiency, however, introduces a complication of its own: efficiency gains in AI can actually increase total energy consumption, a modern expression of Jevons’ Paradox. Making AI more efficient makes it cheaper, which accelerates adoption and increases overall compute demand.
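The rebound arithmetic is simple to work through. The figures below are purely illustrative, chosen only to show how a large efficiency gain can still lose to induced demand:

```python
# Toy arithmetic behind Jevons' Paradox: a 2x efficiency gain that
# triggers a 3x rise in demand still increases total consumption.
# All figures are illustrative only.

baseline_energy_per_task = 1.0   # arbitrary energy units per inference
baseline_tasks = 100

efficient_energy_per_task = baseline_energy_per_task / 2  # 2x efficiency
induced_tasks = baseline_tasks * 3                        # demand rebound

before = baseline_energy_per_task * baseline_tasks   # total: 100.0
after = efficient_energy_per_task * induced_tasks    # total: 150.0
```

Whenever demand grows faster than per-task energy falls, the aggregate footprint rises, which is why efficiency alone cannot close the gap.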
The sustainability of AI, therefore, hinges not only on engineering efficiencies, but also on business incentives, governance and policy, market pressures, and cultural norms and expectations.
This is where the movement toward 24/7 Carbon Free Energy (CFE) across major cloud providers matters. Engineering improvements must be matched with cleaner grids if we want AI-driven workloads to become environmentally viable at scale.
What’s Next: A Call to the Cloud Native Community
The cloud sustainability landscape is evolving quickly, but the work ahead is bigger than any single platform or vendor.
It requires a shared understanding of how software consumes energy, community-driven models for measuring that impact, and transparent signals from cloud providers. It requires collaboration across open source, policy, and industry players. And it requires engineers who are willing to treat carbon efficiency with the same seriousness as cost efficiency or reliability.
The next decade of cloud growth, and of AI growth specifically, depends on whether we can make these systems not just powerful, but sustainable. The foundation is already being built in the open. Now it needs the support, contributions, and participation of the entire cloud native community.
Cloud sustainability at scale is possible. But it requires that we build it together.


