Demand for cloud services makes it impossible for data centers to lower their carbon footprints – the goal now is to avoid using more energy than necessary, says VMware CTO.
One of the perennial concerns about data centers – at least over the last decade or so – is the amount of energy they consume and the resulting share of carbon emissions. Estimates vary, but in 2017 data centers were generally credited with consuming about 3% of the world’s electricity and generating 2% of global greenhouse gas emissions (roughly equivalent to the airline industry). By 2025, according to one study, data centers could account for up to 20% of global energy consumption, with a carbon footprint of over 5% of GHGs.
This isn’t a new issue – ten years ago, data center players were looking at ways to lower their carbon footprint via techniques like using less air-conditioning and/or more renewable energy sources.
But as data centers become the central component of the software-based digital infrastructure needed to run the coming digital economy – and as more compute-intensive applications like AI and cryptocurrencies enter the picture – the focus now is less on lowering carbon footprints and more on keeping them from growing too much larger, and on wringing the most efficiency and value out of that footprint regardless of its size, according to VMware CTO Ray O’Farrell.
“You’re getting this enormous benefit from all of this digital infrastructure. So in my mind, the question is more, are you getting [that benefit] in a way where the efficiency is appropriate to what you’re getting?” he told Disruptive.Asia.
Not surprisingly, VMware credits virtualization technologies with making data centers more efficient – rather than running ten apps on ten servers, you can run them all on two servers without compromising compute availability. The savings in cooling and power consumption in such a scenario would seem obvious, except that in reality the number of servers hasn’t been reduced – we’re just doing more with the servers already plugged in.
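The consolidation math behind that claim can be illustrated with a back-of-envelope sketch. All figures here are assumptions for illustration (typical idle and loaded server draws), not VMware measurements:

```python
# Hypothetical illustration: consolidating ten lightly loaded servers
# onto two more heavily utilized ones via virtualization.
# Power figures are assumptions, not measured or VMware-supplied data.
IDLE_W, LOADED_W = 100.0, 350.0  # assumed per-server draw at idle / full load

# Ten servers, each running one app at ~10% utilization.
before_w = 10 * (IDLE_W + 0.1 * (LOADED_W - IDLE_W))

# Two servers hosting all ten apps at ~50% utilization.
after_w = 2 * (IDLE_W + 0.5 * (LOADED_W - IDLE_W))

savings_pct = 100 * (before_w - after_w) / before_w
print(f"before: {before_w:.0f} W, after: {after_w:.0f} W, saving {savings_pct:.0f}%")
```

The point the sketch makes is that idle servers still burn most of their power budget, so packing work onto fewer, busier machines cuts total draw even though the same apps are running.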
However, O’Farrell says, the efficiencies gained from virtualization have also prevented data center carbon footprints from growing larger than they would have otherwise.
“Our usage of digital infrastructure [has] gone up enormously, so the total amount of carbon associated with that has gone up. But the way it has gone up compared to what it could have gone up without this type of [virtualized] infrastructure is very, very different,” he says.
VMware tracks the energy usage of its own footprint (i.e. data centers using its virtualization software), and estimates that in the last 15 years it’s managed to avoid generating between 340 million and 350 million metric tons of CO2. O’Farrell reckons virtualization efforts outside of VMware’s particular footprint would probably double that number.
O’Farrell adds that VMware is actively helping its customers think in terms of carbon avoidance with a “carbon avoidance meter” tool launched in May, which uses data from VMware infrastructure to help customers make smarter decisions about energy usage.
“We can see your compute footprint, so the customer gives us information about where you’re getting your power, and we can make the models of what that carbon footprint is, and also what they’re potentially avoiding by using Cloud A rather than Cloud B,” he says.
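The kind of model O’Farrell describes – footprint plus power source in, carbon estimate out – can be sketched in a few lines. This is a minimal illustration, not VMware’s actual tool; the function name, power draw, and grid emission factors are all assumptions:

```python
# Minimal sketch of a carbon-avoidance comparison between two clouds.
# All names and figures are illustrative assumptions.

def annual_co2_tonnes(avg_power_kw: float, grid_factor_kg_per_kwh: float) -> float:
    """Estimate annual CO2 in metric tons from average IT power draw
    and the hosting grid's emission factor (kg CO2 per kWh)."""
    hours_per_year = 24 * 365
    return avg_power_kw * hours_per_year * grid_factor_kg_per_kwh / 1000

# Hypothetical workload averaging 50 kW of compute power.
cloud_a = annual_co2_tonnes(50, 0.7)   # assumed coal-heavy grid
cloud_b = annual_co2_tonnes(50, 0.2)   # assumed renewables-heavy grid

avoided = cloud_a - cloud_b
print(f"Cloud A: {cloud_a:.0f} t/yr, Cloud B: {cloud_b:.0f} t/yr, "
      f"avoided by choosing B: {avoided:.0f} t/yr")
```

The same workload produces very different footprints depending on where its power comes from, which is why the model needs the customer to supply the power-source information O’Farrell mentions.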
Moreover, he adds, the carbon avoidance meter tool can leverage big data analytics and AI to calculate carbon avoidance well beyond simply tracking your own energy consumption.
“I can look at a company and see that it’s running the same kinds of apps with the same software your company uses, and they’re doing it on the same scale, but they have a lower carbon footprint,” he explains. “So we can advise you on what other companies are doing to keep their footprint low, and what you could be doing.”
The irony, of course, is that AI is notoriously compute-intensive. A study published in June found that training a single AI model can generate the equivalent of 284 metric tons of CO2 – five times the lifetime emissions of the average car.
O’Farrell acknowledges this, but returns to the point that the carbon footprint has to be weighed against the benefits that AI delivers. Also, he insists, it’s important to factor in the second-order effects of AI-powered apps that aim to introduce efficiencies elsewhere, from smart grids to smart traffic management.
“It’s easy to look at a big giant data center running AI and say, it’s burning this amount of carbon. But let’s say the AI in that data center is for management of autonomous vehicles in a densely populated city such as Hong Kong or Singapore, and the AI is able to improve the traffic situation in that city by 10% or 15% – that more than offsets the AI footprint,” he says.
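The offset argument is easy to check with rough numbers. The figures below are invented purely to show the shape of the calculation – neither O’Farrell nor the article supplies them:

```python
# Back-of-envelope sketch of the second-order-effects argument.
# All figures are assumptions for illustration only.
ai_dc_co2 = 20_000.0            # assumed annual CO2 of the AI data center, tonnes
city_traffic_co2 = 2_000_000.0  # assumed annual road-traffic CO2 of a dense city, tonnes
improvement = 0.10              # assumed 10% traffic-emission reduction from AI management

saved = city_traffic_co2 * improvement
net = saved - ai_dc_co2
print(f"traffic CO2 saved: {saved:.0f} t/yr vs data-center cost: "
      f"{ai_dc_co2:.0f} t/yr -> net avoided: {net:.0f} t/yr")
```

Under these assumed numbers the downstream savings dwarf the data center’s own footprint – which is exactly the trade-off O’Farrell is asking readers to weigh.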
O’Farrell adds that eventually AI will enable the infrastructure itself to make decisions based on carbon avoidance, treating it as just another performance parameter alongside speed, latency and security.
“If digital infrastructure is applied smartly to smart cities and things like that, yes, it will require building a data center, and yes, it will use some energy,” he says. “But the downstream effect of that could be more efficient traffic, more efficient cities, and more efficient energy usage.”