Advancements in, and the adoption of, machine learning and artificial intelligence are among the big tech and business stories of 2019. But just as the world worries about the climate consequences of energy consumption, does the push for ML/AI development carry a carbon cost? That is, could machine learning's energy consumption slow its own development or adoption?
“There is a mad rush toward accuracy of AI models,” said Ganes Kesari, co-founder and head of analytics at Gramener. “All this model engineering and hyper-parameter tuning comes with a huge energy cost.”
A recent paper by University of Massachusetts researchers highlighted one issue with this rush, Kesari said: “It was found that AI models that use neural architecture search emit the carbon dioxide equivalent of nearly five times the lifetime emissions of an average American car.”
As the push for progress in the field continues, there are concerns that the substantial computing power machine learning and artificial intelligence systems require will run into limits because of its high energy consumption.
“Machine learning models, in particular, need to be trained against very large datasets to master certain tasks,” Sheldon Fernandez, CEO of DarwinAI, said. Image recognition systems, for example, require the identification of visual patterns—a very complex task made more manageable, if still difficult, by neural networks, he said.
“Training the network, however, requires thousands of hours of GPU and CPU time, which requires large amounts of power and would be significantly hindered by electricity limitations,” Fernandez said.
Machine learning systems can consume a lot of energy, agreed Matthias Alleckna, an energy expert at EnergyRates.ca. That said, the situation is not the same as with cryptocurrency mining, another computing-based industry considered an energy hog.
“In a sense, such systems can consume plenty of electricity, as they require large data centers and all of the costs associated with maintaining data protected from both physical and digital damages,” Alleckna said of ML-based systems.
But with cryptomining, he said, energy consumption rises sharply without offering much in return to energy distributors and generators. Machine learning, on the other hand, brings potential benefits for the energy industry, improving how energy is used even as it increases consumption.
“It makes it more affordable, practical and predictable when it comes to both demand and prices,” Alleckna said. “Such a situation makes the machine-learning-electricity-consumption case a two-way road.”
Putting the Energy Monster on a Diet
There are ways to make machine learning less of an energy monster, Fernandez said. “Reducing the amount of data the network needs to be trained against is a major first step in reducing underlying electricity required and is an active area of research across the industry,” he said, pointing to DarwinAI’s Generative Synthesis platform as an example that reduces the size of the model itself, and therefore also reduces training time and energy consumption.
“For example,” he said, “we reduced a model by two times for an automotive client, saving them thousands of dollars per month on cloud spend and electricity costs.”
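DarwinAI has not published the internals of its Generative Synthesis platform, so the details of how it shrinks models are not shown here. As a generic, hypothetical illustration of the model-shrinking idea Fernandez describes, the sketch below applies simple magnitude pruning in NumPy: the smallest-magnitude weights are zeroed out, so a sparse representation would store and compute with far fewer parameters. The function name and the 50% pruning fraction are illustrative choices, not anything from the article.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, fraction: float) -> np.ndarray:
    """Zero out the `fraction` of weights with the smallest magnitudes.

    A sparse storage format for the result would hold roughly
    (1 - fraction) of the original parameters.
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * fraction)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

# Toy "layer" of 10,000 random weights, pruned by half
rng = np.random.default_rng(0)
w = rng.normal(size=(100, 100))
pruned = magnitude_prune(w, 0.5)
kept = np.count_nonzero(pruned) / w.size
print(f"fraction of weights kept: {kept:.2f}")
```

In practice, pruning is usually interleaved with retraining so accuracy recovers, and the energy savings come from running the smaller model on sparse-aware hardware or software; this sketch only shows the size-reduction step itself.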
And much of the energy use associated with ML and AI is on the front end, Tom Debus, a managing partner of Integration Alpha, pointed out, because the models are usually trained initially, then fine-tuned and calibrated occasionally.
“The computational effort is, at most, cyclical in nature,” Debus said.
Three trends in particular point toward a greener future for ML and AI, Kesari said: green AI, energy-motivated mergers and AI-driven energy efficiency.
The industry has begun to look for efficiency in AI models, said Kesari, who expects that to continue. “Companies that tout to use AI in an environment-friendly way may soon be in favor,” he said.
Machine learning energy consumption needs will also affect enterprise energy adoption, Kesari said. He pointed to Microsoft’s July announcement of $1 billion in funding for OpenAI as one illustration of this trend in action.
“OpenAI CEO Sam Altman called out the extreme costs associated with scaling deep neural nets and the importance of cloud computing and deep pockets to fund such research,” Kesari said. “It looks unlikely that any nonprofit serious about AI research can sustain on its own.”
And as Alleckna said, AI and ML can be used to reduce energy consumption as well as consume it.
Kesari pointed to DeepMind's machine learning system, which cut the energy used to cool Google's data centers by 40%. Google is also using machine learning to better predict wind power output, putting the technology to use to make energy production more efficient and profitable.
“AI is a master optimizer, so predicting energy costs and suggesting ways to reduce that is a task that AI can be trusted with,” Kesari said. “Perhaps, this makes up for part of the carbon footprint that AI takes up.”