Every direction of technological progress comes at a price, and for the artificial intelligence industry that price is prohibitive energy consumption. Most electricity today is still generated by burning fuel, which releases emissions into the atmosphere. By one estimate, a single neural-network training run produces 284 tonnes of carbon dioxide, roughly five times what a conventional car emits over its entire lifetime.
Researchers at the University of Massachusetts (USA) studied the training of four fundamentally different modern AI models: the Transformer, ELMo, BERT, and GPT-2. They measured each system's daily energy consumption and multiplied it by the time required to train the network fully under its standard schedule. The resulting energy total was then converted into emissions using the average US grid emission factor, and the values turned out to be very unpleasant.
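The estimation method described above can be sketched in a few lines. This is a minimal illustration, not the researchers' exact formula; the function name and all the numbers in the example are illustrative placeholders, not figures from the study.

```python
# Hedged sketch of the estimation method: energy used for training,
# converted to CO2 via a grid emission factor. All numbers illustrative.

def training_emissions_kg(avg_power_kw, training_hours,
                          grid_kgco2_per_kwh, pue=1.0):
    """Estimate CO2 emissions of one training run.

    avg_power_kw       -- average power draw of the hardware, in kW
    training_hours     -- total wall-clock training time, in hours
    grid_kgco2_per_kwh -- grid emission factor (kg CO2 per kWh)
    pue                -- optional data-center overhead multiplier
    """
    energy_kwh = avg_power_kw * training_hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# Example with made-up inputs: a 10 kW rig running for 2000 hours
# on a grid emitting 0.45 kg CO2 per kWh.
print(training_emissions_kg(10, 2000, 0.45))  # prints 9000.0 (i.e. 9 tonnes)
```

The optional `pue` multiplier (power usage effectiveness) accounts for data-center cooling and other overhead on top of the hardware's own draw.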
The situation is even worse when advanced neural architecture search (NAS) is used, a technique that automates the design of neural networks by systematic trial and error. That trial-and-error approach makes the process extremely laborious: the same Transformer normally takes 84 hours to learn a new language, but with NAS the total rises to as much as 270,000 hours.
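The scale of that overhead is easy to make concrete: since every candidate architecture NAS evaluates is itself a (partial or full) training run, the total is roughly the cost of thousands of ordinary runs. Using the two figures cited above:

```python
# Both figures are the ones cited in the text above.
base_hours = 84            # one standard Transformer training run
nas_total_hours = 270_000  # total reported when NAS is used

# Implied equivalent number of full training runs:
equivalent_runs = nas_total_hours / base_hours
print(round(equivalent_runs))  # prints 3214
```

In other words, the architecture search costs on the order of three thousand times as much energy as a single training run.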
And this is only the tip of the iceberg: the calculations cover specific, well-known neural networks. How much larger the workloads of cloud platforms such as Google and Amazon actually are, and what power sources supply the infrastructure they run on, remains an open question. But what is already known gives serious cause for concern. The development of AI must not become a new threat to the ecology of our planet.