
Dirty Secrets of Artificial Intelligence


Daily activities such as looking up the best route to take or translating a text already require vast amounts of energy, water and mineral resources. These applications run in the cloud, a euphemism for millions of powerful computers housed in huge data centers. For mobile applications to work, millions of computers are needed to store trillions of pieces of data and perform operations in fractions of a second (calculating a distance while accounting for traffic, for example). It is estimated that data centers account for between 1% and 2% of the world’s energy consumption. But everything indicates that these figures are about to skyrocket.

Generative artificial intelligence (AI), the technology behind smart chatbots such as ChatGPT as well as tools that generate original artwork or music from text, requires enormous computing power. Large technology companies, including Microsoft and Google, have decided to integrate these capabilities into search engines, word processors and email. Our relationship with everyday programs is about to change: until now, we issued a series of commands to carry out certain tasks; soon we will find ourselves talking to the machine, asking it to do the things we used to do ourselves.

What effect will this paradigm shift have on the environment? No one knows, but every estimate points upward. “AI may seem ethereal, but it is physically shaping the world,” says Kate Crawford, author of Atlas of AI. The Australian, a principal investigator at Microsoft Research and director of the AI Now Institute, warned in that book two years ago about the “planetary costs” associated with this technology. Some scientists calculated four years ago that the technology sector would account for 14% of global emissions by 2040; others, that the energy demand of data centers would increase fifteenfold by 2030.

All of those forecasts may fall short: they predate the arrival of ChatGPT. Google and Microsoft have hundreds of millions of users. What happens if they all start using tools powered by generative AI? The Canadian Martin Bouchard, co-founder of the QScale data centers, believes it would require at least four to five times more computing power per search. Asked about their current consumption levels and their growth forecasts for the era of generative AI, Google and Microsoft preferred not to give this newspaper specific figures, beyond reiterating their intention to achieve carbon neutrality by 2030. For Crawford, that commitment means offsetting their emissions by buying emissions credits and through acts of environmental greenwashing, such as planting trees or other similar actions.

One of the corridors of Google’s data center in Douglas, Georgia (USA).

“A normal search engine also consumes a lot of energy, because after all these are complex systems that dive into millions of web pages,” says Carlos Gómez Rodríguez, professor of Computing and Artificial Intelligence at the University of La Coruña. “But generative AI produces even more emissions, because it uses architectures based on neural networks with millions of parameters that need to be trained.”

How much does AI pollute?

The carbon footprint of the computing industry caught up a few years ago with that of aviation at its peak. Training a single natural language processing model emits as much as five gasoline-powered cars expel over their entire lifetime, manufacturing included, or 125 round-trip flights between Beijing and New York. Beyond emissions, the consumption of water to cool the systems (Google used 15.8 billion liters in 2021, according to a study published in Nature, while Microsoft reported 3.6 billion liters), as well as the reliance on rare metals to manufacture electronic components, make AI a technology with a heavy environmental impact.

Training a natural language processing model emits as much carbon as five gasoline-powered cars expel over their entire lifetime.

There is no data on how much energy, or of what kind, is consumed by the big technology companies, the only ones with infrastructure large enough to train and feed the large language models on which generative AI is based. Nor are there specific figures on the amount of water used to cool the systems, an issue that is already causing tension in countries such as the US, Germany and the Netherlands. Companies are not required to provide such information. “We have estimates. For example, training GPT-3, the model ChatGPT is based on, would have generated some 500 tons of carbon, the equivalent of driving a car to the Moon and back. It may not seem like much,” says Gómez, “but the model has to be re-trained from time to time to incorporate updated data.” OpenAI has just introduced a more advanced model, GPT-4. And the race will go on.

Another estimate holds that the electricity used in January 2023 by OpenAI, the company responsible for ChatGPT, could be equivalent to the annual consumption of some 175,000 Danish households, which are not among the biggest consumers. “These are estimates based on ChatGPT’s current usage; if it becomes even more widespread, we could be talking about electricity consumption equivalent to that of millions of people,” says the professor.

Aerial view of the Google data center in Saint-Ghislain, Belgium.

This data opacity will soon begin to come to an end. The European Union is aware of the growing energy consumption of data centers. A directive is under way in Brussels, to be debated starting next year (and therefore at least two years away from taking effect), that will set energy efficiency and transparency requirements for them. The United States is working on similar regulation.

Expensive training of algorithms

“AI’s carbon emissions can be broken down into three factors: the power of the hardware used, the carbon intensity of the energy source powering it, and the energy used over the time it takes to train the model,” explains Alex Hernandez, a postdoctoral researcher at the Quebec Artificial Intelligence Institute (MILA).
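
To make that breakdown concrete, here is a minimal back-of-the-envelope sketch in Python. All the figures in it (cluster power, training time, grid carbon intensities) are hypothetical placeholders, not numbers from this article.

```python
# Illustrative estimate following the breakdown described above:
# emissions = hardware power x training time x carbon intensity of the grid.
# Every number here is a hypothetical placeholder.

def training_emissions_kg(hardware_kw: float, hours: float, kg_co2_per_kwh: float) -> float:
    """Estimated emissions (kg CO2): energy drawn (kWh) times grid carbon intensity."""
    return hardware_kw * hours * kg_co2_per_kwh

# Hypothetical example: a 300 kW GPU cluster training for three weeks.
coal_heavy_grid = training_emissions_kg(300, 21 * 24, 0.7)    # assumed coal-heavy grid
hydro_grid = training_emissions_kg(300, 21 * 24, 0.002)       # assumed hydroelectric grid

print(f"Coal-heavy grid: ~{coal_heavy_grid / 1000:.1f} t CO2")
print(f"Hydro grid:      ~{hydro_grid / 1000:.2f} t CO2")
```

The same energy use yields very different emissions depending on the grid, which is the point Hernandez develops below with the example of Quebec.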

It is in training where most of the emissions are concentrated. Training is a key step in the development of machine learning models, the type of AI that has grown fastest in recent years. The algorithm is shown millions of examples so that it can establish patterns that allow it to make predictions: in the case of a language model, for instance, so that when it sees the words “the earth is” it knows it should continue with “round”.
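
As a toy illustration of that idea, the sketch below “trains” on a handful of example phrases by counting which word tends to follow which, and then predicts the next word. Real language models use neural networks with billions of parameters; this counting-based example only shows the prediction principle.

```python
# Toy next-word predictor: count, for each word, which word follows it
# in the training examples, then predict the most frequent continuation.
from collections import Counter, defaultdict

examples = [
    "the earth is round",
    "the earth is round and blue",
    "the sky is blue",
]

# "Training": tally word-to-next-word transitions.
follows = defaultdict(Counter)
for sentence in examples:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

# "Prediction": continue a prompt with the most common next word.
def predict_next(prompt: str) -> str:
    last_word = prompt.split()[-1]
    candidates = follows.get(last_word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("the earth is"))  # prints "round"
```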

The electricity used by OpenAI, the company responsible for ChatGPT, in January 2023 could be roughly equivalent to the annual consumption of some 175,000 Danish households.

Most data centers use specialized processors called GPUs to train AI models, and GPUs require a lot of power to run. According to a recent Morgan Stanley report, training a large language model requires thousands of GPUs operating around the clock for weeks or even months.

“Large language models have very large architectures. A machine learning algorithm that helps you decide whom to hire might need up to 50 variables: where candidates work, their current salary, their past experience and so on. ChatGPT has more than 175 billion parameters,” notes Ana Valdivia, a postdoctoral researcher in computing and AI at King’s College London. “That entire structure has to be trained, and the data it works on also has to be hosted and processed. That storage consumes energy too,” she adds.
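
For a rough sense of what 175 billion parameters means in practical terms, the small calculation below estimates how much memory is needed just to store a model of that size. The byte sizes per parameter are assumptions for illustration; only the parameter count comes from the article.

```python
# Back-of-the-envelope storage footprint of a 175-billion-parameter model.
params = 175_000_000_000

for bytes_per_param, label in [(4, "32-bit floats"), (2, "16-bit floats")]:
    gigabytes = params * bytes_per_param / 1e9
    print(f"{label}: ~{gigabytes:,.0f} GB just to hold the weights")
```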

Hernandez, of MILA, has just presented a paper in which he analyzes the energy consumption of 95 models. “There is little variability in the hardware used, but if you train your model in Quebec, where most electricity is hydroelectric, you can cut carbon emissions by a factor of 100 or more compared with places that rely mainly on coal, gas or other fossil fuels,” the researcher emphasizes. Chinese data centers are known to draw 73% of their electricity from coal, which resulted in the emission of at least 100 million tons of CO₂ in 2018.

Under the leadership of Yoshua Bengio, whose contributions to deep neural networks earned him the Turing Award (considered the Nobel Prize of computer science), MILA has developed a tool, CodeCarbon, capable of measuring the carbon footprint of those who program and train algorithms. The goal is for professionals to integrate it into their code so they know how much they emit, and for that knowledge to help them make decisions.
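
Below is a minimal sketch of how a developer might wrap a training run with the open-source codecarbon package, using its EmissionsTracker start/stop interface; the training step itself is a placeholder, and the figures reported depend on the machine and the local grid.

```python
# Minimal sketch: estimate the CO2 footprint of a (placeholder) training run
# with codecarbon (pip install codecarbon).
from codecarbon import EmissionsTracker

def train_one_epoch():
    # Placeholder for the real training work (forward/backward passes, etc.).
    pass

tracker = EmissionsTracker(project_name="demo-model")
tracker.start()
try:
    for epoch in range(10):
        train_one_epoch()
finally:
    emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```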

More computing power

The added problem is that the computing power required to train the largest AI models doubles every three to four months. An OpenAI study revealed this back in 2018, warning that “it is worth preparing for when systems require much greater capabilities than they currently have.” That pace is much faster than the one dictated by Moore’s Law, which holds that the number of transistors (and thus the power) of microprocessors doubles every two years.
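
The gap between those two doubling rates can be made concrete with a quick calculation, taking 3.5 months as the midpoint of the three-to-four-month range cited above.

```python
# Compare the two growth rates over a two-year horizon (illustrative arithmetic).
months = 24

ai_compute_growth = 2 ** (months / 3.5)   # compute demand doubling every ~3.5 months
moores_law_growth = 2 ** (months / 24)    # transistor count doubling every 24 months

print(f"AI training compute over {months} months: ~{ai_compute_growth:.0f}x")
print(f"Moore's law over {months} months:         ~{moores_law_growth:.0f}x")
```

Under those assumptions, compute demand grows by a factor of roughly a hundred in the time it takes chip capability to merely double.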

“Given the number of models that are being trained, more computing power will be needed to guarantee their operation. The big technology companies are surely already buying more servers,” predicts Gómez.

For Hernandez, the emissions derived from the use of AI are less worrying, for several reasons. “There is a lot of research aimed at reducing the number of parameters and the amount of energy that models need, and that will improve. However, there are not many ways to reduce emissions from training: that is where weeks of intensive computation are required.” The former is relatively easy to optimize; the latter, not so much.

One possible way to make training less polluting would be to reduce the complexity of the algorithms without losing effectiveness. “Do you really need millions of parameters to get a model that works well? Several biases have been found in ChatGPT, for example. Ways of achieving similar results with simpler architectures are being investigated,” reflects Valdivia.
