
AI: striving for a clean and sovereign Europe

Date: January 31, 2024
Category: Blog article
Author: Jean-Baptiste Bouzige

In the chip sector, the dominance of non-European technology giants is jeopardizing European sovereignty in artificial intelligence. Against this backdrop, Europe needs to find and promote alternative solutions to meet both technological and environmental challenges.


Nvidia and ARM are the unexpected stars of the current technological revolution. Until last year, news about them was rare or confined to specialist media. Today, these graphics chip manufacturers are making headlines in the mainstream press, and with good reason: in 2023, Nvidia’s market capitalization crossed the $1 trillion mark, and ARM completed one of the biggest technology IPOs of the last five years. And let’s not forget the size of this market, estimated to reach $228 billion by 2030.

$228 billion: the estimated size of the graphics chip market by 2030

Most chips are manufactured by non-European companies. While Nvidia, AMD and Intel lead the way, the GAFA companies are not to be outdone: AWS, for example, won over Anthropic with its Inferentia and Trainium chips.

Europe’s under-representation in this field is not without consequence, as it risks undermining the sovereignty of European companies in the race for language models. The reason is well known: the larger the language models a company builds, the greater its need for chips capable of accelerating artificial intelligence computations.

The European ecosystem

In an interview with the Financial Times, DeepMind co-founder Mustafa Suleyman called on Washington to consider restricting Nvidia chips to companies committed to the ethical standards proposed by the White House. Made in the name of public interest, this appeal nonetheless testifies to the potential consequences of dependence on such a strategic component.
Encouraging French and European artificial intelligence companies to rush into the development of large-scale language models, when we don’t have the necessary infrastructure, is tantamount to putting their sovereignty up for auction.

Of course, nothing is set in stone. France and Europe are working towards AI chip sovereignty with the launch of the semiconductor megafactory in the French commune of Crolles. Similarly, private initiatives, like that of Xavier Niel who has invested €200 million to create a European AI ecosystem, are moving in the right direction.

But while we wait for these initiatives to materialize, Europe’s AI specialists need to find their own path. In this instance, that path would be one of frugality, characterized by the creation of small language models, with few parameters, designed to solve concrete use cases. In addition to reducing their need for chips and thus mitigating their degree of dependence, this frugality also makes sense in light of other challenges currently facing society.

Environmental risks

In the medium term, large language models are unsustainable, be it in environmental, energy or technological terms. Let’s take the first two aspects. According to a study conducted by researchers at the University of California, Riverside and the University of Texas at Arlington, training GPT-3 consumed 3.5 million liters of water. That’s the same amount needed to manufacture 370 BMWs or 320 Teslas. According to the same researchers, ChatGPT consumes 50 centiliters of water to answer around 20 questions.

On the energy front, the situation is hardly any better. For example, to train the BLOOM language model, Hugging Face reported consuming 433 megawatt-hours of electricity. That’s enough to power 40 average American households for a year. These environmental risks should alert all AI players to the consequences of being fixated on mathematical performance. If Clément Delangue’s latest statements are anything to go by, perhaps such enlightenment will come sooner than expected. According to Hugging Face’s CEO, in 2024, companies will realize that “over 99% of use cases can be covered by smaller, less expensive and more specialized models.”

By moving away from the allure of large models, European companies can plan for the future and give themselves a major competitive edge. What’s more, thinking in terms of use cases offers a response to one of the major concerns of investors: the profitability of generative AI.
