Nvidia and ARM are the unexpected stars of the current technological revolution. Until last year, news about them was rare or confined to the specialist press. Today, these chip designers are making headlines in the mainstream media, and with good reason: in 2023, Nvidia's market capitalization crossed the $1 trillion mark, and ARM completed one of the biggest technology IPOs of the past five years. Nor should we forget the size of this market, estimated to reach $228 billion by 2030.
Most of these chips are made by non-European companies. While Nvidia, AMD and Intel lead the way, the tech giants are not to be outdone: AWS, for example, won over Anthropic with its Inferentia and Trainium chips.
Europe's under-representation in this field is not without consequence, as it risks undermining the sovereignty of European companies in the race for language models. It's a well-known fact: the larger the language models a company trains, the greater its need for chips capable of accelerating artificial intelligence calculations.
In an interview with the Financial Times, DeepMind co-founder Mustafa Suleyman called on Washington to consider restricting the sale of Nvidia chips to companies committed to the ethical standards proposed by the White House. Made in the name of the public interest, this appeal nonetheless testifies to the potential consequences of depending on such a strategic component.
Encouraging French and European artificial intelligence companies to rush into the development of large-scale language models, when we don’t have the necessary infrastructure, is tantamount to putting their sovereignty up for auction.
Of course, nothing is set in stone. France and Europe are working towards AI chip sovereignty with the launch of the semiconductor megafactory in the French commune of Crolles. Similarly, private initiatives, like that of Xavier Niel who has invested €200 million to create a European AI ecosystem, are moving in the right direction.
But while we wait for these initiatives to materialize, Europe's AI specialists need to find their own path. In this instance, that path would be one of frugality, characterized by the creation of small language models, with few parameters, designed to solve concrete use cases. In addition to reducing their need for chips and thus mitigating their degree of dependence, this frugality is also logical in view of other challenges currently facing society.
In the medium term, large language models are unsustainable, be it in environmental, energy or technological terms. Let's take the first two aspects. According to a study conducted by four doctoral researchers from the University of California, Riverside and the University of Texas at Arlington, training GPT-3 consumed 3.5 million liters of water. That's the same amount needed to manufacture 370 BMWs or 320 Teslas. According to these researchers, ChatGPT consumes 50 centiliters of water to answer 20 questions.
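To put those water figures in perspective, here is a back-of-the-envelope calculation. The per-question rate is simply derived from the cited 50 centiliters per 20 questions; actual consumption varies by data center and cooling method.

```python
# Rough sanity check of the water figures cited above.
TRAINING_WATER_L = 3_500_000   # liters reported for training GPT-3
WATER_PER_SESSION_L = 0.5      # 50 centiliters per 20-question session
QUESTIONS_PER_SESSION = 20

# Water per single question, in milliliters
water_per_question_ml = WATER_PER_SESSION_L / QUESTIONS_PER_SESSION * 1000

# How many 20-question chat sessions the training water would cover
sessions_equivalent = TRAINING_WATER_L / WATER_PER_SESSION_L

print(f"~{water_per_question_ml:.0f} ml per question")         # ~25 ml
print(f"training = {sessions_equivalent:,.0f} chat sessions")  # = 7,000,000
```

In other words, the water used to train the model once equals roughly seven million everyday chat sessions.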
On the energy front, the situation is hardly any better. For example, to train the BLOOM language model, Hugging Face reported consuming 433 megawatt-hours of electricity. That’s enough to power 40 average American households for a year. These environmental risks should alert all AI players to the consequences of being fixated on mathematical performance. If Clément Delangue’s latest statements are anything to go by, perhaps such enlightenment will come sooner than expected. According to Hugging Face’s CEO, in 2024, companies will realize that “over 99% of use cases can be covered by smaller, less expensive and more specialized models.”
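The household comparison above can be checked the same way. The annual consumption figure used here (about 10,600 kWh per average American household) is an assumption based on commonly cited US averages, not a number from the Hugging Face report.

```python
# Sanity check of the BLOOM energy comparison.
BLOOM_TRAINING_MWH = 433             # reported by Hugging Face for training BLOOM
US_HOUSEHOLD_KWH_PER_YEAR = 10_600   # assumed average annual US household use

# Convert MWh to kWh, then divide by one household's annual consumption
households_powered = BLOOM_TRAINING_MWH * 1000 / US_HOUSEHOLD_KWH_PER_YEAR

print(f"about {households_powered:.0f} households for one year")  # about 41
```

The result lands at roughly 40 households, consistent with the figure cited in the text.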