By a coincidence of the calendar, just a few days before the summit for action on artificial intelligence (AI) organized on February 10 and 11 in Paris, the first measures of the European AI Act regulation came into application on Sunday, February 2. While the symbol is strong, this first part currently concerns only certain prohibited uses.
The implementation of the text, the most ambitious in the world at this stage, will be gradual and is still the subject of debate, even disputes. These questions will inevitably be raised during the gathering, where the Elysée expects 100 heads of state, notably American and Chinese, as well as actors from civil society and AI business leaders, including Elon Musk, Sam Altman (OpenAI) and Sundar Pichai (Google).
Concretely, as of this Sunday, certain uses of AI deemed unacceptable by the AI Act are prohibited. Among them: "social scoring" software, private or public, like that used in China, and "individual predictive policing" AI aimed at "profiling" people by estimating their propensity to commit offenses; but also "emotion recognition" at work or at school, to analyze the behavior of an employee or a student.
Likewise banned are "the exploitation of people's vulnerabilities, manipulation or subliminal techniques". And, finally, the identification of people by real-time facial recognition in public spaces, as well as the biometric categorization of people to deduce "their race, their political or religious opinions, their sexual orientation or their union membership", as listed on the European Commission's website. Certain exceptions are, however, provided for the police.
Transparency obligations
However, the main components of the text will not come into force until later: on August 1 for "general-purpose" AI models, such as the large text- or image-generation models that serve as a basis for business uses or for assistants such as ChatGPT, Gemini (Google) or Le Chat (Mistral). These will be bound by transparency obligations regarding their technical documentation and their training data. And the largest of these models will also be subject to safety audits on risks, notably cyber risks.
Source: Le Monde