Overview
Preparations for the upcoming Paris Olympic Games have raised significant concerns regarding the security of the events and of spectators. In this context, a series of legislative and regulatory measures have been introduced to authorize, on an experimental basis, the use of digital surveillance tools, described as algorithmic or biometric, designed to analyze images from surveillance cameras or drones.
Listen to our interview on France Info, where we discuss these issues.
Biometric Surveillance at the Olympic Games
As a reminder, biometric surveillance can be defined as any technology that allows an individual to be identified on the basis of their physical, biological, and/or behavioral characteristics. Because these characteristics are unique and permanent to each individual, they are virtually impossible to forge.
Following similar experiments conducted on SNCF (French National Railway Company) camera feeds in 2017, the surveillance system planned for the Olympic Games incorporates a tool for real-time analysis of images captured by surveillance cameras or drones. These images are then processed by artificial intelligence software.
The automated video surveillance (AVS) system will provide powerful tools for monitoring the behavior of people present during the events, with the aim of identifying behaviors deemed suspicious or abnormal, crowd movements, abandoned bags, and adjusting security protocols accordingly.
Automated Video Surveillance of High-Risk Recreational, Sporting, and Cultural Events
It is important to remember that the legal provisions authorizing automated video surveillance (AVS) are not limited to the Olympic Games: they also cover other cultural, recreational, and sporting events deemed high-risk, and could remain in force until early 2025.
A key measure within the regulations governing the upcoming Olympic Games, AVS is accompanied by other systems, including QR codes controlling movement, body scanners, and anti-doping tests based on genetic characteristics.
Risks and Infringements on Public Rights and Freedoms
The use of this variety of automated surveillance devices necessarily raises significant issues, particularly regarding the associated risks, including:
The continued use of these devices, justified on public health grounds (as with the extension of the health pass);
The lack of protection and information for individuals filmed regarding the data collected, storage periods, and guarantees of anonymization, blurring, or pseudonymization;
The tool’s potential for easy integration of facial or voice recognition methods.
Similarly, the preparation and population of the database needed to train the artificial intelligence currently offer few guarantees regarding the content used to predetermine what counts as abnormal or risky behavior, or regarding its retention and/or destruction.
The risk of discriminatory and/or racial bias introduced by artificial intelligence raises even greater concern.
The Defender of Rights, Lawyers, and Associations
Many stakeholders have expressed concerns about the measures implemented to ensure the smooth running of the sporting events. In early 2024, the Defender of Rights (the French ombudsman) denounced the risks stemming from restrictions on movement within the zones covered by the experiment, the use of cameras and drones for data processing, the eviction of individuals deemed "undesirable" or exhibiting "abnormal" behavior, and the placement of homeless people in temporary shelters.
As the guardian of data protection rights, the CNIL (the French Data Protection Authority) has adopted a cautious stance on the GDPR compliance of automated video surveillance measures, leading some to accuse it of turning itself into a "data market regulatory authority."