The Clearview AI saga continues!
If you haven’t heard of this company before, here’s a clear and concise summary from the French privacy regulator, CNIL (Commission Nationale de l’Informatique et des Libertés), which has helpfully published its findings and rulings in this long-running story in both French and English:
Clearview AI collects images from many sites, including social media. It collects all images that are directly accessible on these networks (that is, that can be viewed without logging into an account). Images are also extracted from online videos available on all platforms.
Thus, the company collected more than 20 billion photos around the world.
On the strength of this collection, the company markets access to its image database in the form of a search engine in which a person can be found using a photograph. The company offers this service to law enforcement authorities in order to identify the perpetrators or victims of crimes.
Facial recognition technology is used to query the search engine and find a person on the basis of their photograph. In order to do so, the company builds a ‘biometric model’, i.e. a digital representation of a person’s physical characteristics (the face in this case). This biometric data is particularly sensitive, especially because it is linked to our physical identity (what we are) and enables us to be identified in a unique way.
The vast majority of people whose images are collected into the search engine are unaware of this practice.
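To make the CNIL’s description concrete, here’s a minimal, purely illustrative sketch of how searching for “a person using a photo” works in systems of this kind: each face image is reduced to a fixed-length numeric vector (a biometric template), and a query photo is matched by finding the stored vector most similar to it. Everything below is made up for illustration — a real system would derive the vectors from a neural network, not hand-typed numbers:

```python
import math

# Illustrative sketch only: a real face-search engine computes each
# "biometric template" from an image with a neural network. Here we
# use made-up fixed-length vectors just to show the lookup step.

def cosine_similarity(a, b):
    """Similarity between two templates: closer to 1.0 means more alike."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest_match(query, database):
    """Return the identity whose stored template best matches the query."""
    return max(database, key=lambda name: cosine_similarity(query, database[name]))

# Toy database: identity -> biometric template (hypothetical numbers).
database = {
    "person_a": [0.9, 0.1, 0.0, 0.2],
    "person_b": [0.1, 0.8, 0.3, 0.0],
    "person_c": [0.0, 0.2, 0.9, 0.4],
}

# A query template from a new photo of person_a, with slightly different
# values, as a different pose or lighting would produce.
query = [0.85, 0.15, 0.05, 0.25]
print(nearest_match(query, database))  # prints "person_a"
```

This is precisely why the CNIL treats such templates as sensitive biometric data: once computed, a template matches its owner across different photos, making unique identification possible at scale.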
Clearview AI has drawn the ire of companies, privacy organizations, and regulators over the past few years, including being hit with:
- Complaints and class action lawsuits filed in Illinois, Vermont, New York and California.
- A legal challenge from the American Civil Liberties Union (ACLU).
- Cease-and-desist orders from Facebook, Google and YouTube, who deemed Clearview’s scraping activities to be in violation of their terms and conditions.
- Enforcement action and fines in Australia and the United Kingdom.
- A ruling declaring its activities illegal in 2021, by the aforementioned French regulator.
No legitimate interest
In December 2021, the CNIL stated, quite bluntly, that:
[T]his company did not obtain the consent of the persons concerned to collect and use their images to provide its software.
Clearview AI has no legitimate interest in collecting and using this data either, especially given the intrusive and colossal nature of the process, which makes it possible to retrieve online images of tens of millions of Internet users in France. Those people, whose photos or videos can be accessed on various websites, including social media, do not reasonably expect their photos to be processed by the company to provide a facial recognition system that states can use for law enforcement purposes.
The seriousness of this breach led the CNIL chair to order Clearview AI to cease, for lack of a legal basis, the collection and use of data from people on French territory, in the context of the facial recognition software it markets.
Furthermore, the CNIL formed the opinion that Clearview AI didn’t seem to care much about complying with European rules on collecting and handling personal data:
The complaints received by CNIL revealed the difficulties the complainants faced in exercising their rights with Clearview AI.
On the one hand, the company does not facilitate the exercise of the data subject’s right of access:
- by limiting the exercise of this right to data collected during the twelve months preceding the request;
- by restricting the exercise of this right to twice a year, without justification;
- by only responding to certain requests after an excessive number of requests from the same person.
On the other hand, the company does not respond effectively to requests for access and erasure: it provides partial responses, or does not respond at all.
CNIL even published an infographic summarizing its decision and decision-making process:
The Information Commissioners in Australia and the United Kingdom came to similar conclusions, with similar rulings against Clearview AI: your data scraping is illegal in our jurisdictions; you must stop doing it here.
However, as we said back in May 2022, when the UK announced that it would fine Clearview AI about £7,500,000 (down from the £17,000,000 fine first proposed) and ordered the company not to collect data on UK residents any more, “it’s not clear how this will be policed, let alone enforced.”
We may be about to find out how the company will be policed in the future, with the CNIL losing patience with Clearview AI for failing to comply with its ruling to stop collecting biometric data on French people…
… and announcing a fine of €20,000,000:
Following a formal notice that went unheeded, the CNIL imposed a penalty of €20 million and ordered CLEARVIEW AI to stop collecting and processing data on individuals located in France without a legal basis, and to delete the data already collected.
As we’ve written before, Clearview AI seems happy not only to ignore regulatory rulings against it, but also to expect people to feel sorry for it at the same time, and indeed to take its side, given that it provides what it thinks is a vital service to society.
In the UK ruling, where the regulator took a similar line to France’s CNIL, the company was told that its behavior was illegal and undesirable and should stop immediately.
But reports at the time suggested that, far from showing any humility, Clearview CEO Hoan Ton-That reacted with an opening sentiment that wouldn’t be out of place in a tragic love song:
It breaks my heart that Clearview AI was unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology to investigate cases of severe child sexual abuse in the UK.
As we suggested back in May 2022, the company may find its many detractors replying with song lyrics of their own:
Cry me a river. (Don’t act like you don’t know it.)
What do you think?
Does Clearview AI really provide a useful and socially acceptable service to law enforcement?
Or is it casually trampling on our privacy and presumption of innocence by collecting biometric data unlawfully, and marketing it for investigative tracking purposes without consent (and, apparently, without limit)?
Tell us in the comments below… you may remain anonymous.