U.K. Regulator Fines Clearview AI $9.5 Million for Collecting Images

Britain’s Information Commissioner’s Office (ICO) fined Clearview AI, a New York-based facial recognition company, 7.5 million pounds ($9.5 million) for misusing images of people in the United Kingdom and elsewhere. The company collected the images from the internet and social media sites to create a global online database used for facial recognition.

The ICO stated that it has also ordered the company to stop obtaining and using the publicly available personal data of U.K. residents and to delete the data of U.K. residents from its systems.

“The company not only enables identification of those people, but effectively monitors their behavior and offers it as a commercial service. That is unacceptable,” said Information Commissioner John Edwards. “That is why we have acted to protect people in the U.K. by both fining the company and issuing an enforcement notice.”

The ICO enforcement action follows a joint investigation with the Office of the Australian Information Commissioner (OAIC) that began in November 2021, focusing on Clearview AI’s “use of people’s images, data scraping from the internet, and the use of biometric data for facial recognition,” the ICO stated.

According to the ICO, Clearview AI “collected more than 20 billion images of people’s faces and data from publicly available information,” and “people were not informed that their images were being collected or used in this way.”

Given the high number of U.K. internet and social media users, Clearview AI’s database likely includes a substantial amount of data from U.K. residents, “which has been gathered without their knowledge,” the ICO stated.

Although Clearview AI no longer offers its services to U.K. organizations, the company has customers in other countries, so the company is still using personal data of U.K. residents.

According to the ICO, Clearview AI breached U.K. data protection laws by:

  • Failing to use people’s information in the U.K. in a “fair and transparent” way;
  • Failing to have a lawful reason for collecting the information;
  • Failing to have a process in place to stop the data being retained indefinitely;
  • Failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR); and
  • Asking for additional personal information, including photos, when asked by members of the public if they’re on the database.

International Investigation
The cross-border nature of the investigation is notable. The ICO conducted the investigation in accordance with the Australian Privacy Act and the U.K. Data Protection Act 2018, and under the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement and the memorandum of understanding (MOU) between the ICO and the OAIC.

“People expect that their personal information will be respected, regardless of where in the world their data is being used,” Edwards stated. “That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.”

“This international cooperation is essential to protect people’s privacy rights in 2022,” Edwards added. “That means working with regulators in other countries, as we did in this case with our Australian colleagues, and it means working with regulators in Europe.”

Facial recognition remains a controversial technology. Last November, Facebook announced it was abandoning the use of a facial recognition system on its site. Other companies, including Microsoft and Amazon, have run into legal and regulatory problems or other criticisms over the use of facial recognition. And the City of San Francisco banned the use of facial recognition by police and other agencies in 2019.


Jaclyn Jaeger is a contributing editor at Compliance Chief 360° and a freelance business writer based in Manchester, New Hampshire.

PHOTO BY EFF PHOTOS, USED UNDER CREATIVE COMMONS 2.0 (CC BY 2.0)
