• UK regulators have confirmed a penalty for Clearview AI, a controversial facial recognition company.
  • A similar crackdown has already begun in the US: the UK ruling comes two weeks after Clearview settled a lawsuit with the ACLU out of court.
  • One problem with facial recognition technology is that it often misidentifies minorities.

Facial recognition scanning a person up close but also several people in a crowd.

The facial recognition software industry is meeting legislative roadblocks in its efforts to scrape your pictures from the internet, experts say. 

The UK’s data protection watchdog has confirmed a penalty for Clearview AI, a controversial facial recognition company. The firm has collected images of people from the web and social media to create a global online database that police can use. 

“The practice of scraping people’s images and identities without their consent and performing facial recognition based on that data is questionably legal, and a serious violation of public privacy,” Avi Golan, the CEO of facial recognition company Oosto, told Lifewire in an email interview. “Even used only by law enforcement agencies, this violates privacy and public confidence in the technology. The leakage of these capabilities into the private sector is a dangerous escalation.” 

Clearview did not immediately respond to a request from Lifewire seeking comment. 

Placing Limits

In Britain, Clearview is getting the cold shoulder. The country's Information Commissioner's Office (ICO) said the company had broken data protection laws. Clearview was ordered to delete the data it holds on UK residents and banned from collecting more.

“Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images,” John Edwards, the UK’s information commissioner, said in a news release. “The company not only enables the identification of those people but effectively monitors their behavior and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by fining the company and issuing an enforcement notice.”

One problem with facial recognition technology is that it often misidentifies minorities, John Bambenek, cybersecurity expert at Netenrich, a security and operations analytics SaaS company, told Lifewire via email. 

"The additional problem is that organizations, such as Facebook, for example, being an open-ecosystem, allow for the possibility that threat actors can poison the data with images to skew facial recognition," he added. "In the social media context, the risks are lesser, but as facial recognition is used for more important functions, the cost of errant recognition gets much higher."

Spreading Distrust of Facial Recognition

A similar crackdown has already begun in the US: the UK ruling comes two weeks after Clearview settled a lawsuit with the ACLU out of court, Mathieu Legendre, a data privacy senior associate for Schellman, a security and privacy compliance assessor, pointed out in an email to Lifewire. He said the settlement strongly limits Clearview’s business activities in Illinois and, in a less restrictive way, in the rest of the country.

A concept image of a facial recognition application focused on a person in an outdoor setting.

"According to this agreement, Clearview AI won't be able to sell its database in Illinois for five years and, with a few exceptions, will only be able to deal with federal agencies and local police departments in the rest of the country," Legendre added. 

The UK ruling is a sign of things to come in the United States, Steven Stransky, a law professor who teaches digital privacy at Case Western Reserve University, told Lifewire in an email interview. He said that within the last few years, several state and local governments have implemented laws regulating the use of facial recognition technology, and he expects this trend to continue. 

Most of these laws focus on how local governments and law enforcement can collect, retain, and use data derived from facial recognition technology. However, legislation is also beginning to regulate how private businesses can employ facial recognition, Stransky said. New York City recently enacted a law prohibiting local businesses that collect biometric identifier information from profiting off the data and requiring them to disclose their use of facial recognition or other technology to gather such biometric data to customers with a “clear and conspicuous” sign.

"We will continue to see a rise in enforcement actions and litigation from government regulators, civil libertarian interest groups, and private citizens against organizations that violate facial recognition technology laws, and the ICO's fine against Clearview AI illustrates the significant costs associated with these types of claims," Stransky said. 

In a possible sign that Clearview recognizes the pushback it faces over providing data to police, the company recently told Reuters that it plans to sell its technology to schools. The new program matches people to ID photos to grant access to physical or digital spaces.
