Clearview AI to restrict sales of facial images to settle lawsuit

Startup also must stop offering free trials to police officers without their employers’ approval.

Ben Wodecki

May 10, 2022

3 Min Read

Facial recognition AI startup Clearview has agreed to settle a dispute brought against it by civil rights groups over its facial data scraping technologies.

New York-based Clearview was accused of violating Illinois’s biometric privacy law, the Biometric Information Privacy Act (BIPA). The legislation, considered one of the toughest privacy laws in the U.S., requires opt-in consent to obtain someone’s faceprint.

The American Civil Liberties Union (ACLU) announced on Twitter that Clearview had agreed to the settlement – which restricts the company from selling its faceprint database.

Under the agreement, the startup must also stop offering free trial accounts to individual police officers without the knowledge or approval of their employers. Clearview had previously offered its services to law enforcement agencies on a free trial basis, though that practice was later discontinued.

Describing the settlement as a “huge win,” the ACLU said the startup “treated people’s biometrics as unrestricted sources of profit and ignored the danger that comes with tracking faceprints.”

“Today’s settlement… is an important step in defending our right to privacy in the digital age.”

Clearview CEO Hoan Ton-That said in a statement to AI Business that the company's "posture regarding sales to private entities remains unchanged. We would only sell to private entities in a manner that complies with BIPA. Our database is only provided to government agencies for the purpose of solving crimes. We have let the courts know about our intention to provide our bias-free facial recognition algorithm to other commercial customers, without the database, in a consent-based manner."

"Today, facial recognition is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payments. This settlement does not preclude Clearview AI from selling its bias-free algorithm, without its database, to commercial entities on a consent basis, which is compliant with BIPA."

Clearview under the cosh

Clearview has been the subject of much scrutiny in the past few years. Lawmakers in Canada, Australia and the U.K. have all raised concerns about its facial scraping technology. The U.K.'s data watchdog went on to hit the company with a provisional $23 million fine for illicit scraping of images. Clearview maintains its innocence in that dispute.

In the ACLU v. Clearview AI case, the civil rights group sued on behalf of organizations representing vulnerable communities uniquely harmed by face recognition surveillance, including survivors of domestic violence and sexual assault and undocumented immigrants. It alleged that Clearview committed repeated violations of BIPA by capturing and using individuals' facial data without their knowledge or permission.

The settlement also requires Clearview to maintain an opt-out request form, letting Illinois residents upload a photo to ensure their faceprints are blocked from appearing in the company's search results, including searches run by police.

Clearview still faces similar legal action in California.

Perhaps in a bid to improve its image, Clearview in March offered the Ukrainian government use of its software to identify casualties and prevent misinformation from fake social media posts. The following month, it said it would move away from working with law enforcement.
