Clearview AI, Inc. is an American facial recognition company that provides software primarily to law enforcement and other government agencies.[2] The company's algorithm matches faces against a database of more than 20 billion images collected from the Internet, including from social media applications.[1] Founded by Hoan Ton-That and Richard Schwartz, the company maintained a low profile until late 2019, when its use by law enforcement was first reported.[3]
Use of the facial recognition tool has been controversial. Several U.S. senators have expressed concern about privacy rights, and the American Civil Liberties Union (ACLU) has sued the company on several occasions for violating privacy laws. U.S. police have used the software to apprehend suspected criminals.[4][5][6] Clearview's practices have led to fines and bans by EU nations for violating privacy laws, and to investigations in the U.S. and other countries.[7][8][9] In 2022, Clearview reached a settlement with the ACLU in which it agreed to restrict U.S. market sales of facial recognition services to government entities.
Clearview AI was the victim of a data breach in 2020 that exposed its customer list, revealing that 2,200 organizations in 27 countries had accounts with facial recognition search activity.[10]
History
Clearview AI was founded in 2017 by Hoan Ton-That and Richard Schwartz, who transferred into it the assets of SmartCheckr, a company the pair had founded earlier in 2017 alongside Charles C. Johnson.[11][3] The company was founded in Manhattan after the founders met at the Manhattan Institute.[1] It initially raised $8.4 million from investors including Kirenaga Partners and Peter Thiel.[12] A second round in 2020 raised $8.625 million in exchange for equity; the company did not disclose the investors. In 2021, another round raised $30 million.[13] Early access to Clearview's app was given to potential investors during its Series A fundraising round. Billionaire John Catsimatidis used it to identify someone his daughter was dating and piloted it at one of his Gristedes grocery stores in New York City to identify shoplifters.[14][15]
In October 2020, a company spokesperson claimed that Clearview AI's valuation was more than $100 million.[16] The company announced its first chief strategy officer, chief revenue officer, and chief marketing officer in May 2021. Devesh Ashra, a former deputy assistant secretary with the United States Department of the Treasury, became its chief strategy officer. Chris Metaxas, a former executive at LexisNexis Risk Solutions, became its chief revenue officer. Susan Crandall, a former marketing executive at LexisNexis Risk Solutions and Motorola Solutions, became its chief marketing officer.[17] Devesh Ashra and Chris Metaxas left the company in 2021.[13] In August 2021, Clearview AI announced the formation of an advisory board including Raymond Kelly, Richard A. Clarke, Rudy Washington, Floyd Abrams, Lee S. Wolosky, and Owen West.[18] The company claimed to have scraped more than 10 billion images as of October 2021.[19] In May 2022, Clearview AI announced that it would be expanding sales of its facial recognition software to schools and lending platforms outside the U.S.[20]
Clearview AI hired a prominent legal team to defend the company against several lawsuits that threatened its business model. Its lawyers include Tor Ekeland, Lee S. Wolosky, Paul Clement, Floyd Abrams, and Jack Mulcaire.[21][1][22] Abrams said the conflict between privacy rights and First Amendment free-speech protections could reach the Supreme Court.[21]
Usage
Clearview AI provides facial recognition software with which users upload an image of a face and match it against the company's database.[23] The software then supplies links to where the "match" can be found online.[24] The company operated in near secrecy until the release of an investigative report in The New York Times titled "The Secretive Company That Might End Privacy as We Know It" in January 2020. It had maintained this secrecy by publishing false information about the company's location and employees and by erasing the founders' social media presence.[3][1][25] Citing the article, over 40 tech and civil rights organizations sent a letter to the Privacy and Civil Liberties Oversight Board (PCLOB) and four congressional committees, outlining their concerns with facial recognition and Clearview and asking the PCLOB to suspend use of facial recognition.[26][27][28][1]
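Clearview has not published its implementation, but a reverse face search of this kind is commonly built as a nearest-neighbor search over face embeddings stored alongside the URL of the page each photo was scraped from. The sketch below illustrates only that general idea; the embedding vectors, similarity threshold, and URLs are made-up placeholders, not Clearview's actual data or code.

```python
import numpy as np

def search(query: np.ndarray, index: np.ndarray, urls: list[str],
           top_k: int = 5, threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return the source URLs of the stored embeddings most similar to the query face."""
    q = query / np.linalg.norm(query)
    idx = index / np.linalg.norm(index, axis=1, keepdims=True)
    scores = idx @ q                                    # cosine similarity against every stored face
    order = np.argsort(scores)[::-1][:top_k]
    return [(urls[i], float(scores[i])) for i in order if scores[i] >= threshold]

# Hypothetical index: one embedding per scraped photo, stored with its page URL.
rng = np.random.default_rng(0)
index = rng.normal(size=(1000, 128))
urls = [f"https://example.com/photo/{i}" for i in range(1000)]
query = index[42] + rng.normal(scale=0.05, size=128)    # a noisy photo of person #42

print(search(query, index, urls))                       # -> [("https://example.com/photo/42", ~0.99)]
```

In a real system the embeddings would come from a trained face-recognition model and the index would be an approximate nearest-neighbor structure rather than a brute-force scan.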
Clearview accelerated a global debate over the regulation of facial recognition technology by governments and law enforcement.[29][30] Law enforcement officers have stated that Clearview's facial recognition is far superior to previously used technology at identifying perpetrators from any angle.[31] After discovering that Clearview AI was scraping images from its site, Twitter sent a cease-and-desist letter to Clearview, insisting that it remove all images because scraping violates Twitter's policies.[32] On February 5 and 6, 2020, Google, YouTube, Facebook, and Venmo sent cease-and-desist letters on the same grounds.[33][34] Ton-That responded in an interview that there is a First Amendment right to access public data. He later stated that Clearview had scraped over 50 billion images from across the web.[29][35][36]
The New Zealand Police trialled the software after being approached by Clearview's Marko Jukic in January 2020. Jukic said it would have helped identify the Christchurch mosque shooter had the technology been available. The use of Clearview's software in this case drew strong objections once exposed, as neither the users' supervisors nor the Privacy Commissioner had been aware of or approved its use. After it was revealed by RNZ, Justice Minister Andrew Little stated, "It clearly wasn't endorsed, from the senior police hierarchy, and it clearly didn't get the endorsement from the [Police] Minister... that is a matter of concern."[37][38]
Clearview's technology was used to identify an individual at a May 30, 2020, protest in Miami, Florida, against police violence following the killing of George Floyd. Miami's WTVJ confirmed the software's role; the arrest report said only that she had been "identified through investigative means", and the defendant's attorney did not know Clearview had been involved. Ton-That confirmed its use, noting that it was not being used for surveillance but only to investigate a crime.[39]
In December 2020, the ACLU of Washington sent a letter to Seattle mayor Jenny Durkan, asking her to ban the Seattle Police Department from using Clearview AI.[40] The letter cited public records retrieved by a local blogger, which showed one officer signing up for and repeatedly logging into the service, as well as corresponding with a company representative. While the ACLU letter raised concerns that the officer's usage violated the Seattle Surveillance Ordinance, an auditor at the City of Seattle Office of the Inspector General argued that the ordinance was designed to address the usage of surveillance technologies by the Department itself, not by an officer without the Department's knowledge.[41]
After the January 6 riot at the United States Capitol, the Oxford Police Department in Alabama used Clearview's software on images the Federal Bureau of Investigation had posted in its public request for information about suspects, generating leads on people present during the riot. Photo matches and information were sent to the FBI, which declined to comment on its techniques.[5]
In March 2022, Ukraine's Ministry of Defence began using Clearview AI's facial recognition technology "to uncover Russian assailants, combat misinformation and identify the dead". Ton-That also claimed that Ukraine's MoD had "more than 2 billion images from the Russian social media service VKontakte at its disposal".[42] Ukrainian government agencies had used Clearview over 5,000 times as of April 2022.[43][44] The company provided these accounts and searches for free.[45]
In a Florida case, Clearview's technology was used by defense attorneys to successfully locate a witness, resulting in the dismissal of vehicular homicide charges against the defendant.[46]
Law enforcement use of the facial recognition software grew rapidly in the United States: more than one million searches were conducted in 2022, and usage doubled in 2023.[36]
Marketing efforts and pushback
Clearview AI encouraged adoption by offering free trials to individual law enforcement officers rather than to departments as a whole. The company also used its significant connections to the Republican Party to make contact with police departments.[1][47] In onboarding emails, new users were encouraged to go beyond running one or two searches and to "[s]ee if you can reach 100 searches".[48] During 2020, Clearview sold its facial recognition software for one-tenth the cost of competitors'.[3]
Clearview's marketing claimed its facial recognition had led to a terrorist arrest; the identification had been submitted to a New York Police Department tip line.[49] Clearview also claimed to have solved two other New York cases and 40 cold cases, later stating that it had submitted those to tip lines as well. The NYPD stated that it had no institutional relationship with Clearview, but its policies did not ban use by individual officers. In 2020, thirty NYPD officers were confirmed to have Clearview accounts.[3] In April 2021, documents obtained by the Legal Aid Society under New York's Freedom of Information Law demonstrated that Clearview had collaborated with the NYPD for years, contrary to past NYPD denials.[50] Clearview had met with senior NYPD leadership and entered into a vendor contract with the department.[48] Clearview came under renewed scrutiny for enabling officers to conduct large numbers of searches without formal oversight or approval.[50][48]
The company was sent a cease-and-desist letter from the office of New Jersey Attorney General Gurbir Grewal after including a promotional video on its website that used images of Grewal.[51] Clearview had claimed that its app played a role in a New Jersey police sting. Grewal confirmed that the software had been used to identify a child predator, but he also banned the use of Clearview in New Jersey. Tor Ekeland, a lawyer for Clearview, confirmed that the marketing video was taken down the same day.[4][52]
In March 2020, Clearview pitched its technology to states for use in contact tracing during the COVID-19 pandemic.[53][54] A reporter found that Clearview's search could identify him even while he covered his nose and mouth as a COVID mask would.[45] The idea drew criticism from U.S. senators and other commentators, who saw the crisis being used to push unreliable tools that violate personal privacy.[55][56]
Contrary to Clearview's initial claims that its service was sold only to law enforcement, a data breach in early 2020 revealed that numerous commercial organizations were on Clearview's customer list. For example, Clearview marketed to private security firms and to casinos.[57] Additionally, Clearview planned expansion to many countries, including authoritarian regimes.[58]
Senator Edward J. Markey wrote to Clearview and Ton-That, stating "Widespread use of your technology could facilitate dangerous behavior and could effectively destroy individuals' ability to go about their daily lives anonymously." Markey asked Clearview to detail aspects of its business in order to understand these privacy, bias, and security concerns.[32][59] Clearview responded through an attorney, declining to reveal the information.[60] Markey then wrote a second letter, saying the company's response was unacceptable and contained dubious claims, and that he was concerned about Clearview "selling its technology to authoritarian regimes" and possible violations of COPPA.[8][61] Markey wrote a third letter to the company, stating that "this health crisis cannot justify using unreliable surveillance tools that could undermine our privacy rights" and asking a series of questions about which government entities Clearview had been talking with, in addition to unanswered privacy concerns.[55]
Senator Ron Wyden voiced concerns about Clearview and had meetings with Ton-That cancelled on three occasions.[62][8]
In April 2021, Time magazine listed Clearview AI as one of the 100 most influential companies of the year.[63]
Technology
Accuracy
In October 2021, Clearview submitted its algorithm to one of the two facial recognition accuracy tests conducted by the National Institute of Standards and Technology (NIST) every few months. Clearview ranked among the top 10 of 300 facial recognition algorithms in a test measuring accuracy at matching two different photos of the same person. Clearview did not submit to the NIST test for matching an unknown face against a 10 billion image database, which more closely matches the algorithm's intended purpose. This was the first third-party test of the software.[19]
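The two NIST tests correspond to different problems: 1:1 verification (deciding whether two photos show the same person, the test Clearview entered) and 1:N identification (searching one probe photo against a large gallery, the setting closer to how the product is used). A minimal sketch of the distinction, assuming pre-computed face embeddings and an illustrative similarity threshold:

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(photo_a: np.ndarray, photo_b: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 verification: are these two photos of the same person? (the test Clearview entered)"""
    return similarity(photo_a, photo_b) >= threshold

def identify(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6):
    """1:N identification: search one probe against a whole gallery (the test it did not enter)."""
    scores = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None
```

As the gallery grows toward billions of faces, the chance that some unrelated face scores above the threshold rises, which is why 1:N results do not follow automatically from strong 1:1 results.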
At various times during 2020, Clearview claimed 98.6%, 99.6%, or 100% accuracy. However, these results came from tests conducted by people affiliated with the company and did not use representative samples of the population.[29][64][65]
In 2021, Clearview announced that it was developing "deblur" and "mask removal" tools to sharpen blurred images and reconstruct the covered part of an individual's face. These tools would be implemented using machine learning models that fill in the missing details based on statistical patterns found in other images. Clearview acknowledged that deblurring an image or removing a mask could make errors more frequent and said the tools would be used only to generate leads for police investigations.[35]
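Clearview did not describe how these tools work internally. Learned inpainting of this kind is typically trained by hiding part of an image and teaching a model to reconstruct it from the visible remainder, which is why the output reflects statistical patterns from the training data rather than the true hidden pixels. The toy PyTorch sketch below shows that training objective on random stand-in images; it is a generic illustration of the technique, not Clearview's model.

```python
import torch
import torch.nn as nn

class Inpainter(nn.Module):
    """Tiny convolutional model: given an image with its lower half hidden, predict the full image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

model = Inpainter()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random tensors stand in for 64x64 RGB face photos.
faces = torch.rand(8, 3, 64, 64)
masked = faces.clone()
masked[:, :, 32:, :] = 0.0   # hide the lower half, as a face mask would

optimizer.zero_grad()
pred = model(masked)
loss = loss_fn(pred, faces)  # learn to fill in the hidden half from the visible half
loss.backward()
optimizer.step()
print(float(loss))
```

Because the hidden region is guessed from patterns in other faces, such reconstructions are plausible rather than faithful, consistent with Clearview's acknowledgement that these tools could increase error rates.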
Assistant Chief of Police of Miami, Armando Aguilar, said in 2023 that Clearview's AI tool had contributed to the resolution of several murder cases, and that his team had used the technology around 450 times a year. Aguilar emphasized that they do not make arrests based on Clearview's matches alone, and instead use the data as a lead and then proceed via conventional methods of case investigation.[24]
Several cases of mistaken identity using Clearview facial recognition have been documented, but "the lack of data and transparency around police use means the true figure is likely far higher." Ton-That claims the technology has approximately 100% accuracy and attributes mistakes to poor policing practices. His claimed accuracy level is based on mugshots and would be affected by the quality of the image uploaded.[24]
Security lapses
While Clearview's app is supposed to be privately accessible only to customers, its Android application package and iOS application were found in unsecured Amazon S3 buckets.[68] Accompanying instructions showed how to load an enterprise (developer) certificate so the app could be installed without being published on the App Store. Apple suspended Clearview's access because this violated its terms of service for developers, and as a result the app was disabled.[69] In addition to application tracking (Google Analytics, Crashlytics), examination of the source code of the Android version found references to Google Play Services, requests for precise phone location data, voice search, sharing a free demo account with other users, augmented reality integration with Vuzix, and sending gallery photos or taking photos from within the app itself. There were also references to scanning barcodes on a driver's license and to RealWear.[70]
In April 2020, Mossab Hussein of the security firm SpiderSilk discovered that Clearview's source code repositories were exposed due to misconfigured user security settings. The exposure included secret keys and credentials, including cloud storage and Slack tokens. The compiled apps and pre-release apps were accessible, allowing Hussein to run the macOS and iOS apps against Clearview's services. Hussein reported the breach to Clearview but refused to sign the non-disclosure agreement required for Clearview's bug bounty program. Ton-That reacted by calling Hussein's disclosure of the bug an act of extortion. Hussein also found 70,000 videos in one storage bucket, taken at the entrance of a Rudin Management apartment building.[71]
Insight Camera
Clearview also operates a secondary business, Insight Camera, which provides AI-enabled security cameras targeted at "retail, banking and residential buildings". Two known customers of the technology are the United Federation of Teachers and Rudin Management.[72][73] The website for Insight Camera was taken down following BuzzFeed's investigation into the connection between Clearview AI and Insight Camera.[74]
Customer list
Following a data leak of Clearview's customer list, BuzzFeed confirmed that 2,200 organizations in 27 countries had accounts with activity. BuzzFeed has the exclusive right to publish this list and has chosen not to publish it in its entirety.[10] Clearview AI claims that at least 600 of these users are police departments, primarily in the U.S. and Canada, though Clearview has expanded to other countries as well.[3] Although the company claims its services are for law enforcement, it has had contracts with Bank of America, Kohl's, and Macy's. Several universities and high schools have run trials with Clearview.[10] The list below highlights particularly notable users.
Legal challenges
Clearview AI has had its business model challenged by several lawsuits in multiple jurisdictions. It has responded by defending itself, settling in some cases, and exiting several markets.
United States
After the release of The New York Times article in January 2020, lawsuits were filed in Illinois, California, Virginia, and New York, citing violations of privacy and safety laws.[91] Most of the lawsuits were transferred to the Southern District of New York.[92] Two lawsuits were filed in state courts: one in Vermont by the attorney general and one in Illinois on behalf of the American Civil Liberties Union (ACLU), which cited a statute that forbids the corporate use of residents' faceprints without explicit consent. Clearview countered that Illinois law does not apply to a company based in New York.[21]
In response to a class action lawsuit filed in Illinois for violating the Biometric Information Privacy Act (BIPA), Clearview stated in May 2020 that it had instituted a policy of no longer working with non-government entities and of removing any photos geolocated in Illinois.[93][94][75] On May 28, 2020, the ACLU and the law firm Edelson filed a new suit against Clearview in Illinois under BIPA.[95][96] Clearview agreed to a settlement in June 2024, offering 23% of the company (a stake valued at $52 million at the time) rather than a cash payment, which would likely have bankrupted the company.[97]
In May 2022, Clearview agreed to settle the 2020 lawsuit from the ACLU. The settlement prohibited the sale of its facial recognition database to private individuals and businesses.[98]
In the Vermont case, Clearview AI invoked Section 230 immunity. The court denied Section 230 immunity because Vermont's claims were "based on the means by which Clearview acquired the photographs" rather than on third-party content.[99]
Canada
In July 2020, Clearview AI announced that it was exiting the Canadian market amidst joint investigations into the company and the use of its product by police forces.[100] Daniel Therrien, the Privacy Commissioner of Canada, condemned Clearview AI's use of scraped biometric data: "What Clearview does is mass surveillance and it is illegal. It is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup."[101] In June 2021, Therrien found that the Royal Canadian Mounted Police had broken Canadian privacy law through hundreds of illegal searches using Clearview AI.[102]
European Union and UK
In January 2021, Clearview AI's biometric photo database was deemed illegal in the European Union (EU) by the Hamburg Data Protection Authority (DPA), which ordered the deletion of the complainant's biometric data. The authority stated that the General Data Protection Regulation (GDPR) applies even though Clearview AI has no European branch.[103] In March 2020, the authority had requested Clearview AI's customer list, as data protection obligations would also apply to the customers.[104] The data protection advocacy organization NOYB criticized the DPA's decision because the DPA issued an order protecting only the individual complainant rather than an order banning the collection of any European resident's photos.[105]
In May 2021, legal complaints were filed against the company in Austria, France, Greece, Italy, and the United Kingdom, alleging that its method of documenting and collecting Internet data violated European privacy laws.[106] In November 2021, Clearview received a provisional notice from the UK's Information Commissioner's Office (ICO) to stop processing UK residents' data, citing a range of alleged breaches, and was notified of a potential fine of approximately $22.6 million. Clearview claimed that the ICO's allegations were factually inaccurate because the company "does not do business in the UK, and does not have any UK customers at this time". The BBC reported on 23 May 2022 that the company had been fined "more than £7.5m by the UK's privacy watchdog and told to delete the data of UK residents".[107] Clearview was also ordered to delete all facial recognition data of UK residents. The fine was the fourth of its type imposed on Clearview, after similar orders and fines in Australia, France, and Italy.[9] However, in October 2023 the fine was overturned on appeal, on the grounds that the ICO lacked jurisdiction over acts of foreign governments.[108]
In September 2024, Clearview AI was fined €30.5 million by the Dutch Data Protection Authority (DPA) for constructing what the agency described as an illegal database.[109] The DPA's ruling highlighted that Clearview AI unlawfully collected facial images, including those of Dutch citizens, without obtaining their consent. This practice constitutes a significant violation of the EU's GDPR due to the intrusive nature of facial recognition technology and the lack of transparency regarding the use of individuals' biometric data.[110]
References
Ryssdal, Kai (September 19, 2023). "The facial recognition software cops are raving about". Marketplace. APM (Marketplace.org). Event occurs at 07:00. Retrieved September 19, 2023. I see you have a lot of photos on the internet you should be in the app but you're not here... A couple of minutes later he said he got a call from someone who worked for Clearview AI and they wanted to know why he'd been running my photo.
Thomas, Owen (January 22, 2020). "The person behind a privacy nightmare has a familiar face". San Francisco Chronicle. Retrieved January 23, 2020. I wrote about Ton-That in February 2009 ('scathingly,' Hill writes), when he was living in San Francisco, developing first Facebook and then iPhone apps. He made the news for creating ViddyHo, a website that tricked users into sharing access to their Gmail accounts — a hacking technique known as 'phishing' — and then spammed their contacts on the Google Talk chat app. (The episode does not appear on Ton-That's sanitized personal website.)
Hill, Kashmir (January 23, 2020). "Twitter Tells Facial Recognition Trailblazer to Stop Using Site's Photos". The New York Times. Retrieved January 26, 2020. Twitter sent a letter this week to the small start-up company, Clearview AI, demanding that it stop taking photos and any other data from the social media website "for any reason" and delete any data that it previously collected, a Twitter spokeswoman said. The cease-and-desist letter...accused Clearview of violating Twitter's policies.
Bonifacic, Igor (February 5, 2020). "Google tells facial recognition startup Clearview AI to stop scraping photos". Engadget. Retrieved February 6, 2020. Following Twitter, Google and YouTube have become the latest companies to send a cease-and-desist letter to Clearview AI, the startup behind a controversial facial recognition program that more than 600 police departments across North American use.
Grind, Kirsten; McMillan, Robert; Mathews, Anna Wilde (March 17, 2020). "To Track Virus, Governments Weigh Surveillance Tools That Push Privacy Limits". The Wall Street Journal. Retrieved March 26, 2020. Clearview A.I. Inc., a facial-recognition startup that has sparked controversy among privacy advocates over its use by police departments, is in discussions with state agencies about using its technology to track patients infected by the coronavirus, according to people familiar with the matter. The technology has yet to be adopted by any agency, but the New York-based company hopes it will be helpful in what's known as "contact tracing"—figuring out who else might have been with a person known to have the virus.
Belanger, Christian (May 12, 2020). "At virtual Booth roundtable, participants warn against hasty embrace of surveillance technology during pandemic". Hyde Park Herald. Retrieved May 13, 2020. Strahilevitz, for his part, alluded to recent news reports that the facial recognition company Clearview AI has offered to help federal and state governments with contract tracing during the pandemic. "When I hear about potential collaborations between the government and Clearview AI to use facial recognition I shudder," he said.
"G2E: New generation of facial recognition enhances security, raises questions". CDC Gaming Reports. Retrieved February 8, 2020. Sattar spoke Thursday at a G2E panel discussion on "Customer Identification Using Facial Recognition Technology: The Future is Now." Also on the panel were Jessica Medeiros Garrison, president of MDM27 Holdings, whose company Clearview offers facial recognition technology to law enforcement agencies
Whittaker, Zack (April 16, 2020). "Security lapse exposed Clearview AI source code". TechCrunch. Retrieved April 19, 2020. Ton-That accused the research firm of extortion, but emails between Clearview and SpiderSilk paint a different picture.
"Clearview AI: When can companies use facial recognition data?". Global News. Retrieved March 10, 2020. On Sunday, the Ontario Provincial Police admitted to previously using Clearview AI, a New York City based facial recognition software company which scrapes billions of images off both public and social media websites.
"London police clear up use of controversial Clearview AI facial recognition technology". 980 CFPL. Retrieved March 10, 2020. "Initial checks revealed that we were not using Clearview. That was wrong," Williams said, adding that after police had a published a statement denying the force's use of the software, a followup investigation revealed otherwise.
Bogdan, Sawyer (May 21, 2020). "London police Clearview AI review reveals 7 officers accessed the facial recognition technology". Global News. Retrieved May 23, 2020. At the London Police Services Board (LPSB) meeting on Thursday, London police Chief Stephen Williams revealed that seven officers accessed the software, with one of those officers using it in an investigation. 'Some of the members were made aware of the Clearview technology at a training seminar in November 2019, and it all surfaced at other training courses and other seminars,' Williams said.
Kaminski, Margot E.; Skinner-Thompson, Scott (March 9, 2020). "Free Speech Isn't a Free Pass for Privacy Violations". Slate. Retrieved March 11, 2020. Hoan Ton-That, the CEO of Clearview AI, a company that sells the use of its facial recognition software to law enforcement, recently claimed that the First Amendment gives the company the right to scrape face photographs on public social media platforms. This claim not only ignores valid concerns about facial recognition technologies—their tendency toward discrimination, their use in pervasive location-tracking, including of activists or dissidents—but also gets the First Amendment wrong.
"Clearview AI Says Facial Photo Data Scrape Claim Is Moot". Law360. Retrieved May 8, 2020. The New York-based company says it's not subject to the BIPA because the alleged wrongful conduct occurred primarily and substantially in New York, not Illinois. It says it is voluntarily changing its business practices "to avoid including data from Illinois residents and to avoid transacting with non-governmental customers anywhere." "Specifically, Clearview is canceling the accounts of every customer who was not either associated with law enforcement or some other federal, state, or local government department, office, or agency," the company said. "Clearview is also canceling all accounts belonging to any entity based in Illinois. All photos in Clearview's database that were geolocated in Illinois have been blocked from being searched through Clearview's app."
"ACLU Sues Clearview AI". ACLU. May 28, 2020. Retrieved May 29, 2020. The lawsuit was filed in Illinois state court on behalf of the ACLU, the ACLU of Illinois, the Chicago Alliance Against Sexual Exploitation, the Sex Workers Outreach Project, the Illinois State Public Interest Research Group (PIRG), and Mujeres Latinas en Acción. The groups argue that Clearview AI violated — and continues to violate — the privacy rights of Illinois residents under the Illinois Biometric Information Privacy Act (BIPA).