The company focuses on biometrics storage and matching, including fingerprints, palm prints, irises, tattoos, and mugshots.[2]
Face Watch can continuously detect faces in live video streams, recognizing faces in individual video frames and cataloging timestamps.[1]
FACE Plus is the company's photo (still image) facial recognition program. It includes advanced filtering and can reconstruct a 3D model from photos to correct their angle, a feature called pose correction.[1]
DataWorks uses facial recognition algorithms from NEC,[2] Rank One Computing (of Colorado, CEO Brendan Klare),[2] and Cognitec.[6] Both the NEC and Rank One algorithms showed algorithmic bias in a NIST study.[7][2] DataWorks' EVP and GM, Todd Pastorini, told The New York Times that although the company does not formally measure accuracy or bias, DataWorks has "become a pseudo-expert in the technology".[2][6]
The Los Angeles County Sheriff's Department uses Cognitec's algorithm, acquired in 2008 on a seven-year contract, and signed a seven-year, $3.5 million contract extension in 2015. The renewal was approved unanimously by the Los Angeles County Board of Supervisors.[11] In addition to Interconnect, Los Angeles County has access to fingerprinting, facial recognition, tattoo matching (2 million images or templates), composite drawing, and access to DMV images in the unrelated Cal-Photo system.[12]
Other California uses
The San Diego County Sheriff's use of DataWorks dates to at least 2007, when a report discussed DataWorks installing a trial of Face Plus for facial recognition. By 2010, the facial recognition system was in place and being upgraded to use Cognitec's algorithm.[13][14]
The San Francisco Police Department's most recent three-year contract, signed in 2017 for $150,000 per year,[15] included what was labeled FR Software, a Face Plus server, and the "Mugshot database".[16] In May 2019, San Francisco banned government use of facial recognition software. Pastorini said that the company's tools do not use neural networks or machine learning like Microsoft's Face API or Amazon Rekognition, stating, "The Amazon searches are not the best forensic searches". He also called the ban unfortunate because there was no evidence of misuse in San Francisco.
DataWorks says that it has sold over 5,000 fingerprint devices in the state of Florida.[1]
Michigan
DataWorks has worked with the Michigan DMV and the Michigan State Police since 2001. Their database contains at least 8 million criminal images and 32 million DMV photos.[2] DataWorks and the Michigan State Police integrated with the FBI's Next Generation Identification facial recognition pilot; later, the Maryland Department of Public Safety's system was added to the FBI system. The system includes 25 agencies and over 1,000 users.[17][1]
The City of Detroit and the Detroit Police Department, with access to at least 500,000 mugshots, signed a three-year, $1 million contract with DataWorks for "FACE Watch Plus real-time video surveillance".[1][18] It consumes feeds from Project Green Light, a network of over 500 cameras on public and private property,[19] including cameras at stoplights as well as gas stations, pharmacies, health clinics, churches, apartments, hotels, and (beginning in 2018) schools.[20] The 2017 DataWorks contract references use with 100 video feeds. Additionally, the Crime Intelligence Unit is licensed to use Michigan's Statewide Network of Agency Photos (SNAP) with DataWorks, adding access to DMV photos. DataWorks is integrated with Motorola's Command Central Aware Console.[21][22]
In a June 29, 2020 meeting, Detroit's police chief said DataWorks failed to make a correct identification 96% of the time. A Detroit police report tallied 70 images run through DataWorks facial recognition year-to-date by June 22, 2020. At least 68 of the 70 were of Black people, and 65 of the 70 were of men. DataWorks' Pastorini said the company does not keep statistics, nor does it tell its customers how to use the software.[23]
In February 2023, a Black woman, eight months pregnant, was arrested for carjacking based on a DataWorks facial recognition match.[4] This was the third wrongful arrest in Detroit involving the technology.[5]
Chicago
The Chicago Police Department and Chicago Transit Authority use "FACE Watch Plus" (real-time recognition) on Chicago's security camera network of approximately 20,000 video cameras, integrated with Genetec's Omnicast. DataWorks noted it provides both "Real Time Screening" and "Facial Recognition". The system includes 7 million criminal photos; the department states it uses "the system primarily to solve crimes using probes generated from street cameras, Facebook, and other sources."[20][17][1][2]
New Zealand
A national system upgrade in New Zealand is scheduled to be completed by late 2020 for an estimated NZ$5 million. It would include facial recognition from security camera still images, criminal images, firearms license holders, missing persons, and registered sex offenders. DataWorks' Pastorini said "we don't make accuracy statements" when asked about the system's reliability. New Zealand's Privacy Commissioner was not aware of the installation, and when asked, Pastorini said he was not familiar with New Zealand's Privacy Act.[25][29][30]
^Aaron Mondry (8 July 2019). "Criticism mounts over Detroit Police Department's facial recognition software". Curbed Detroit. Retrieved 24 June 2020. In March this year during his State of the City address, Mayor Mike Duggan announced the "Neighborhood Real-Time Intelligence Program," a $9 million, state- and federally-funded initiative that would not only expand Project Green Light by installing surveillance equipment at 500 Detroit intersections—on top of the over 500 already installed at businesses—but also utilize facial recognition software to identify potential criminals.
^Jason Koebler (29 June 2020). "VICE - Detroit Police Chief: Facial Recognition Software Misidentifies 96% of the Time". vice.com. Retrieved 30 June 2020. "If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time," Craig said. "That's if we relied totally on the software, which would be against our current policy … If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify."