
December 27, 2021

South Florida Police Widely Use Facial Recognition, yet Resist Policies To Curb Abuse. That’s a Problem for People of Color.

collage of individuals identified by facial recognition software on security cam footage in public

As police AI surveillance tech expands, what's the impact for minority communities?

Video camera footage compilation
Facial recognition software often relies on grainy still shots pulled off security cam footage, known as “wild” images, or “faces in motion in the real world,” such as these examples from a Pinellas County Sheriff’s Office training presentation. Image by Pinellas County Sheriff’s Office. United States, 2021.

In South Florida, failure to buckle a seatbelt can be enough for police to run a person through a facial recognition database. So can harvesting saw palmetto berries while trespassing on protected lands. Or being accused of stealing plants from a garden store. Or planning a peaceful Black Lives Matter protest against police brutality.

Despite a rising national trend to limit or ban the use of facial recognition — especially related to minorities, misdemeanors, and First Amendment activities — police agencies in South Florida widely deploy the troubled technology on residents via one of the nation’s oldest and largest image databases, Face Analysis Comparison & Examination System, or FACES.

And in two major police agencies, the majority of those Floridians are Black, according to a year-long Pulitzer Center-South Florida Sun Sentinel investigative project.

A review of thousands of images in which race was identified revealed that about 80% of scans by the Broward Sheriff’s Office and nearly 60% run by the Palm Beach County Sheriff’s Office involved Black individuals. Such rates exceed the region’s Black population and rates of arrest.

It’s a striking paradox: A technology that federal studies have shown misidentifies African Americans at higher rates is being used more often on people who are Black.

Critics say facial recognition is not ready for prime time, and that usage in communities of color, especially, poses risks to privacy, civil rights and liberties. Some compare expanding police use to a prison-like “panopticon,” an all-seeing surveillance network in the real world.

“Cameras are everywhere. And we’ve never seen this type of computing speed, data mining speed,” said Brian Hofer, executive director of Secure Justice, who has consulted with cities, such as San Francisco, on facial recognition policies and bans. “Police have never had this much power, and it’s an unjustified, dangerous shift in the balance of power between the government and the governed.”

South Florida police used facial recognition more than a dozen times during protests that cascaded across the region after a Minneapolis police officer killed George Floyd in May 2020. As people peacefully protested at Fort Lauderdale’s Huizenga Park, police scanned images of a “possible protest organizer,” and their various “associates.” No crimes were cited for the scans, as reported in an exposé this June by Pulitzer Center and Sun Sentinel journalists.

Using Florida’s public records laws, journalists dug deeper and found that Palm Beach and Broward sheriff’s offices combined are among the most prolific users statewide, with nearly 9,000 FACES database scans run over nearly a year and a half, from February 2020 through June 2021 — a pace rivaling densely populated urban areas like New York City.

Yet both agencies refuse to create policies governing searches, against even the recommendations of federal law enforcement agencies.

Overall, transparency is scarce. Following last summer’s revelations of facial scans during protests, the Fort Lauderdale Police Department is now moving to adopt a facial recognition policy for the first time. The West Palm Beach Police Department claimed it did not use the technology at all, but records reveal that it does.

Police are collecting people’s images from a rapidly growing number of sources: Ring doorbell cams, private or public security cameras, social media platforms, police smartphones, and evolving higher-def “inputs,” such as automated license plate readers and body worn cameras. Detectives or crime analysts upload those images into FACES, a database of some 13.5 million Florida mugshots and 25 million driver’s licenses and IDs, and get back a photo gallery of possible matches, including identifiers such as names, addresses, and birthdates. That means nearly any Floridian could pop up on a computer screen as a “possible candidate” in a law enforcement investigation.

The little-known trove of FACES records disclosed via public records requests offers a unique glimpse into the reasons cops cite for deploying the tech: “illegal guns,” “Home Depot thefts,” “Ring Video,” and “graffiti tagger.”

Disparate impacts on people of color

Fort Lauderdale Police Chief Larry Scirotto, brought on board in August, says strict police oversight is needed for the technologies they deploy, especially “facial recognition, something that can be either misused or not used for the correct purposes.”

U.S. Department of Justice officials agree, noting there’s no “uniform set of rules” in the country, and that a dearth of policies “raises concerns that law enforcement agencies will use face recognition systems to systematically, and without human intervention, identify members of the public and monitor individuals’ actions and movements,” according to a 2017 report.

Federal law enforcement, legislators, and privacy or civil rights advocates have recommended, among other measures, that police who use facial recognition or similar AI tech monitor for demographic bias, including audits to safeguard civil rights, in an effort to prevent “inaccurate, unfair, biased or discriminatory decisions impacting Americans.”

The Broward Sheriff’s Office, like other agencies, says it uses facial recognition for leads and not as a sole basis for an arrest. “FACES is but one of many investigative tools,” BSO officials noted in a statement. The department has no policy on facial recognition use. In general, the agency said, it trains officers to avoid “misuse of electronic databases,” and “the goal, as always, is to identify and arrest people who commit crimes that victimize others.”

The Palm Beach County Sheriff’s Office did not respond to requests for comment for this story. Previously, officials replied regarding policy issues: “PBSO is NOT looking to adopt a facial recognition policy,” spokesperson Teri Barbera wrote via email.

In several cases, people who were run through FACES had no crimes in court or state criminal records at the time other than traffic infractions. Others accused of petty theft were pulled into the court system, then declared incompetent to stand trial. Hispanic people were also run regularly through the database, which lists U.S. Immigration and Customs Enforcement among its partners. The Palm Beach County Sheriff’s Office, which ran more than 550 scans denoted Hispanic, also listed “HSIN,” the Homeland Security Information Network, among its search reasons, raising the specter of racial profiling to check immigrant status, which has sparked controversy elsewhere in Florida.

For consistency, this review compared racial data recorded in FACES to demographic arrest data tallied for those agencies by the Florida Department of Law Enforcement (FDLE), which does not track Hispanic or ethnic data. In terms of race, the Broward and Palm Beach sheriff’s offices’ FACES scan rates for Black individuals exceeded the agencies’ already disparate average arrest rates by more than 15% in recent years, records showed. For both sheriff’s offices, police entered no racial or ethnic data nearly half the time, a practice that makes it hard to track how facial recognition affects people of color overall.
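The comparison described above can be sketched in a few lines of code. The field names and figures below are hypothetical placeholders, not the actual FACES log format or the investigation’s real data, but they illustrate the shape of the analysis, including how missing race entries undermine it:

```python
from collections import Counter

def scan_share_vs_arrest_share(scan_logs, arrest_shares):
    """Compare each group's share of facial recognition scans
    to its share of arrests, and count scans with no race recorded.

    scan_logs: list of dicts, each with a (possibly absent) "race" field.
    arrest_shares: dict mapping race -> share of arrests (0..1).
    """
    recorded = [s["race"] for s in scan_logs if s.get("race")]
    missing = len(scan_logs) - len(recorded)
    counts = Counter(recorded)
    report = {}
    for race, n in counts.items():
        scan_share = n / len(recorded)
        report[race] = {
            "scan_share": round(scan_share, 3),
            "arrest_share": arrest_shares.get(race),
            # Positive gap: scanned at a higher rate than arrested
            "gap": round(scan_share - arrest_shares.get(race, 0), 3),
        }
    return report, missing

# Hypothetical figures, loosely echoing the disparities reported above:
# 80% of race-recorded scans involve Black individuals, versus a 65%
# arrest share, and half of all scans carry no race data at all.
logs = [{"race": "Black"}] * 80 + [{"race": "White"}] * 20 + [{}] * 100
report, missing = scan_share_vs_arrest_share(logs, {"Black": 0.65, "White": 0.35})
# report["Black"]["gap"] -> 0.15 (scan share exceeds arrest share by 15 points)
# missing -> 100 (half the scans had no race recorded)
```

The `missing` count matters as much as the gap: when agencies leave the race field blank on half their scans, any disparity estimate can only be computed on the records that happen to be labeled.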

“It’s important to understand whether our policing systems are disproportionately focused on people who have been subjected to over-policing and marginalization, whether it’s compounding historical bias,” says Clare Garvie, co-author of a pivotal 2016 Georgetown University Law study titled The Perpetual Line-Up. “And facial recognition is part of that.”

In recent years, the tech has increasingly come under assault, including bans or moratoriums in more than 20 cities, and limits in several states. Facebook recently announced plans to shut down its facial recognition system. In November, President Joe Biden proposed an AI Bill of Rights, and a facial recognition moratorium bill is winding through the U.S. Congress. Bill co-sponsor Senator Edward J. Markey, D-Mass., noted: “We do not have to forgo privacy and justice for safety.”

Several South Florida police agencies declined to answer detailed questions about how they deploy the technology, including restrictions, how often used, or related arrest outcomes. The lack of transparency extends further into the halls of justice.

Palm Beach County Chief Assistant Public Defender Daniel Eisinger, echoing defense attorneys here and elsewhere, said he rarely sees facial recognition cited in police investigations within criminal case disclosures.

Eisinger’s office defended a young woman arrested on burglary charges in sporadic looting during the 2020 protests. Nakeria Harris, of Belle Glade, was identified based on security camera footage, police said. FACES records reveal Palm Beach County Sheriff’s Office also ran a facial recognition scan in her case, which included a smash-and-grab at the MÜV Medical Cannabis Dispensary in Wellington. Harris confessed to police, breaking down in tears.

Her defense team, however, would like to have known if facial recognition played a part in her arrest. “It’s tough on us that we don’t even know that they’re using it,” Eisinger said, noting that a defense team could bring in experts to question whether facial recognition might have been faulty, and possibly challenge the identification.

In another case, a woman was accused of stealing towering tropical plants and a potato bush from Amelia’s SmartyPlants in Lake Worth. The store’s security video captured images of a blond woman in a pink dress pushing the plants in a cart and walking to a car on Aug. 21, 2020. Four days later, Eva Horvath Gonda’s name turned up in the FACES database, records show.

A woman shown on a security camera pushing a cart of large plants
Who is she? Four days after a woman was caught on a Lake Worth store’s security camera, the name Eva Gonda turned up in the FACES database. Image by SmartyPlants. United States, 2021.

PBSO officers said they identified Gonda based on video footage, a car license plate number, and witness ID, records show. It’s unclear whether facial recognition played a part in the case, which also ended up in Eisinger’s office. Yet how an investigation starts can affect how it goes.

Gonda was adamant that it wasn’t her, choosing to plead not guilty, and even demanding a jury trial in the $235 plant-theft case, which is pending.

“We know how witness identification can be erroneous, with a cop bringing a photo to a witness,” Eisinger says. “They get anchored into it and think ‘that’s gotta be her.’”

“My instinct is to worry there are many, many other cases out there they’re using it in,” Eisinger adds, “and we aren’t told.”

Police have countered that law enforcement is not required to disclose facial recognition use in court cases since such scans offer a lead, or “tip,” much like other investigative means. And judges’ rulings on such issues have been mixed.

Many cases rely on the initial photo’s quality, often grainy still shots pulled off security cam footage, known as “wild” images, or “faces in motion in the real world.”

How facial recognition works, or doesn’t

The technology scans people’s faces for biometric markers—spacing of eyes or width of a nose—for possible matches. Even police leaders note limits on how well it works: accuracy drops for people in motion and in low-light scenarios, and look-alike faces can produce false matches, according to an October Major Cities Chiefs Association report. That report recommended best practices for police facial recognition technology use, including sharing “documented results” of a facial recognition investigation as part of discovery in criminal court cases.
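The matching step can be illustrated with a toy sketch: a face is reduced to a numeric feature vector, and candidates are ranked by how similar their vectors are to the probe image’s, so a look-alike can score above the match threshold. The vectors, names, and threshold below are invented for illustration and bear no relation to FACES or any vendor’s actual algorithm:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_candidates(probe, gallery, threshold=0.9):
    """Return gallery entries scoring above the threshold, best first.

    probe: feature vector extracted from the "wild" image.
    gallery: dict of name -> enrolled feature vector (mugshot/ID photo).
    """
    scored = [(name, cosine_similarity(probe, vec)) for name, vec in gallery.items()]
    return sorted(
        [(name, score) for name, score in scored if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Made-up vectors: "suspect" and "lookalike" are deliberately close,
# so both clear the threshold -- a false match rides along with the true one.
gallery = {
    "suspect":   [0.90, 0.10, 0.40],
    "lookalike": [0.88, 0.15, 0.42],
    "unrelated": [0.10, 0.90, 0.20],
}
probe = [0.89, 0.12, 0.41]  # grainy still from security footage
matches = rank_candidates(probe, gallery)
# matches contains both "suspect" and "lookalike", in that order
```

This is why agencies describe results as investigative leads rather than identifications: a ranked gallery of “possible candidates” can contain innocent look-alikes, and a degraded probe image only widens that pool.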

A misidentification can have repercussions. In a 2019 National Institute of Standards and Technology (NIST) analysis, higher rates of false positives were prevalent for people of color, especially Black women. Wrong matches are concerning “because consequences could include false accusations,” the researchers wrote, though the accuracy of algorithms has been improving. In recent years, three Black men in New Jersey and Detroit were wrongfully arrested based on facial recognition.

Image quality comparison showing poor, nominal, and good quality examples. In the poor-quality example, the face spans fewer than 50 pixels, sits at an odd angle, is unevenly lit, and the frame has heavy artifacts and background objects. In the good-quality example, the face spans over 100 pixels, looks directly at camera level, is well lit, and has no background content.
Facial recognition software often relies on grainy still shots pulled off security cam footage, known as “wild” images, or “faces in motion in the real world,” such as these examples from a Pinellas County Sheriff’s Office training presentation. Image by Pinellas County Sheriff’s Office. United States, 2021.

The Pinellas County Sheriff’s Office credits FACES in hundreds of “successful outcomes” for helping find suspects or witnesses. South Florida police have recently used the database to help identify culprits ranging from disorderly drunks to murderers, officials said.

A Palm Beach County identity-theft investigation was recently aided by a police search of FACES, Pinellas officials said. In April, a woman was arrested on charges she rented an apartment at the Maridadi Apartment Complex in West Palm Beach in the name of another woman. The victim had received eviction notices and thousands of dollars in past-due rent bills. Gerlene Sterlin was accused of using a fake driver’s license that displayed her photo to rent the place. With an “investigative search of the photo,” Palm Beach sheriff’s detectives identified her as a suspect, according to the arrest report. Sterlin has pleaded not guilty.

Pinellas County Sheriff Bob Gualtieri, who helped spearhead the pioneering FACES database when the Pinellas County Sheriff’s Office launched it 20 years ago, says his agency has not found major problems on random audits since 2019 of nearly 300 police agencies using the database, and that auditors have mostly tagged missing police case numbers for searches.

Yet Gualtieri says individual agencies need to create their own written policies for facial recognition, especially large law enforcement agencies. “People need to know what the rules are in order to follow the rules, so they know what to do and also so the community knows what to expect,” Gualtieri says. “We should be very transparent because a lack of transparency breeds suspicion and suspicion breeds contempt.”

Fort Lauderdale to tighten rules

Police agencies that use FACES are more wide-ranging than people might think. The FBI and IRS are among FACES’ 269 local, state and federal partners, which search the database an average of 5,600 times each month, Pinellas officials said. Other users range from Biscayne National Park to Palm Beach County Schools police, as well as campus police at state universities including the University of Florida and Florida Atlantic University, according to a list of agencies released after a public records request.

A lack of transparency or policies can stymie even those who work within police agencies. The West Palm Beach Police Department repeatedly said it did not use facial recognition, with Assistant Chief Tameca West writing via email in recent weeks: “We do not and have never used Facial Recognition software nor are there any plans to use it in the future.”

Yet FACES records show that agency ran nine image scans during Black Lives Matter protests in 2020, with one search for “Identify Social Media Suspect” and another for “Identification Protest Leader” on May 31, when protests were underway. Only after further pressing did the agency note they use FACES. West says she was initially unaware department investigators “used a form of facial recognition.”

Accountability is a key issue for those pressing for written policies. In July, Christina Currie, an attorney and chair of Fort Lauderdale’s Citizens’ Police Review Board, brought the article on facial recognition use on protesters to a meeting with city leaders. She has been pressing ever since for Fort Lauderdale police to adopt a strict policy on the tech.

“It’s important to make sure this is being used for the purposes intended,” Currie said at the time, “that it’s not misused out of idle curiosity or abuse.”

Headshot of attorney Christina Currie in a green dress at a street intersection
Christina Currie, an attorney and longtime chair of Fort Lauderdale's Citizens' Police Review Board, is pressing Fort Lauderdale police to adopt a policy governing officers’ use of facial recognition technology. Image by Mike Stocker/South Florida Sun Sentinel. United States, 2021.

Chief Scirotto says facial recognition “should never be used for the identification of those involved in First Amendment practices.” He adds that police should get community input before adopting policing technologies: “We have missed the boat in that regard.”

FLPD’s recently released policy draft would bar officers from using facial recognition to surveil people based solely on race and ethnicity, religious beliefs, sexual identity or during “constitutionally protected activities.” Other oversight measures, which Currie says are a good start, include: having supervisors approve search requests to limit use “to investigative purposes,” as well as warning police against “succumbing to confirmation bias and focusing solely on a ‘top’ search result.”

Scirotto says facial recognition can be helpful in cases where “we have very little information to be investigating further or to develop potential suspects in a crime.” In terms of looking at racial bias issues, his agency will conduct quarterly audits, which include tracking demographics as the department does for traffic stops. The policy draft also notes officers should be aware of prior research on “algorithmic biases toward people of color.” Says Scirotto: “It matters that we’re accountable.”

Protest organizer Robert Holness, whose Black civil rights group, Leaders of Liberty, was specifically listed in FACES records, is not convinced: “It sounds like they’re taking a step to make it transparent, but I don’t see the point of this in the first place.”

Holness, whose organization is focused on community outreach such as winter toy drives, adds: “What’s the need for a system like that, which is more Big Brother-ish than anything?”

A tool wielded too often?

The Broward and Palm Beach sheriff’s offices ran nearly 9,000 searches during the year and a half reviewed. At that rate they are nearly keeping pace with New York City, which has roughly twice the population of Broward and Palm Beach counties. New York ran 22,000 scans over three years, according to public documents cited in a report by Amnesty International.

The Palm Beach County Sheriff’s Office, which logged the vast majority at more than 6,200 searches and averaged 12 searches a day, runs a regional fusion center that allows intel sharing between local, state, tribal, and federal agencies. Fusion centers, launched after the 9/11 attacks to combat terrorism and human trafficking, have drawn fire from states like Maine for surveilling racial justice advocates and environmental activists.


Some police agencies now limit facial recognition use to felony crimes, including in Detroit after a Black man was wrongfully arrested on a shoplifting charge, in front of his wife and two young daughters, based on a faulty facial recognition match. The man has since sued that department for damages.

Pinellas’ Gualtieri said he has no problem with police using FACES for misdemeanors as well as felonies. “I don’t see a difference. You don’t know what you have when you stop someone for speeding and they play ‘the name game,’” that is, refusing to state their correct name. “You don’t know what you don’t know.”

Yet the Pulitzer Center-Sun Sentinel review of court records and state criminal histories indicates several people were run through FACES after they were cited for issues that weren’t even misdemeanors, such as a red light camera violation, driving without headlights, or “Failure to Pay” fines. It remains unclear how database images or information yielded might have been used.

“You’re gonna find imperfections, unfortunately,” says Gualtieri. “There is room, and I said this all along, there is room to tighten this up.”

Critics especially question the use of high-end police tech for low-level incidents like traffic tickets, trespassing or “suspicious person,” which they say can fuel poverty and incarceration cycles. Hofer, of Secure Justice, notes that arrests for “minor stuff just makes the problem even worse.”

Currie says the deployment of policing technologies, even if helpful at times, should not cause more harm. “The harm is that more people in certain communities will bear the brunt of law enforcement interactions. And that’s not okay.”

Joanne Cavanaugh Simpson, a Pulitzer Center grantee, is a freelance journalist and former staff writer at the South Florida Sun Sentinel and Miami Herald. Her reporting project explores the expanding use of police surveillance technologies in urban communities, including the Baltimore region and South Florida.

South Florida Sun Sentinel staff writers Brooke Baitinger and Spencer Norris contributed to this report.