Virtue-Signaling Connecticut Town Might Accidentally Ban Its Local Officials From Using iPhones

September 18, 2021

Earlier this month, a 72-year-old man was attacked and robbed while walking down the street in Hamden, Connecticut. The assailant was caught on a nearby surveillance camera, but police have been unable to identify a suspect.

While the image in this case is too grainy to be of much use, the growing ubiquity of high-resolution cameras (in phones, dashcams, bodycams, and more), coupled with highly accurate facial recognition systems, means that criminals have a much harder time staying anonymous when they commit crimes in public, especially as algorithms improve, image quality rises, and costs decline.

However, a growing number of professional anti–facial recognition activists—led by the ACLU, which routinely fundraises off this issue—continue to lobby local governments for facial recognition technology bans. They falsely argue that the technology is racist, sexist, and a threat to free speech, and they dismiss or ignore all evidence showing how the technology keeps people safe.

Unfortunately, some local officials have fallen for these arguments hook, line, and sinker, and as a result they have proposed hasty laws before investigating the issue thoroughly. Many may later find they end up with a bad case of buyer’s remorse. Hamden, Connecticut, looks to be the next such case study in how a rushed policy can do more harm than good.

Two members of the Hamden Legislative Council have introduced an ordinance that would ban all local officials from using facial recognition technology. Ostensibly, the ordinance is a reaction to reports that, more than a year ago, the Hamden Police Department trialed facial recognition software from one vendor, but decided not to purchase it. In other words, the council is proposing to ban technology that local law enforcement is not even using.

Tying the hands of police by prohibiting them from using technology to help identify individuals in photos would be a mistake. First, while local police are not using facial recognition today, they could certainly find themselves needing it in the future as technology changes. The ban is written so broadly that it would even prohibit police from using a computer database to simply narrow their searches, such as by filtering a database of booking photos to only those with blue eyes. Second, the blanket ban would prevent law enforcement from using facial recognition not only to identify suspects but also to identify victims or witnesses. Third, the law leaves no room for exceptions or exigent circumstances, such as a school shooting, a child abduction, or a terrorist incident. Law enforcement will not have time to ask the council to change the law when the clock is ticking.

Perhaps worst of all, this proposed ordinance is not restricted to law enforcement. It does not appear to have been thought through: under its terms, no Hamden official, including “any officer, employee, agent, contractor, subcontractor or vendor,” would be permitted to “obtain, retain, access or use” facial recognition technology. Yet biometrics are often used to secure facilities and as a factor in multi-factor authentication for online services, so the ban is also bad news for any official who uses their face to unlock a phone or computer at work. Like other cities that have leaped before looking, the council has effectively banned iPhones.

The irony here is that the Hamden Police Department does not currently use facial recognition, so there is no need to rush to regulate. Nor does it seem to be a priority for local constituents: At the most recent meeting, only 2 out of 11 residents who submitted comments discussed the facial recognition proposal. Most were concerned about a tree ordinance and funding for a digital program at the local library.

If the council is truly concerned about the issue, it should take time to learn more about the technology, the controls in place to prevent abuse, and the accuracy of the systems on the market today. It would find that facial recognition technology is highly accurate, outperforming humans at similar tasks with less bias, and that, as with most policing, good oversight and accountability are key to preventing missteps. Instead of a ban, the council could set performance and testing standards for any facial recognition technology acquired by local law enforcement, ensure proper training and auditing, and establish transparency requirements. Any of those steps would be more reasonable than an outright prohibition.

The council has not yet held a final vote on its ordinance, so perhaps it will take time to think this issue through. Meanwhile, this flawed attempt at policymaking should be a wake-up call for the Department of Justice that it needs to act sooner rather than later to provide more guidance and recommendations to local officials so they can make policy based on the facts and evidence, not just the false claims of advocacy groups.
