Is NYPD's 'SnitchBot' A Glimpse Into The Future Of Racist Policing?

The "Black Mirror"-esque "dogs" were creepy as hell — but we've got more to worry about.
New NYPD policing technology, including "Digi Dog" (center) and a K5 Autonomous Security Robot (ASR), are pictured during a press conference in Times Square on April 11 in New York.
New York Daily News via Getty Images

New York City Mayor Eric Adams and Police Commissioner Keechant Sewell have been pretty enthusiastic about the rollout of three new robocops, which Sewell called tools to “safeguard our modern city and a forward-looking world.”

The rest of us, however, are quietly panicking about a potentially “Black Mirror”-esque future where robots roam the streets, carrying out what will inevitably be some pretty racist policing.

“Digidog,” one of the robots, is rejoining the NYPD after having its contract cut short in 2021 due to overwhelming complaints that it was “creepy.”

“A few loud people were opposed to it, and we took a step back,” said Adams during a press conference last Tuesday. “That is not how I operate. I operate on looking at what’s best for the city.”

The four-legged, remote-controlled robot is expected to make a comeback as a “safe way” for police to intervene during intense armed-hostage situations. Two 70-pound robodogs will cost the department $750,000 and are reportedly equipped with cameras, lights and two-way communication technology, allowing police to collect surveillance footage and determine whether a potentially volatile situation is safe for police intervention.

During its initial pilot, Digidog was used once during a Bronx home invasion where two men held victims at gunpoint and burned one victim with a hot iron, according to The New York Times. Both suspects reportedly fled the scene before cops arrived.

The NYPD is also planning to pilot two other surveillance bots, one of which is the K5 Autonomous Security Robot. Some are calling this guy a “snitchBOT,” because it’s equipped with dozens of microphones, 360-degree cameras, sonar and a license plate reader. The plan for it is to autonomously patrol Times Square and subway stations, which doesn’t sound terrifying and disaster-prone at all.

It’s hard to determine how much “RoboCop”-style crime-fighting these innocuous-looking snitches will do, but let’s talk about the red flags first. One huge concern is whether these robots will show a racial bias and target Black and brown people — an issue rampant in policing and artificial intelligence. Just a few months ago, a man named Randal Reid was arrested and jailed for days in Atlanta when facial recognition technology incorrectly matched him as a suspect for a crime in Louisiana. It happens all the time — facial recognition tech does not have a great track record of telling us Black and brown people apart.

According to Harvard University, facial recognition used by police departments relies on mugshot databases, which are disproportionately Black and Hispanic (thanks largely to historically racist policing practices). History appears to be repeating itself in a different medium, resulting in some very dangerous inaccuracies. Using robots as law enforcement surveillance only reinforces the methods used to disproportionately strip Black people — and everyone else, actually — of their rights and humanity.

When we reached out to the NYPD’s press department for comment, a rep replied with a link to the above-referenced press conference. At that conference, Sewell said: “There is a human being behind, and responsible for, every mechanism we use. That is our approach to any technological implementation, including the public safety tools being introduced today.”

While these bots are being launched in New York City, we know other regions’ precincts aren’t far behind, so we need to talk about the repercussions now. Transparency is the minimum here: We need to know how the AI works, and what the city’s plan is to combat faulty facial recognition. Otherwise, we can expect Digidog, the K5 and other law enforcement AI to be exactly what they sound like: expansions of racist policing practices.
