PUBLISHED IN DEADLINE DETROIT, AUGUST 14, 2019
Just over a month ago, the Detroit Police Department proposed a series of policies to govern facial recognition technology in police surveillance programs. The policy proposals surprised most Detroiters, who had no idea facial recognition was being used and couldn’t believe that DPD had gone a year without a policy for this capability.
Public outcry over the lack of transparency and the potential threat to privacy apparently caught Police Chief James Craig off-guard, but he quickly assured us that proposed policies would address any concerns. In effect, DPD said: Trust us, this is all to keep you safe.
In a city with too many carjackings, too many murders, too many robberies, it’s tempting to accept DPD’s response, taking comfort in the notion that something – anything – is being done about the crime suffocating too many neighborhoods. Unfortunately, history clearly shows that “trust us” is a hollow promise from law enforcement that ignores reality and avoids the necessary conversation about the limits of police authority.
We’re all in the mugshots
At its most basic, facial recognition technology works by comparing the unique features of a person in one image to images in a database. Chief Craig has compared it to a witness looking through a book of mugshots.
Except the witness is artificial intelligence and can look at everybody – all the time – even when there hasn’t been a crime. And the pictures in the book include you. And your friends and family. Even if you’ve never been arrested. And it can include your real-time location. And driver’s license information. And social media info. And anything else you’ve ever put online. And it’s more likely to misidentify you if you’re black, or a woman, or a child.
Perhaps the best description of the technology’s capabilities comes from the maker of the software DPD uses in its surveillance programs:
The system can be used in a permanent setup or in a mobile application. . . . When a match occurs, users are able to view the probe image from the video at the same time as the match. Agencies can customize what data is available for the matched image from the watch list database. . . .
Your agency can create a watch list database that includes images and information about persons of interest or wanted individuals. Records can be added manually or added from an existing database. Real-time facial recognition searches can be run against this database.
This is what the software can do. DPD assures us that it will be responsible in its surveillance of Detroiters and will not use the system’s full capabilities, despite having spent $1 million on the program.
Don’t ask for proof
Trust us, DPD says, we will only use facial recognition technology when there is a criminal investigation — because there is always a criminal investigation.
Trust us, we won’t use it on live video — though we might let the State Police or FBI do so.
Trust us, we won’t use it to enforce immigration laws — though we might let ICE do it.
Trust us, real, live humans review every image — so we don’t have to keep track of how many mistakes the technology makes.
Trust us, our surveillance programs are working — just don’t ask for proof.
Most of us think of our rights as something the government gives us, but really, it’s the opposite. By design, our freedoms are rooted in those spaces beyond the government’s reach.
The U.S. Constitution protects our rights to free speech, a free press and peaceful assembly with these words: “Congress shall make no law . . .” Article I of the Michigan Constitution begins: “All political power is inherent in the people.”
Balancing the boundaries
In the real world, this plays out as a balancing act, and governance is largely a conversation about the boundaries of government action. When it comes to law enforcement, the government is always pushing those boundaries, seeking authority to do more with less oversight. We must constantly ask ourselves: How much power are we, the people, willing to give the government to keep us safe? And what are the risks of the government taking more than we are willing to give?
It doesn’t take much imagination to see how law enforcement might push the boundaries too far. In fact, it doesn’t take any imagination at all.
Google the words COINTELPRO, NSA warrantless surveillance and Ghetto Informant Program — we’ve already seen surveillance powers abused. The ACLU found that Baltimore police used facial recognition to identify and arrest protesters with outstanding warrants who had taken to the streets after the 2015 death of Freddie Gray. In Detroit, a city known for civil rights activists, labor unions, immigrant rights organizations, and a thriving Muslim community, how long before something similar happens?
In 2016, Mayor Mike Duggan said: “We’re going to be able before too long to match outstanding warrants against these cameras.” Though that’s not the stated goal today, how long before it is?
The policies and procedures for facial recognition technology proposed by DPD are wholly inadequate. But even if they were more comprehensive, once the infrastructure is in place, there is no turning back. That’s why cities across the country are banning this technology. That’s why Michigan legislators seek a pause in its adoption.
The software poses a unique threat to the pillars of freedom – privacy and protest – and once we’ve destroyed them, they won’t be easy to rebuild.