While you are most likely not a criminal, the police probably have a digital version of your face on file. This is because most states put driver’s license photos into a database accessible to the police and, one would assume, the federal government. The system works in conjunction with facial recognition software (such as Facebook uses to identify people in your photos) to identify suspects. For example, if someone robs a convenience store and the police do not recognize them, the image from the surveillance camera can be matched against the database that could contain your face. Ideally, the software would generate a short list that includes the perpetrator. Problematically, it could instead generate a list of innocent people who might then end up in unpleasant interactions with the state.
There are, of course, some practical issues with the current technology. One is that the photos the police have of suspects tend to be of low quality, which makes false matches more likely. Another is that a database this large will contain many people who look alike, so the chance of a false match is high even with good photos. And, as anyone familiar with the DMV knows, driver’s license photos themselves vary greatly in quality and consistency, compounding the problem.
The current software also has more trouble with darker skin, making false matches more likely for people of color than for white people. While some might suspect racism or bias at work, it has been claimed that this occurs because darker skin offers less contrast than lighter skin, making accurate matches more difficult. If this technical issue cannot be solved, then it is almost certain that there will be charges of racism and bias as darker-skinned people are subjected to false matches more often than lighter-skinned people. Even if this is purely a technical issue with no actual bias, it would certainly create the impression of bias and feed the view that policing in America is biased. It also raises a moral concern about the consequences of using such software: while it might have the benefit of assisting the police in finding actual criminals, it could do harm by fanning the existing flames of mistrust and worries about police bias against people of color. These factors would need to be balanced against each other, at least until the recognition disparity is solved.
In addition to the specific concern about the recognition of darker-skinned people, there is the general concern about the accuracy of the software in identifying anyone. Since most people with driver’s licenses will be in the database, innocent people will end up being investigated by the police because the software pegs them as adequately resembling a suspect. While most such interactions would presumably be quick and harmless, interactions with the state can go very badly indeed, even for innocent people. As such, due moral consideration should be paid to this fact.
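The scale of this worry can be made concrete with some simple base-rate arithmetic. The numbers below are purely illustrative assumptions, not claims about any actual state system or any vendor’s error rates:

```python
# Illustrative base-rate arithmetic (hypothetical numbers, not vendor specs):
# even a seemingly accurate matcher can flag many innocent people when a
# single search is run against a database holding most licensed drivers.

def expected_false_matches(database_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged per search."""
    return database_size * false_match_rate

# Suppose (hypothetically) a state database holds 10 million license photos
# and the software wrongly matches an innocent face 1 time in 10,000
# comparisons.
hits = expected_false_matches(10_000_000, 1 / 10_000)
print(int(hits))  # 1000 innocent people on the candidate list per search
```

Even with these generous assumptions, each search would surface a thousand innocent candidates, which is why the quality of the follow-up investigation matters as much as the software’s headline accuracy.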
There are, of course, the usual concerns about privacy and intrusion by the state. While some citizens are terrified of the idea of a national database of guns, what is being constructed is an even more invasive database: a database of our faces. A “facebase,” if you will. As such, those who are dedicated to Second Amendment rights should be worried about this “facebase.” Others who are concerned about privacy and the overreach of big government should also be worried and should insist that proper controls and limitations are in place to protect the rights of citizens.
It could be countered that people with nothing to hide have nothing to fear—but this slogan fails to address the legitimate concerns about privacy. After all, no one who is worried about a national database of guns would be content with being told that if they have their guns legally, then they have nothing to fear from such a database.
A better counter is to appeal to the positive consequences: by giving up privacy rights and becoming part of a “perpetual lineup,” we will be safer from criminals and terrorists. This argument does have considerable appeal, but it must be assessed properly in terms of what the approach yields in benefits and what it costs in intrusion and other harms. Americans have, in general, been far too quick to give away real rights and suffer real harms in return for the illusion of safety. We should stop doing this. One useful approach is to imagine that what is being given up is a right to which one has a strong emotional attachment; this helps offset the emotional appeal of fear of criminals and terrorists. For example, a pro-gun person could imagine that the system was creating a database of his guns to match against guns supposedly used by terrorists or criminals. This tactic obviously has no logical weight; it is merely intended as a counter to emotional manipulation by means of an analogy.
A final concern, as with all such gathering of data, is the worry about potential misuses of the information. I would assume that these databases have already been hacked and that the information is now being examined by foreign governments, criminals, and terrorists. Because of this, we should consider the consequences of maintaining or expanding the program: whatever ends up in our databases inevitably ends up around the world. There are also concerns that the data would be made available to the private sector for use in advertising, political campaigning, and other purposes. This concern is not unique to the “facebase,” but it is still a matter of concern.
In closing, that bad DMV photo might prove to be a blessing or a curse. On the positive side, it might be so bad that the police will not be able to match you should you commit a crime. On the negative side, that bad photo might get you matched up often and thus subject to friendly inquiries from the police. But, you might make new friends or get to see how a taser works.