A Philosopher's Blog

Facebase

Posted in Ethics, Philosophy, Politics, Race by Michael LaBossiere on October 28, 2016

While you are most likely not a criminal, it is likely that the police have a digital version of your face on file. This is because most states put driver’s license photos into a database accessible to the police—and, one would assume, the federal government. The system works in conjunction with facial recognition software (like the software Facebook uses to identify people in photos) to identify suspects. For example, if someone robs a convenience store and the police do not recognize them, the image from the surveillance camera can be matched against the database that could contain your face. Ideally, the software would generate a short list that includes the perpetrator. Problematically, it could instead generate a list of innocent people who might then end up in unpleasant interactions with the state.
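
To see how such a shortlist might be generated, here is a minimal sketch of an embedding-based matcher. Everything in it (the random “embeddings”, the database size, and the similarity threshold) is an assumption for illustration; real systems use trained face-recognition models and calibrated thresholds.

```python
import numpy as np

# Hypothetical setup: each enrolled license photo has been reduced to a
# fixed-length feature vector ("embedding") by some face-recognition model.
# Random vectors stand in for real embeddings here.
rng = np.random.default_rng(0)
database = rng.normal(size=(100_000, 128))            # 100,000 enrolled faces
database /= np.linalg.norm(database, axis=1, keepdims=True)

probe = rng.normal(size=128)                          # the surveillance image
probe /= np.linalg.norm(probe)

# Cosine similarity between the probe and every enrolled face.
scores = database @ probe

# Anyone above an (assumed) similarity threshold lands on the shortlist,
# whether or not they are actually the perpetrator.
THRESHOLD = 0.35
shortlist = np.flatnonzero(scores >= THRESHOLD)
print(f"{len(shortlist)} candidates exceed the threshold")
```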

There are, of course, some practical issues with the current technology. One is that the photos the police have of suspects tend to be of low quality, which makes false matches more likely. Another is that a database this large will contain many people who look alike, so the chance of a false match remains high even with good photos. And, as anyone familiar with the DMV knows, driver’s license photos vary greatly in quality and consistency, which compounds the problem.
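
A rough back-of-the-envelope calculation shows why scale alone drives false matches. All the numbers below are illustrative assumptions, not measurements of any real system:

```python
# Illustrative base-rate arithmetic (all numbers are assumptions).
database_size = 100_000_000   # faces enrolled, e.g. most licensed drivers
false_match_rate = 0.001      # assume the matcher wrongly flags 0.1% of non-matches

# Searching one surveillance image against the whole database:
expected_false_matches = database_size * false_match_rate
print(f"Expected innocent matches per search: {expected_false_matches:,.0f}")
# Even a seemingly tiny error rate yields ~100,000 innocent candidates,
# which is why a raw match cannot, by itself, identify a perpetrator.
```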

The current software also has problems with people who have darker skin, thus making false matches more likely for people of color than white people. While some might suspect racism or bias at work, it has been claimed that this occurs because images of darker skin offer less contrast than images of lighter skin, making accurate matches more difficult. If this technical issue cannot be solved, then it is almost certain that there will be charges of racism and bias as more darker-skinned people are subjected to false matches than lighter-skinned people. Even if this is purely a technical issue with no actual bias, it would certainly create the impression of bias and feed into the view that policing is biased in America. It also raises a moral concern about the use of such software in terms of its consequences: while it might have the benefit of assisting the police in finding actual criminals, it could have the harm of fanning the existing flames of mistrust and worry about police bias against people of color. These factors would need to be balanced against each other, at least until the recognition disparity is solved.

In addition to specific concerns about the recognition of darker skinned people, there is the general concern about the accuracy of the software in identifying people. Since most people with driver’s licenses will be in the database, innocent people will end up being investigated by the police because the software pegs them as adequately resembling a suspect. While most interactions with the police would presumably be quick and harmless, interactions with the state can go very badly indeed—even for innocent people. As such, due moral consideration should be paid to this fact.

There are, of course, the usual concerns about privacy and intrusion of the state. While some citizens are terrified of the idea of a national database of guns, what is being constructed is an even more invasive database—a database of our faces. A “facebase”, if you will. As such, those who are dedicated to Second Amendment rights should be worried about this “facebase.” Others who are concerned about privacy and the overreach of big government should also be worried and insist that proper controls and limitations are in place to protect the rights of citizens.

It could be countered that people with nothing to hide have nothing to fear—but this slogan fails to address the legitimate concerns about privacy. After all, no one who is worried about a national database of guns would be content with being told that if they have their guns legally, then they have nothing to fear from such a database.

A better counter is to appeal to the positive consequences. That is, by giving up privacy rights and becoming part of a “perpetual lineup” we will be safer from criminals and terrorists. This argument does have considerable appeal—but it must be assessed properly in terms of what the approach yields in benefits and what it costs in terms of intrusion and other harms. Americans have, in general, been far too quick to give away real rights and suffer real harms in return for the illusion of safety. We should stop doing this. One useful approach would be to imagine that what is being given up is a right a person has a strong emotional attachment to—this would help offset the emotional appeal to fear of criminals and terrorists. For example, a pro-gun person could imagine that the system was creating a database of his guns to match against guns supposedly used by terrorists or criminals. This tactic obviously has no logical weight—it is merely intended as a counter to emotional manipulation by means of an analogy.

A final concern, as with all such gathering of data, is the worry about potential misuses of the information. I would assume that these databases have already been hacked and the information is now being examined by foreign governments, criminals and terrorists. Because of this, we should consider the consequences of maintaining or expanding the program. After all, whatever ends up in our databases inevitably ends up around the world. There are also concerns that the data would be made available to the private sector for use in advertising, political campaigning and other purposes. This worry is not unique to the “facebase”, but it is still a matter of concern.

In closing, that bad DMV photo might prove to be a blessing or a curse. On the positive side, it might be so bad that the police will not be able to match you should you commit a crime. On the negative side, that bad photo might get you matched often and thus subjected to friendly inquiries from the police. But you might make new friends or get to see how a taser works.

 



7 Responses


  1. TJB said, on October 28, 2016 at 4:10 pm

    The current software also has problems with people who have darker skin, thus making false matches more likely for people of color than white people.

    And also more room for “reasonable doubt” if things go to trial.

    • wtp said, on October 30, 2016 at 1:28 pm

      The Gell-Mann effect at work here. Having worked in the biometrics identification business, I have some tangential experience with facial recognition and such. Suffice to say, there are many different approaches to identification, including differences in how facial recognition is processed, so to speak of “the current software” as if there is only one way to do it is, as is typical of Mike’s posts, very narrow-minded. Some approaches would actually work to the advantage of darker-skinned people. One that I attended a demo of used a handful of specific points (tip of the nose, center of the pupils, tips of the ears, corners of the mouth, etc.) and computed the ratios between the distances separating them. When matches came up in the test database, to the human eye you would not have guessed that there was any relationship between the pictures chosen, since other features such as hair, skin tone, etc. distracted the eye from what the software computed. As used in the field, biometrics are, when possible (which is much of the time), mated with other information such as sex, weight, height, etc., both to shorten the biometric processing time, which unlike on CSI and such is considerable, and to confirm or exclude suspects. And this is not even considering other evidence of location, circumstance, motive, etc. A match on a single biometric search, be it true or false, is not the only thing a prosecutor would take to court. This is just another of Mike’s straw men. (A sketch of this landmark-ratio idea follows below.)

      BTW TJ, I was thinking about these straw man arguments. You do understand that, by their very nature, they are a tell on the bias of the individual presenting them? They present a conclusion defined before the investigation is conducted. Logically speaking, Mike’s posts do far worse damage, to the facts and to logical conclusions, than any use, or even misuse, of biometrics is likely to do.
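
      To make the landmark-ratio idea concrete, here is a toy sketch. The five points, their coordinates, and the matching tolerance are all invented for illustration; fielded systems use more robust features and calibrated thresholds.

      ```python
      import itertools
      import numpy as np

      def ratio_signature(landmarks: np.ndarray) -> np.ndarray:
          """Pairwise distances between facial landmarks, normalized by the
          largest distance so the signature ignores image scale."""
          dists = np.array([np.linalg.norm(a - b)
                            for a, b in itertools.combinations(landmarks, 2)])
          return dists / dists.max()

      # Five invented landmarks (x, y in pixels): nose tip, both pupils,
      # both mouth corners.
      enrolled_photo = np.array([[50, 60], [35, 40], [65, 40], [40, 80], [60, 80]],
                                dtype=float)
      # The same face photographed larger and shifted: all pairwise distances
      # scale by 1.6, so the normalized ratios are unchanged.
      probe_photo = enrolled_photo * 1.6 + np.array([12.0, 7.0])

      difference = np.abs(ratio_signature(enrolled_photo)
                          - ratio_signature(probe_photo)).max()
      print(f"Max ratio difference: {difference:.2e}")  # ~0: the ratios match
      ```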

      • TJB said, on October 30, 2016 at 3:38 pm

        Agreed. More like theology than philosophy.

  2. ajmacdonaldjr said, on October 28, 2016 at 4:49 pm

    Facial recognition is only one aspect of the database. Everything we do is collected and mapped out in a timeline. That’s what the NSA has been doing for the past 15 years. NSA Whistle-Blower Tells All: The Program | Op-Docs | The New York Times https://youtu.be/r9-3K3rkPRE

  3. nailheadtom said, on October 31, 2016 at 11:07 am

    The facial recognition process is another example of “scientism”, the current belief that technology is capable of infallibility. As used in law enforcement, even fingerprints aren’t a positive method of identification. DNA identification has proven to be a joke. Breathalyzers aren’t a reliable tool.

    What’s interesting is the obsession with personal identification. Why is that? Why does government have to know as much as possible about each of its subjects? The government wants to know who you are and what you’ve done but has no obligation to provide similar information to you. The citizenry doesn’t know who the members of the secret police are or where they live, nor the same about any other powerful figure. The government is allowed to keep secret whatever it wishes. Isn’t that just a little problematical?

    • Michael LaBossiere said, on October 31, 2016 at 6:41 pm

      The claim is that tracking us will make us safer (give up liberty to gain security). While I do see the value in using technology to improve safety, we should always be critical and consider whether the safety gained is real and if it is worth the cost to our rights.

      • nailheadtom said, on November 2, 2016 at 7:38 pm

        The safest people on the planet are in solitary confinement. How interesting and enjoyable would a life with zero risk be? Actually, it would be similar to that of a beef feeder steer. Completely safe until reaching 1000 succulent pounds.

