A Philosopher's Blog

The Ethics of Stockpiling Vulnerabilities

Posted in Business, Ethics, Philosophy, Politics, Technology by Michael LaBossiere on May 17, 2017

In May of 2017 the WannaCry ransomware swept across the world, affecting thousands of computers. The attack hit hospitals, businesses, and universities, and the damage has yet to be fully calculated. While any such large-scale attack is a matter of concern, the WannaCry incident is especially interesting because the exploit at its foundation was stolen from the United States National Security Agency (NSA). This raises an important moral issue, namely whether states should stockpile knowledge of software vulnerabilities and the software to exploit them.

A stock argument for states maintaining such stockpiles is the same as the argument used to justify stockpiling weapons such as tanks and aircraft. The general idea is that such stockpiles are needed for national security: to protect and advance the interests of the state. In the case of exploiting vulnerabilities for spying, the security argument can be tweaked a bit by drawing an analogy to other methods of spying. As should be evident, to the degree that states have the right to stockpile physical weapons and engage in spying for their security, they also would seem to have the right to stockpile software weapons and knowledge of vulnerabilities.

The obvious moral counter argument can be built on utilitarian grounds: the harm done when such software and information is stolen and distributed exceeds the benefits accrued by states having such software and information. The WannaCry incident serves as an excellent example of this. While the NSA might have had a brief period of advantage when it had exclusive ownership of the software and information, the damage the ransomware did to the world certainly exceeds this small, temporary advantage. Given the large-scale damage that can be done, it seems likely that the harm caused by stolen software and information will generally exceed the benefits to states. As such, stockpiling such software and knowledge of vulnerabilities is morally wrong.

This can be countered by arguing that states just need to secure their weaponized software and information. Just as a state is morally obligated to ensure that no one steals its missiles to use in criminal or terrorist endeavors, a state is obligated to ensure that its software and vulnerability information is not stolen. If a state can do this, then it would be just as morally acceptable for a state to have these cyberweapons as it would be for it to have conventional weapons.

The easy and obvious reply to this counter is to point out that there are relevant differences between conventional weapons and cyberweapons that make it very difficult to properly secure the latter from unauthorized use. One difference is that stealing software and information is generally much easier and safer than stealing traditional weapons. For example, a hacker can get into the NSA from anywhere in the world, but a person who wanted to steal a missile would typically need to break into and out of a military base. As such, securing cyberweapons can be more difficult than securing other weapons. Another difference is that almost everyone in the world has access to the deployment system for software weapons—a device connected to the internet. In contrast, someone who stole, for example, a missile would also need a launching platform. A third difference is that software weapons are generally easier to use than traditional weapons. Because of these factors, cyberweapons are far harder to secure, and this makes their stockpiling very risky. As such, the potential for serious harm combined with the difficulty of securing such weapons would seem to make them morally unacceptable.

But suppose that such weapons and vulnerability information could be securely stored—this would seem to answer the counter. However, it only addresses the stockpiling of weaponized software and does not justify stockpiling vulnerabilities. While adequate storage would prevent the theft of the software and the acquisition of vulnerability information from the secure storage, the vulnerability itself would remain to be exploited by others. While a state that has such vulnerability information would not be directly responsible for others finding the vulnerabilities, the state would still be responsible for knowingly allowing the vulnerability to remain, thus potentially putting the rest of the world at risk. In the case of serious vulnerabilities, the potential harm of allowing such vulnerabilities to remain unfixed would seem to exceed the advantages a state would gain in keeping the information to itself. As such, states should not stockpile knowledge of such critical vulnerabilities, but should inform the relevant companies.

The interconnected web of computers that forms the nervous system of the modern world is far too important to everyone to put it at risk for the relatively minor and short-term gains that could be had by states creating malware and stockpiling vulnerabilities. I would use an obvious analogy to the environment; but people are all too willing to inflict massive environmental damage for relatively small short-term gains. This, of course, suggests that the people running states might prove as wicked and unwise regarding the virtual environment as they are regarding the physical environment.


Smart Classrooms & Infections

Posted in Technology by Michael LaBossiere on January 31, 2011

While I have been integrating technology into my classes since graduate school (my first creation was a SuperCard program that packaged notes and tutorials into a self-contained program), it was only this past fall that I was actually assigned to a smart classroom. Half of my classes are still in a dumb classroom. In fact, it is very dumb: it is a converted band room in the old high school associated with the campus (the high school students are now in a new, much nicer complex).

Using a smart classroom is easy enough: they typically just involve a PC serving as a “hub” for various media devices (VCR, DVD player, etc.) that is also connected to a projector. Most people just use PowerPoint or show web sites via a browser. Of course, some people just like having the rooms and do not even use the “smart” features.

One obvious problem with smart classrooms is that the PCs have to be accessible to all the professors who use the room. So, for example, anyone can plug in a malware-infested USB key or pick up various nasties from web sites. Interestingly enough, the PCs I have seen are lacking in security software, other than the Windows 7 firewall. Not surprisingly, I have noticed that they have problems with malware. Since I do not want to get malware on my well-maintained PCs, I have worked out some strategies for dealing with the fact that the classroom PCs seem to be roughly the equivalent of a public urinal.

One obvious approach is to try to upgrade the security. However, most classroom PCs are password-protected to keep people from installing software (well, in theory anyway). One easy way around this is to use Ophcrack, a free program that can recover the passwords on a Windows machine. With enough time, it would be possible to get the password for the administrator account, log in, and then install security software such as the free Avast antivirus and the excellent free Comodo firewall. Useful free software is also available at Ninite. Of course, the IT folks might frown on such behavior—although they should probably have taken steps to secure the PCs from the get-go. If you don’t have the time to crack the password, one option is to use portable software to clean the PC. While this will not be an optimal solution, it can be better than nothing. PortableApps.com has some basic security programs that can be run without actually being installed. As such, you can run them from a CD or removable drive, or by downloading them to the PC.

A second obvious approach is to keep the files you need on your own website. That way you can simply load a webpage or download the files to the PC without worrying about infections. Obviously, you do not want to log in to a password-protected online file storage service (like SkyDrive) from the PC, since it might have a keylogger installed. However, sites that allow public access would be fine (keep in mind that anyone will be able to get to your files).
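Since publicly hosted files can also be tampered with on the server or in transit, one extra precaution is to record a checksum of each file on your own machine and verify it after downloading on the classroom PC. Here is a minimal sketch; the file name and digest in the usage comment are hypothetical:

```python
import hashlib


def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


def verify_download(data: bytes, expected_hex: str) -> bool:
    """Check a downloaded file's bytes against a known-good digest."""
    return sha256_of(data) == expected_hex.lower()


# Hypothetical usage on the classroom PC, with a digest you noted at home:
# with open("lecture01.ppt", "rb") as f:
#     ok = verify_download(f.read(), "ab12...")  # digest recorded beforehand
```

A mismatch does not tell you *what* changed, only that the file is not the one you published, which is reason enough not to open it.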

There are also some online anti-virus programs, such as Panda ActiveScan, that can be run from a web browser. While an installed security suite or set of programs would be better, an online scan is better than nothing.

Of course, the PCs internet access might be down (or non-existent) or perhaps downloading is not an option. If so, another approach would be needed.

A third approach is to burn a CD with your files on it. Be sure that the disc is “closed” (finalized) so that nothing more can be written to it. On the downside, you’ll have to buy CDs (although they are cheap) and burn a new one whenever your files change. However, this is a rather secure option.

A fourth approach is to get a USB drive that has a hardware write-protect switch. All of my older drives have this, but none of my newer drives do (although there are apparently some software write-protect options). If you are buying a drive for this purpose, be sure to confirm that it has such a switch. This allows you to change or update files as needed, yet be reasonably safe from the perils of the smart classroom PC.
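If you want to confirm that the write-protect switch is actually engaged before trusting it, one quick check is to attempt a throwaway write and see whether the operating system refuses it. This is a minimal sketch; the drive letter in the usage comment is an assumption for illustration:

```python
import os
import uuid


def is_writable(path: str) -> bool:
    """Return True if a file can be created (and removed) under `path`."""
    probe = os.path.join(path, ".write_test_" + uuid.uuid4().hex)
    try:
        with open(probe, "w") as f:
            f.write("probe")
        os.remove(probe)
        return True
    except OSError:  # write-protected, read-only, or permission denied
        return False


# Hypothetical usage on a classroom PC (drive letter is an assumption):
# print(is_writable("E:\\"))  # False if the hardware switch is engaged
```

Note that this only shows the drive refuses writes through the operating system; a hardware switch is still preferable because it cannot be overridden by software on an infected machine.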

As another option, if your smart classroom has a document camera, you can print your class material and use that camera. The only infections you have to worry about then are those you might pick up from touching the mouse.
