A Philosopher's Blog

Heavy Weapons, Brain Injuries & Ethics

Posted in Ethics, Philosophy by Michael LaBossiere on April 12, 2017

Some years ago, I was firing my .357 magnum at an indoor range. This powerful pistol made a satisfying “bang” and hurled a piece of metal at lethal speed towards the paper target. Then there was a much louder noise and I felt a “whuummmp” vibrating my ribcage. My friend Ron was firing his .44 magnum nearby, close enough for me to feel the shockwave from the weapon.

While the .44 magnum is a powerful handgun (just ask Dirty Harry), it is a mere peashooter compared to a weapon like the Carl-Gustav M3, a shoulder-fired heavy infantry weapon. When fired, this weapon generates a strong shockwave that might be causing brain injuries to its operators. While a proper scientific study has not been conducted on the effects of operating such weapons, it makes sense that they could cause such injuries. After all, the shockwave from the weapon is certainly analogous to that produced by other explosions, such as the IEDs that have caused terrible injuries. While IEDs certainly inflict wounds via shrapnel and the explosive burst, their shockwaves can also inflict brain damage without otherwise leaving a mark on the target.

The United States military had been gathering data using small blast gauges worn by soldiers. However, the use of the gauges was discontinued when it was claimed they could not consistently indicate when a soldier had been close enough to an explosion to suffer a concussion or mild traumatic brain injury. These gauges did, however, provide a wealth of information—including data that showed infantry operating heavy weapons were being repeatedly exposed to potentially dangerous levels of overpressure. Because such data could be used to link such exposure to long term health issues in soldiers, it might be suspected that the Pentagon stopped collecting data to avoid having to accept fiscal responsibility for such harms. This can, obviously enough, be seen as analogous to the NFL’s approach to concussions. This leads to some clear moral concerns about monitoring the exposure of operators and the use of heavy infantry weapons.

While it might seem awful, a moral argument can be made for not gathering data on soldiers operating heavy weapons. As noted above, if it were shown that exposure to the overpressure of such weapons causes brain injuries, then the state would incur the expenses of caring for and compensating the injured. Without such data, the state can maintain that there is no proof of a connection and thus avoid such expenses. From a utilitarian standpoint, if the financial savings outweighed the harms done to the soldiers, then this would be the right thing to do. However, intentionally evading responsibility for harm does seem morally problematic, at best. It can also be countered that the benefits of being aware of the damage being done outweigh the benefits of intentional ignorance. One obvious benefit is that such data could help mitigate or eliminate the damage, and this seems morally superior to evasion by willful ignorance.

While there do seem to be steps that could be taken to minimize the damage done to troops operating heavy weapons (assuming there is such damage), it is likely that such damage cannot be avoided altogether. That is, there will always be some risk to the operators and those nearby. One technological solution would be to operate heavy weapons remotely, thus allowing the operator to be out of the damage zone. Another would be to automate such weapons, taking humans out of the danger zone entirely. Either of these options would increase the cost of the weapon system and would thus require weighing the financial cost against the wellbeing of soldiers. Fortunately, many of those who are fiscal conservatives when it comes to human wellbeing are fiscal liberals when it comes to corporate profits, so one way to sell the idea is to ensure that it would be profitable to corporations. There is also a moral argument that can be made for using the weapons as they are, even if they are harmful to the operators. It is to this that I now turn.

From a utilitarian standpoint, the ethics of exposing operators to damage from their own weapons would be a matter of weighing the harm done to the operators against the benefits of using such heavy weapons in combat. Infantry-operated heavy weapons do seem to be very useful in combat. One obvious benefit is that they allow infantry to engage vehicles, such as tanks and aircraft, with a reasonable chance of success. Taking on a tank or aircraft with light weapons generally does not turn out in the infantry’s favor. As such, if the choice is between risking some overpressure damage and facing a much greater risk of being killed by enemy vehicles, then the choice is obvious. Thus, if the effectiveness of the weapon against the enemy adequately outweighs the risk to the operator, then it would be morally acceptable for the operators to take that risk. There is, however, still the question of the damage suffered during practice with the weapons.

The obvious way to argue that it is acceptable for troops to risk injury when training with heavy weapons is that they will need this practice to use the weapon effectively in combat. If they were to try to operate a heavy weapon without live practice, they would be far less likely to be effective and thus more likely to fail and be injured or killed by the enemy (or their own weapon). As such, the harm of going into battle without proper training morally outweighs the harm suffered by the operators in learning the weapon. This, of course, assumes that they are likely to end up in battle. If the training risks are taken and the training is never used, then the injury would have been for nothing—which takes this into the realm of considering odds in the context of ethics. One approach would be to scale training based on the likelihood of combat, scaling up if action is anticipated and keeping it at a minimal level when action is unlikely.

Making rational choices about the risks does, obviously enough, require knowing the risks. As such, there must be a proper study done of the risks of operating such weapons. Otherwise the moral and practical calculations would be essentially guessing, which is morally unacceptable.
