I haven’t discussed anything about Wizbang Blue in a little while, but I didn’t want the authors over there to think I’ve forgotten about them. In recent days, they’ve put up two postings on a topic near and dear to my heart — advances in military technology — and I want to use their pieces as jumping-off points for my own thoughts on the subjects.
First up, Larkin discussed the latest in military robotics — the SWORDS robot. (In the latest example of tortured acronyms, “SWORDS” stands for “Special Weapons Observation Reconnaissance Detection System.”) SWORDS is the result of strapping weapons onto a robot originally built to find and disarm mines and other explosives.
Larkin recognizes that the SWORDS robot is not the stereotypical “robot” as most people consider it — an autonomous machine that acts independently within the parameters of its programming. Rather, it is a machine that operates by remote control, directly under the guidance of a human being who dictates its every action.
This is a GOOD thing.
The purpose of the military is to carry out the lawful orders of the United States government. Its primary means is the use of force, both threatened and actual. And it is unparalleled in its efficacy in that regard.
The purpose of the military is not to “kill people and break things.” That is the way in which it carries out its purpose. The military does not enjoy doing these things, for the most part; it simply accepts that as part of its duty.
Moreover, war is not about “fairness.” As the late Col. David Hackworth famously said, “If you find yourself in a fair fight, you didn’t plan your mission properly.” If we can fight and kill the enemy without putting our own soldiers at risk, then ABSOLUTELY we should do so. SWORDS is just such a tool, and we would be foolish to refuse it on such grounds.
I have seen one criticism of such tools that is worthy of consideration: that by keeping our troops safer from danger (NOT safe, but safer — currently, the SWORDS’ controller must be within 1,000 meters of the unit), we are, in essence, stripping that operator of some of the protections of the laws of warfare. Many of those laws allow the individual soldier some leeway in his conduct, in recognition of the “heat of battle” and the fact that quick judgments are essential to battlefield survival. If, in retrospect, some of those decisions prove to have been in error, that is understandable, because the individual was in justifiable fear for his life and acted accordingly.
SWORDS reduces the “imminent” aspect of the danger. The controlling soldier can afford to take a few more seconds to evaluate the situation before making the “shoot/no shoot” decision, because all that is at risk is a machine, not a human being.
This is a step in the direction of what Larkin fears — independent, autonomous war machines, roaming the battlefield like the soulless killing monstrosities from the “Terminator” movies, striking without mercy, without restraint, without compunction.
The human element of warfare is something that we must — eventually — confront: how much direct human control must be kept over the weapons of war, and how much can we trust to automation and machine logic?
But SWORDS is a very, very small step in that direction. And those who see it as a step towards the dehumanizing of our military need to take a good, hard look at their own perceptions of that military — and the men and women who will be spared risks by its use.