Media reports of developments in so-called robotic weapons systems (a broad category that includes any system involving some degree of pre-programming as well as remote control) are haunted by the question of ‘autonomy’; specifically, the prospect that technologies acting independently of human operators will run ‘out of control’ (a fear addressed by Langdon Winner in his 1977 book Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought). While recognizing the very real dangers posed by increasing resort to on-board, algorithmic encoding of controls in military systems, I want to track the discussion of autonomy with respect to weapons systems a bit more closely. A recent story in the LA Times, noted and under discussion by my colleagues in the International Committee for Robot Arms Control (ICRAC), provides a good starting place.
While I’m going to suggest here that autonomy is something of a red herring in the context of this story, let me be clear at the outset that I believe that we should be deeply concerned about the developments reported. They represent a continuation of the longstanding investment in automation in the (questionable) interest of economy; the dangers of ever-intensified speed in war fighting; the extraordinary inflation of spending on weapons systems at the expense of other social spending (see post Arming Robots); and the threat to global security of the already existing infrastructure of networked warfare. With that said, I want to question the framing of the developments reported in this article as the beginning of something new, unprecedented and (as often goes along with these adjectives) inevitable, centering on the question of autonomy.
The article reports on the X-47B drone, a demonstration aircraft currently being tested by the Navy at a cost of $813 million.
“The X-47B drone, above, marks a paradigm shift in warfare, one that is likely to have far-reaching consequences. With the drone’s ability to be flown autonomously by onboard computers, it could usher in an era when death and destruction can be dealt by machines operating semi-independently.” (Chad Slattery, Northrop Grumman / January 25, 2012)
A major technical requirement for this plane is that it should be able to land under on-board control on the deck of an aircraft carrier, “one of aviation’s most difficult maneuvers.” In this respect, the X-47B is a next logical step in an ongoing process of automation, of the replacement of labour with capital equipment, through the delegation of actions previously done by skillful humans to machines. The familiarity of the story in this respect raises the question: what exactly is the “paradigm shift” here? And what are the stakes in the assertion that there is one? The author observes:
“With the drone’s ability to be flown autonomously by onboard computers, it could usher in an era when death and destruction can be dealt by machines operating semi-independently.”
Most commercial aircraft, as well as existing drones, can be put under ‘auto pilot’ controls, and are always operating ‘semi-independently.’ And the U.S. drone campaign is already dealing death and destruction.
“Although humans would program an autonomous drone’s flight plan and could override its decisions, the prospect of heavily armed aircraft screaming through the skies without direct human control is unnerving to many.”
Aren’t populations in Pakistan, Afghanistan, Yemen and other areas that are the target of U.S. drones already unnerved by heavily armed aircraft screaming through the skies? And to what extent has ‘direct human control’ over existing drone systems ensured that civilians won’t be killed, whether as a consequence of mistaken targeting, or what seems to be accepted within military procedure as unavoidable ‘collateral’ damage?
“‘The deployment of such systems would reflect … a major qualitative change in the conduct of hostilities,’ committee [of the International Red Cross] President Jakob Kellenberger said at a recent conference. ‘The capacity to discriminate, as required by [international humanitarian law], will depend entirely on the quality and variety of sensors and programming employed within the system.’”
It is clear that the ‘capacity to discriminate’ is already based on complex networks of sensors and code, and the history of the use of armed drones includes recurring examples of misrecognition of targets, extra-judicial killing, and a range of other violations of international law.
“Weapons specialists in the military and Congress acknowledge that policymakers must deal with these ethical questions long before these lethal autonomous drones go into active service, which may be a decade or more away.”
These questions – not only ethical but also moral and legal – must equally have been dealt with before lethal remotely-controlled drones went into active service. Which means that the latter are, in their current use, unethical, immoral and illegal.
“More aggressive robotry development could lead to deploying far fewer U.S. military personnel to other countries, achieving greater national security at a much lower cost and most importantly, greatly reduced casualties,” aerospace pioneer Simon Ramo, who helped develop the intercontinental ballistic missile, wrote in his new book, “Let Robots Do the Dying.”
The promise of lower cost rings hollow in the context of a defense budget that continues to grow, and the prediction that annual global spending on drones will double to $11.5 billion in the next few years (reported by the New Internationalist in their December 2011 issue). But ‘most importantly,’ as Ramo puts it, the ‘reduction in casualties’ refers only to ‘our’ side, and it is not only robots that are dying.
The Air Force says in the Unmanned Aircraft Systems Flight Plan 2009-2047 that “it’s only a matter of time before drones have the capability to make life-or-death decisions as they circle the battlefield.” What’s missing from this projection (we should be suspicious whenever we hear ‘it’s only a matter of time’) are the unresolved problems of decision-making that plague already existing armed drone systems. The focus on the future ignores the already unacceptable present. And the focus on autonomy as the threat directs our attention away from the autonomous arms-industry-out-of-control, of which the X-47B is a symptom.