I don’t know why this took me by surprise. Maybe because I’m not a big sci-fi reader — with exceptions made for the likes of Isaac Asimov and Philip K. Dick and Stanislaw Lem and William Gibson. And Robert Heinlein and Mary Shelley and H.G. Wells and Ray Bradbury. And Margaret Atwood and Neal Stephenson. And how could I have not led with one of my literary heroes, Jorge Luis Borges? But still, you get my point (I think), which is that I don’t usually think in terms of science fiction becoming applied science.
More fool I.
Last week, Christof Heyns, the man burdened with the unenviable title of “United Nations special rapporteur on extrajudicial, summary, or arbitrary executions,” called for a global moratorium on the testing, production, and use of armed robots that can select and kill targets without human command.
They are known as “lethal autonomous robots.” And yes, this is indeed a nightmarish killer-robot movie come marching off the screen and into all too non-virtual reality.
Yet the report of Heyns’s call didn’t even make the front page of America’s “newspaper of record.” Soothingly buried on an inside page of the New York Times, and calmingly including the reassurance that such robots weren’t “yet” in production, it elicited little comment. It seems our alarm systems have been lulled by the use of drones, so conveniently deployed halfway round the world in all sorts of places most Americans can’t even find on a map.
Drones, it’s now clear, are only the warm-up stage. Think of lethal autonomous robots as drones with minds of their own. Just program them and set them loose, secure in the knowledge that nothing can possibly go wrong. No way their electronics will go haywire. No way they’ll become just a little bit too autonomous. With the kind of fail-safe electronics that exist only in android dreams, humans can sleep secure. So long as they’re not the targets.
But wait just a moment: who gets to say who the targets are? Who’s going to program the robots? And according to what criteria? Will they be programmed to search out “suspicious behavior,” as human drone operators do? But then what makes behavior suspicious? The skin color of the person doing the behaving? Anyone with a beard? Anyone moving too fast, or maybe too slow? In too large a group or suspiciously alone? Animal, vegetable, or mineral?
Amnesty International and Human Rights Watch are all over this, leading a new coalition of groups in the Campaign to Stop Killer Robots, officially launched just two weeks ago.
But it seems to me that an important question to ask here is this: Who is going to be raking in the billions on these robots? Who exactly is doing the research and testing, and will presumably get the huge military contracts? Consider this report last year from San Diego public radio station KPBS on who’s profiting from the $12 billion drone industry (yes, you read the last five words correctly — that’s for the years 2005 to 2011). The top three? How could you possibly not guess? Lockheed Martin, Boeing, and Northrop Grumman. It’s enough to make me ashamed of ever having gotten my pilot’s wings.
And then consider the lengthy, detailed report on the military robot market (mind-numbingly referred to as “Military Ground Robot Mobile Platform Systems of Engagement”) prepared by an outfit called WinterGreen Research. Here, in the kind of mangled grammar that seems to accompany lip-smacking anticipation, is a short extract from the press release:
Even as the US presence in Iraq and Afghanistan winds down, automated process implemented as mobile platform systems of engagement are being used to fight terrorists and protect human life. These robots are a new core technology in which all governments must invest. Military ground robot market growth comes from the device marketing experts inventing a new role as technology poised to be effective at the forefront of fighting terrorism. Markets at $4.5 billion in 2013 reach $12.0 billion by 2019. Growth is based on the adoption of automated process by military organizations worldwide.
I disagree that defense contractor greed is the main force driving the change to UAVs, although they certainly don’t want to be left out of the new market.
Unmanned systems are being driven by a very compelling dynamic – they allow the “War on Terror” to proceed even as our troop strength is drawn down and finances become more constrained. No need for tough decisions about containing medical care costs for the Armed Forces. No need for procurement reform to prevent overpriced, overly complex manned systems like the F-22 or F-35. Congress can cut defense procurement overall, while still specifying that pet programs and bases be kept open over the DoD planners’ objections.
Consider the 2012 allocations in http://comptroller.defense.gov/defbudget/fy2012/FY2012_Weapons.pdf, for example. That document shows $2.9B for the V-22 Osprey, which has had numerous safety issues during development, and $9.5B for the F-35. These are much higher “average sales price” items than the UAVs; the 2012 allocations for those two programs alone come to $12.4B, roughly the $12B projected for the entire UAV market in 2019. The “profit motive” of defense companies would dictate more of these high-priced, high-margin systems, not the lower-priced UAVs, whose new technologies and smaller up-front investments make it possible for new competitors to enter the market.
No, I don’t think defense contractor greed is driving the change, as you imply. It’s Congress, which can avoid making policy changes on national defense, can avoid reforming defense procurement, can avoid making budget changes for sustainable medical care for our troops, and can still rely on the President to order drone killings to advance the War on Terror. Members can claim to their constituents that they are being kept safe and that the troops are being supported, while doing nothing in legislation to actually help the troops or make us any safer.
As H.L. Mencken put it: “If a politician found he had cannibals among his constituents, he would promise them missionaries for dinner.”
In this case, every UAV approved by Congress is that much money saved on equipment and personnel costs, so members can continue to serve their constituents the “payola” of bases and production lines not wanted by the Armed Forces. They can prosecute war through drones without any of the policy brakes that come with body bags and wounded warriors. Their voters are happy, because American “greatness” is projected globally. Meanwhile, all the downsides of war have been “outsourced” to Pakistani, Yemeni, and Afghan civilians.
Thanks John — you argue your points so well that I agree with your disagreement. When it comes to political decision-making, any remnant of rationality seems to go out the window as soon as the word ‘terrorism’ is uttered. We’re still stuck in the George W. Bush era.
Meanwhile, I’m struck by how little comment there’s been (here and elsewhere) on this issue. Heyns issues a wake-up call, and nearly everyone hits the snooze button. It’s as though we can’t quite grasp what autonomous drones are (in fact most of us don’t really grasp what the guided ones currently in use are). Either that, or we just don’t care so long as they don’t turn on us. Which of course, one way or another, they will.