“We have not let technology run away with our ethics.” So says a senior British Army officer in the UK’s Ministry of Defence when quizzed about drones, the catch-all descriptor for unmanned aerial vehicles. But his comment hints at a nervousness in senior political and military circles about an erosion of public support due to the state’s perceived heavy-handed use of technology.
Drones have been getting a bad press lately. Once hailed as the antidote to messy ground-holding military deployments, they now epitomise that sensitive and contested area where technology and ethics overlap. The Bureau of Investigative Journalism claims drones were responsible for between 282 and 535 civilian deaths in President Obama’s first three years in office. But politicians, wary of a public prone to rapid judgement when flag-draped coffins fill the front pages, and impatient for “quick wins”, as Philip Hammond, the former British Defence Secretary, explained to the Munich Security Conference on February 1st 2014, regularly hail a technology deemed invulnerable and, more importantly, precise.
But there are two problems with claims of precision. First, in a military context it implies infallibility. True, technical sophistication has resulted in a marked increase in the use of so-called precision guided munitions (PGM): in Operation Desert Storm in 1991, 8% of US aerial attacks employed PGMs; in Iraq in 2003 it was 68%. But they do not always land where intended, as much can influence a munition in flight. Weather, faulty technology and human error in the construction of the weapon, weapon platform or targeting equipment can all affect where a payload lands, regardless of how steadily the cross-hairs have been held on a target.
The second problem is the suggestion that the West marks its own homework when it comes to accounting for the employment of lethal technology. The military uses the term Circular Error Probable (CEP) to describe precision, defining it as the radius of a circle around a target within which 50% of weapons should fall. Modern PGMs have very small CEPs, usually less than 13 metres, achieving an accuracy military planners have long desired. But the significant downside to PGMs is the unwritten part of the definition: half the munitions will not land within the CEP, and the margin by which they will miss is unquantifiable. In other words, they can land anywhere and still be described as a ‘precise’ weapon. Critics say this qualification lets politicians and the military off the ethical hook too easily and is a significant factor in the anger felt by those subject to such bombardment.
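The statistical point is easy to demonstrate. A minimal sketch, assuming (purely for illustration) that impact points scatter around the aim point with a circular Gaussian dispersion: the empirical CEP is simply the median radial miss distance, and it says nothing about how far the worst half of the shots stray.

```python
import math
import random

def simulate_cep(sigma_m=10.0, shots=10000, seed=42):
    """Simulate impact points with circular Gaussian dispersion
    (an illustrative assumption, not a weapons model) and return
    the empirical CEP plus the single worst miss."""
    rng = random.Random(seed)
    misses = []
    for _ in range(shots):
        dx = rng.gauss(0.0, sigma_m)  # cross-range error, metres
        dy = rng.gauss(0.0, sigma_m)  # down-range error, metres
        misses.append(math.hypot(dx, dy))
    misses.sort()
    cep = misses[shots // 2]   # half the shots land inside this radius
    worst = misses[-1]         # the CEP definition is silent on the rest
    return cep, worst

cep, worst = simulate_cep()
print(f"CEP: {cep:.1f} m; worst miss: {worst:.1f} m")
```

Under these assumed parameters the CEP comes out close to the sub-13-metre figure quoted above, while individual misses can fall several times further out, which is exactly the “unwritten part of the definition” critics object to.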
However, public anger towards drone strikes is as nothing compared to the outrage generated by the NSA and GCHQ cyber-spying revelations by Edward Snowden, the exiled former NSA contractor wanted for espionage in the US. But the real threat to the US and UK administrations from these allegations of global-scale misuse of technology is that they have united the political left and right: the left because of the affront to civil liberties, the right due to the perceived scant regard for Congressional or Parliamentary oversight. Politicians abhor ceding the agenda, particularly where national security is concerned. Gisela Stuart, a British MP and member of the House of Commons Defence Select Committee, is all for the ethical use of lethal technology – she led a recent Parliamentary debate on drones – but is forthright in her defence of its ultimate purpose: “At the end of the day I don’t want you to give me a hanky to wipe away the blood, I’d prefer you to make sure it’s not there in the first place.” The arms race between ethics and technology looks set to continue for some time yet.