AC Grayling: What happens when our military machines are not only unmanned but autonomous?

AC Grayling in Prospect:

War, then, has changed in dramatic respects, technologically and, consequently, in character too. But in other fundamental respects it is as it ever was: people killing other people. As Theodor Adorno said, thinking of the development of the spear into the guided missile: “We humans have grown cleverer over time, but not wiser.” Every step of this evolution has raised its own ethical questions, but the next twist in the long story of war could well be autonomous machines killing people—something that may necessitate a more profound rethink than any that has been required before.

As well as posing their own particular ethical problems, past advances in military technology have very often inspired attempts at an ethical solution too. The 1868 Declaration of St Petersburg outlawed newly invented bullets that split apart inside a victim. The 1899 Hague Conference outlawed aerial bombardment even before heavier-than-air flight had become possible—what it had in mind was the throwing of grenades from balloons. After the First World War, chemical weapons were outlawed, and following the Second World War much energy was devoted to attempts at banning or limiting the spread of nuclear weapons. When Bashar al-Assad’s regime gassed its own people in Syria, President Donald Trump responded with airstrikes intended to enforce that prohibition.

So, just as the continuing evolution of the technology of combat is nothing new, neither is the attempt to regulate its grim advance. But such attempts to limit the threatened harm have often proved futile. For throughout history, it is technology that has made the chief difference between winning and losing in war—the spear and the atom bomb both represent deadly inventiveness prompted by emergency and danger. Whoever has possessed the superior technology has tended to prevail, which—if it then falls to the victors to enforce the rules—points to some obvious dilemmas and difficulties.

More here.