“We are facing an increasing danger of a potential arms race”

Stephanie Liechtenstein
Interviews 25 March 2019

In this interview, Dr. Ulrike Franke explains the danger behind lethal autonomous weapons systems, or so-called ‘killer robots’, and warns of a potential arms race.

The interview was conducted on the margins of the conference on “Capturing Technology: Rethinking Arms Control”, hosted by the German Foreign Office in Berlin on 15 March.

Franke is a policy fellow at the European Council on Foreign Relations (ECFR) and part of ECFR’s New European Security Initiative. Her areas of focus include German and European security and defence, the future of warfare, and the impact of new technologies such as drones and artificial intelligence. She has published widely on these topics.

 

Why do you think that Germany took the initiative to organise this high-level expert conference on new technologies and arms control?

I think that Germany wants to send a signal that arms control is an important topic, particularly in light of the U.S. withdrawal from arms control agreements and a worldwide, increasing focus on military build-up. I am particularly interested in discussing the specific challenges with regard to new technologies.

You are an expert on drones and lethal autonomous weapons systems. Can you briefly explain these types of weapons?

Drones are unmanned systems that can be remotely piloted, fly along a pre-programmed path, or operate autonomously. There is an important debate going on about the exact definition of lethal autonomous weapons systems, or ‘killer robots’, as they are sometimes called. I would say that these are weapons able to carry out the full targeting cycle in a military operation without human intervention.

They are systems equipped with sensors, which might rely on artificial intelligence to make decisions quickly and without human involvement. Highly automated weapons already exist, for example missile-defence systems that detect an incoming missile and shoot it down without human intervention. Even more autonomous are drone systems currently in development that can select and, at least theoretically, engage targets without human intervention.

Why are these types of weapons so dangerous?

Politically, I am particularly concerned with two dangers associated with lethal autonomous weapons.

First, I am worried about potential “flash wars”. Similar to the financial sector, where we have been observing flash crashes for some time, one can imagine autonomous weapons clashing with each other, misinterpreting the signals they receive, and triggering an escalation that spirals out of control. This could lead to a flash war within a few seconds.

Second, I think we are facing an increasing danger of a potential arms race. If one state develops a particular autonomous weapon system, other states may decide to follow suit because they fear that otherwise they will not be able to defend themselves accordingly. This escalatory logic of an arms race is not only dangerous but also very costly.

But maybe most important are ethical questions about whether we, as a society, are willing to delegate decisions over life and death to machines.

How far are we with regard to actually developing such autonomous weapons?

Fully autonomous, lethal ‘killer robots’ are not yet available on the market to buy. But a lot of research is being done, most notably in the area of artificial intelligence (AI), which can feed into the development of lethal autonomous weapon systems. Several states are looking into how to use AI, and autonomy, in warfare.

What do you think should be the main points to be included in a potential future international agreement on autonomous weapons?

I think we first have to agree on the elements that we want to avoid or prohibit. We have to find answers to the following questions: Do we want to contain the proliferation of certain systems? Do we need to agree on common rules for their use? Or do we have to prohibit them altogether? A common understanding on these points could help to make progress on an international agreement, for example within the UN framework in Geneva.

 

To learn more about this topic, read our article on the Berlin conference on “Capturing Technology: Rethinking Arms Control”.