Designing for responsibility: five desiderata of military robots
Abstract
Recently, the use of military robots – variously referred to as drones, unmanned aerial vehicles (UAVs), remotely piloted systems, autonomous weapon systems or ‘killer robots’ – has been debated in the media, politics, and academia. Military robots are increasingly automated, which means that they can perform tasks with less human involvement. On the one hand, this may lead to faster and better outcomes; on the other hand, it raises the question ‘Who is responsible for the (failed) actions of military robots?’. The issue becomes particularly pressing given the prospect of a future in which armies may deploy military robots that apply lethal force without human interference. In this abstract, we approach the responsibility question from an engineering perspective and suggest that a solution lies in the design of military robots. First, we distinguish between legal and moral responsibility. Legally, the person or organization deploying military robots – here, the army – is responsible for their behavior, rather than the designer, programmer, manufacturer or the robot itself. The army’s legal responsibility, however, does not imply that it is in a position to take moral responsibility. In line with the Value Sensitive Design approach, we argue that the way a technology is designed affects moral responsibility. For instance, most people will agree that, in principle, the person firing a gun, and not the manufacturer or the gun itself, should be held responsible for the consequences of a shot. In this case, the gun’s design supports moral responsibility. Acting responsibly is harder, however, when you rely on a decision support system that is incomprehensible, or when you have to use a weapon that may fire accidentally. In these examples, the system’s design hinders moral responsibility. A gap between moral and legal responsibility is undesirable. We therefore argue that military robots should be designed such that the army is in a position to take moral responsibility for their behavior. In other words, we have to design for responsibility.
Organisation | Hogeschool Rotterdam
Research group | Kenniscentrum Creating 010
Date | 2014-05-21
Type | Conference contribution
Language | English