Who Pays When Drones Crash?


Henry H. Perritt, Jr.

In his 1942 short story, “Runaround,” Isaac Asimov set forth three “laws” for robots:

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Later, Asimov introduced a more basic law, sometimes numbered fourth, sometimes zeroth, which outranked the others:

  0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

Asimov was addressing the terms under which robots might participate in everyday life. His purpose was to entertain by imagining a future that did not exist.

Now, Asimov’s future exists: small robots colloquially called “drones” buzz about by the hundreds of thousands, and automobile manufacturers jockey to be first to market with a driverless car. American railroads have reluctantly committed $7 billion to automate the control of trains.

These automated systems have caused few accidents to date, but more will come. When they do, the courts will have to sort out who pays the cost of the accident. The law has worked out a detailed set of doctrines to adjudicate products liability, and its basic outlines are clear: an actor is liable for injuries caused by its defective products. No longer is an accident likely to result from the separation of a massive connecting rod on a 300-ton steam locomotive. Now, an accident is more likely to occur because of a glitch in the execution of computer code on an integrated circuit chip the size of a fingernail. Design defects are less likely to involve the collapse of a bridge, and more likely to involve the flyaway of a two-pound drone into parts unknown as it escapes its master’s control.

This article argues that lawyers, policymakers, and entrepreneurs must readjust their thinking to focus on new ways in which errant technology causes injury. The functioning of new technologies in the real world is inherently uncertain and unpredictable. Post-sale technical support plays an important role in ensuring safe operation. Vendors should face the consequences of inadequate technical support, because their products cannot be operated safely without continuing and competent support.

The article recognizes that a policy bargain must be struck: the law should get out of the way so that society gains the benefit of new robots without waiting for regulators to guess at the future, but it should also hold designers and vendors accountable for supporting new uses and covering the cost of accidents. Sale does not relieve manufacturers and distributors of their legal duty.

April 22, 2017 · PDF Articles, Volume 21