Academic journal article Research-Technology Management

The Ethics and Law of Robots

Article excerpt

A driverless car is making its way through a parking lot when a shopping cart rolls into its path from one direction and a baby carriage from another at the same moment. A human driver would likely steer away from the baby carriage and toward the shopping cart to avoid hurting the baby in the stroller; the person recognizes the carriage as an object deserving greater care than the cart. Would an autonomous system make that judgment, or simply detect two objects of roughly the same size and shape?

This is not a question for the future. Google's driverless car has been in development since 2005, and the company plans to market the technology to the automotive industry. In the United States, at least four states (Nevada, Texas, Florida, and California) have approved rules that permit driverless cars under certain conditions. Technologically, experts agree, driverless cars could be available to consumers by 2020, but legal and regulatory frameworks aren't likely to move that fast.

Even today, modern society increasingly relies on autonomous systems to replace human interactions and streamline transactions. Automated stock trading, for instance, uses algorithms to execute pre-programmed "robot" trading instructions; more than half of all equity trading in the EU and the United States is done this way. In May 2010, the Dow Jones Industrial Average suffered its second-largest point swing when a mutual fund company entered an algorithmic trade that triggered a chain reaction among interacting robot traders, producing what became known as the Flash Crash. The 1,000-point plunge recovered within minutes, but it shocked market watchers and prompted a five-month investigation that included analysis of the algorithmic trading at the heart of the crash.

As the role of artificial intelligence in human lives expands, occurrences like this are bound to increase. Questions about the legal and ethical implications of robotic systems are becoming increasingly urgent.

In a recent interview, Guy Fraker, a former State Farm insurance executive and the cofounder and CEO of Autonomous Stuff, used the driverless car to illustrate a simple point about the advancing technology of robotics: the laws that govern liability and insurance are not equipped to address the unique situations arising from the spread of autonomous systems. "The more we task robotics to act on our behalf," Fraker said in a recent issue of Communications of the ACM, "one of the first questions is, 'who is responsible' in the moment of truth.... We don't have an answer for that yet." The problem was highlighted when one of Google's cars was involved in an accident in 2011. Although the autonomous vehicle was not at fault in that instance, the incident raised questions about how a machine could be held legally liable.

As law professor Neil Richards and computer science professor William Smart, both of Washington University in St. Louis, point out in a recent paper, the issue of liability is particularly fraught: "As robots become more autonomous, the question of where liability rests when something goes wrong is complicated. Is it the manufacturer, the programmer, the user (who gave a bad instruction), or some combination of them all?" They suggest that the questions become even more complex with systems that are autonomous some of the time and tele-operated by a human at other times.

As autonomous intelligent systems proliferate, other legal questions arise. Who is responsible for losses arising from automatic stock trading based on algorithms? Who owns the copyright on sports stories written by a computer? What right do people have to know when they're interacting with an automated bot on social media, as opposed to a real person? …
