
Darn You, R2! When Do We Blame Robots?

[Photo of a robot. Photo credit: iwouldificould. Shared under a Creative Commons license.]

For Immediate Release

A recent study from North Carolina State University finds that people are likely to blame robots for workplace accidents, but only if they believe the robots are autonomous.

“Robots are an increasingly common feature in the workplace, and it’s important for us to understand how people view robots in that context – including how people view robots when accidents occur at work,” says Doug Gillan, a professor of psychology at NC State and corresponding author of a paper on the work.

To explore this issue, researchers conducted a study in which 164 people were shown several workplace scenarios in which an accident occurred involving both a human and a robot.

When told the human was operating the robot, study participants usually blamed the human for the accident. When told the robot was autonomous, and that the human was only monitoring it, study participants usually blamed the robot.

“The finding is somewhat intuitive, but it addresses a fundamental issue: when do we transfer responsibility for an error from a human to a robot?” Gillan says.

“The study also raises questions about how quickly autonomous robots may be assimilated into the workplace. Do employers want to buy robots that may be more efficient, but can be blamed for errors – making it more difficult to hold human employees accountable? Or do employers want to stick to robots that are viewed solely as tools to be controlled by humans?”

The paper, “Attributing Blame to Robots: The Influence of Robot Autonomy,” is published in the journal Human Factors. The paper was an equal effort among Gillan, Caleb Furlough and Thomas Stokes. Furlough and Stokes are former Ph.D. students at NC State.

-shipman-

Note to Editors: The study abstract follows.

“Attributing Blame to Robots: The Influence of Robot Autonomy”

Authors: Caleb Furlough, Thomas Stokes, and Douglas J. Gillan, North Carolina State University

Published: Oct. 15, Human Factors

DOI: 10.1177/0018720819880641

Abstract:
Objective: The research examined how humans attribute blame to humans, non-autonomous robots, autonomous robots, or environmental factors for scenarios in which errors occur.
Background: When robots and humans serve on teams, human perception of their technological team members can be a critical component of successful cooperation, especially when task completion fails.
Methods: Participants read a set of scenarios that described human-robot team task failures. Separate scenarios were written to emphasize the role of the human, the robot, or environmental factors in producing the task failure. After reading each scenario, the participants allocated blame for the failure among the human, robot, and environmental factors.
Results: In general, the order of amount of blame was humans, robots, and environmental factors. If the scenario described the robot as non-autonomous, the participants attributed almost as little blame to it as to the environmental factors; in contrast, if the scenario described the robot as autonomous, the participants attributed almost as much blame to it as to the human.
Conclusion: We suggest that humans use a hierarchy of blame in which robots are seen as partial social actors, with the degree to which people view them as social actors depending on the degree of autonomy.
Application: The acceptance of robots by human co-workers will be a function of the attribution of blame when errors occur in the workplace. The present research suggests that greater autonomy for the robot will result in greater attribution of blame in work tasks.
Précis: Does robot autonomy affect attribution of blame for errors? Participants read scenarios emphasizing a human, a robot (non-autonomous or autonomous), or environmental factors in producing errors. Participants’ blame for non-autonomous robots was similar to that for environmental factors, but blame for autonomous robots was similar to that for humans.