“Social Loafing” in Human-Robot Teams

Does social loafing, defined as reduced individual effort on a task performed in a team compared to a task performed alone, occur in human–robot teams?

Researchers from the Technical University of Berlin explored that question in a recent study. Here’s the introduction to their research paper published in October 2023 in Frontiers in Robotics and AI:

Thanks to technological advances, robots are now being used for a wide range of tasks in the workplace. They are often introduced as team partners to assist workers. This teaming is typically associated with positive effects on work performance and outcomes. However, little is known about whether typical performance-reducing effects that occur in human teams also occur in human–robot teams. For example, it is not clear whether social loafing, defined as reduced individual effort on a task performed in a team compared to a task performed alone, can also occur in human–robot teams.

According to the paper, the team “investigated this question in an experimental study in which participants worked on an industrial defect inspection task that required them to search for manufacturing defects on circuit boards. One group of participants worked on the task alone, while the other group worked with a robot team partner, receiving boards that had already been inspected by the robot.”

What did the research reveal? 

Participants in both groups inspected almost the entire board surface, took their time searching, and rated their subjective effort as high. However, participants working in a team with the robot found on average 3.3 defects, while people working alone found significantly more: an average of 4.2.

This suggests that participants may have searched the boards less attentively when working with a robot team partner.

This “social loafing” phenomenon has been observed in human teams too, so it’s not surprising to see it happen in human–robot teams. If you know that someone (whether a human or a robot) has already reviewed or inspected something, you’re probably going to assume it was done correctly and put in less effort than if you were solely responsible for the job.

For me, this all boils down to trust.

Some people don’t trust technology — that is, they don’t believe it can produce better results than the “way we’ve always done it” approach, so they cling to their Excel spreadsheets and traditional ways of working. Lack of trust is one of the current challenges with AI and the push toward a “fully autonomous” supply chain.

Other people trust technology too much, a problem I wrote about in “Is Software Making Us Dumb?” and “Trust Yourself More Than a Computer.” Simply put, they blindly follow whatever the technology does or tells them to do, even if their eyes, ears, and years of experience tell them otherwise.

Both of these extremes are a problem.

Therefore, it’s important for companies to recognize how trust in technology, whether too much or too little, can affect the success of an implementation and the outcomes expected from it.

Maybe the Russian proverb made famous by Ronald Reagan applies here too: When it comes to robots, AI, and other technologies, you should “Trust, but verify.”