Imagine you’re in a cafeteria, finishing up a bag of chips and chatting with some friends. You’re beginning to think about getting up to throw away your wrapper, when—suddenly—the nearest trash barrel approaches you instead. It rolls back and forth, and wiggles briefly. It is, it seems, at your service.

How do you respond?

This situation may seem a bit low-stakes. But to a group of scientists at Stanford’s Center for Design Research, it’s an accurate representation of a very possible future. That’s why they’ve built themselves a robotic trash barrel, set it loose in a variety of lunchrooms, and run this very scenario over and over again.

“People used to think that artificial intelligences would be these humanlike incarnations, like HAL in 2001,” says David Sirkin, a research associate at the Center. Thus far, though, things haven’t gone that way: “AI nowadays is things like the automatic braking system in your car, or the elevator that knows where to go: these almost invisible pieces of everyday life,” he says. He thinks robotics is headed in the same direction.


And while other experts may be focused on how to make these machines work on their own, Sirkin and his coworkers are more concerned with what will happen when they finally do. “We’re trying to understand how people respond to moving, everyday objects,” says Sirkin.

To that end, the group has fast-forwarded past engineering concerns, focusing instead on achieving a kind of movie magic. They spend much of their time scouting locations and storyboarding possible human-robot scenarios. To bring those scenarios to life, they’ve created a series of what they call “expressive everyday objects”—furnishings and fixtures that move around and elicit responses from humans.


These robots are not complicated—the barrel is “literally a trash can on top of a Roomba,” says Dr. Wendy Ju, an executive director at the Center. They’re not even autonomous robots: they’re controlled by team members who drive their movements from another room, a technique the group, which also includes student collaborators Brian Mok and Stephen Yang, calls “Wizard of Oz-ing.” On Ju’s Vimeo page, you can watch a mechanical ottoman cajole volunteers into propping up their feet, and a set of wiggling drawers “help” participants solve a puzzle. Other experiments have involved a hyperactive sofa and some hesitant automatic doors. It’s as though the lab adopted the whole Beauty and the Beast menagerie at a junk sale and retrained them as sociologists.

The trash barrel has delivered some particularly distinctive insights. First of all, Sirkin and Ju say, it highlights how good people are at subtly refusing to acknowledge interactions they don’t want or need—a behavior the team has dubbed “unteracting.” If the trash barrel approaches a table of people, and they have no trash to give it, they generally won’t shoo it off. They’ll just steadfastly ignore it until it rolls away again. “They’re using their gaze as a tool for deciding when they’re engaging or not,” says Ju. (You can see this about halfway through the video, when a man on a cell phone refuses to look at the barrel until it backs off.)


On the other hand, people who did make use of the barrel felt miffed when it didn’t acknowledge them in return. “People kind of expected it to thank them,” says Sirkin. “They’ll say ‘I fed the robot, and it didn’t thank me, and that was insulting.’” Some would also whistle for it, or dangle trash in front of it enticingly.

To Sirkin, this indicates that people not only place the robot on a different social stratum than they would a human—you wouldn’t expect, say, a busboy to thank you for giving him your dirty dishes—they also see it as an autonomous being with its own set of needs. Specifically, “they feel that it intrinsically desires to consume trash,” Sirkin says.


Some human-robot interactions do bring out the best in people, though. The team will occasionally pilot the robot into a ditch, or get it stuck in a row of chairs, at which point bystanders will rush to its rescue. When it struggles, something melts the hearts of the formerly indifferent or needy humans around it. 

Ju thinks we respond so strongly because the robot “looks like it’s learning,” adjusting to its new intelligence, its expanding world, and its strangely communicative companions. “Apparently we have this inbuilt feeling where we’re supportive of that sort of thing,” Ju says. Hopefully any truly autonomous trash cans will feel similarly, and go easy on us in turn. As our new friend shows us, humans heading towards the robo-future also have a lot to learn.