Autonomous (self-driving) vehicles are increasingly being tested on highways and city streets. But there is also a need for robots that can navigate environments like sidewalks, buildings, and hallways. In these settings, robots must interact and cooperate with pedestrians in a socially acceptable manner. The “rules of the road” no longer apply – there are no lanes or street signs, and pedestrians don’t use turn signals when cutting through crowds. Even the way people walk changes from city to city.
Our first work presented a collision avoidance algorithm based on deep reinforcement learning (deep RL), in which agents learn to avoid collisions with other dynamic agents by training in a simulated environment. This led to more natural behavior, agents reaching their goals faster, and a demonstration of the algorithm on real robots in our lab. This work was a finalist for the Best Paper Award in Multi-Robot Systems at ICRA 2017.
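To give a flavor of the approach, here is a minimal sketch of value-based action selection for collision avoidance: the agent evaluates candidate velocities by one-step lookahead against a value function and picks the best. In the actual work the value function is a learned deep network; the hand-written `value()` heuristic, the candidate speeds, and all function names below are illustrative placeholders, not the paper's implementation.

```python
import math

def step(pos, vel, dt=0.2):
    """Advance a 2D position by one timestep."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def value(pos, goal, others, collision_radius=0.5):
    """Stand-in for the learned value network: prefer states near the
    goal, and assign very low value to states in collision."""
    if any(dist(pos, o) < collision_radius for o in others):
        return -1e9
    return -dist(pos, goal)

def choose_velocity(pos, goal, others, speeds=(0.0, 0.5, 1.0), n_headings=8):
    """One-step lookahead: pick the candidate velocity whose successor
    state scores highest under the value function."""
    best_v, best_val = (0.0, 0.0), -float("inf")
    for s in speeds:
        for k in range(n_headings):
            theta = 2 * math.pi * k / n_headings
            v = (s * math.cos(theta), s * math.sin(theta))
            val = value(step(pos, v), goal, others)
            if val > best_val:
                best_v, best_val = v, val
    return best_v
```

With an obstacle off to the side, the lookahead selects full speed toward the goal; with an obstacle directly ahead, the collision penalty pushes the choice toward a sidestep or a slower speed.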
A continuation of this work extended the algorithm to learn social norms that pedestrians tend to follow, such as passing on the right side. The contributions included a socially aware collision avoidance algorithm and better handling of multi-agent scenarios. These algorithmic improvements enabled a robot to operate among pedestrians in a public environment (Building 32) for over 20 minutes without human intervention. This work received the Best Student Paper Award and was a finalist for the Best Paper Award in Cognitive Robotics at IROS 2017.
Our most recent paper further improved the handling of large numbers of nearby agents (n > 4) by using a long short-term memory (LSTM) network and by making fewer assumptions about other agents’ behaviors. This work was presented at IROS 2018.
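The key property an LSTM provides here is a fixed-size encoding of a variable-length list of neighbors: each neighbor's observed state is fed in as one timestep, so the final hidden state summarizes any number of agents. The sketch below illustrates only this idea; the cell weights are random placeholders (in the real system they are learned), and the state dimensions and function names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """A bare-bones LSTM cell (input size d_in, hidden size d_h)."""
    def __init__(self, d_in, d_h):
        self.d_h = d_h
        # One stacked weight matrix for the input, forget, cell, and
        # output gates; placeholder random values, not learned weights.
        self.W = rng.normal(0.0, 0.1, (4 * d_h, d_in + d_h))
        self.b = np.zeros(4 * d_h)

    def __call__(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        return h, c

def encode_neighbors(ego, neighbors, cell):
    """Feed each neighbor's state as one LSTM timestep, farthest first,
    so the closest (most relevant) agent has the strongest effect on
    the final hidden state. Each state is [x, y, vx, vy]."""
    h = np.zeros(cell.d_h)
    c = np.zeros(cell.d_h)
    ordered = sorted(neighbors, key=lambda s: -np.linalg.norm(s[:2] - ego[:2]))
    for s in ordered:
        h, c = cell(s, h, c)
    return h  # fixed-size encoding, independent of len(neighbors)
```

Because the encoding has the same shape whether one neighbor or ten are present, it can be concatenated with the ego agent's state and passed to a policy or value network without padding or a hard cap on the number of agents.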
The robot is often tested in MIT’s Building 32 during peak lunchtime traffic. Potential applications of this technology include delivery of goods, mobility-on-demand (MOD) systems, and human assistance scenarios.
Software accompanying a recent paper is published on GitHub.