Socially Acceptable Navigation



Project Description


Socially Acceptable Navigation by Steven Chen, Michael Everett, and Justin Miller


Autonomous (self-driving) vehicles are increasingly being tested on highways and city streets, but there is also a need for robots that can navigate environments like sidewalks, buildings, and hallways. In these settings, robots must interact and cooperate with pedestrians in a socially acceptable manner. The "rules of the road" no longer apply: there are no lanes or street signs, and pedestrians don't use turn signals when cutting through crowds. The way people walk even changes from city to city!

This project focuses on the perception and path planning of a robot that can handle these environments. The robot uses sensors such as LiDAR and cameras to determine (see the sketch after this list):

  • where the robot is on a map
  • what objects are pedestrians
  • where those pedestrians intend to walk
  • what path will get the robot to its goal while remaining socially acceptable
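
The sketch below shows how these four steps might fit together in a single planning loop. It is a minimal illustration under stated assumptions, not the project's actual system: the constant-velocity intent model, the hand-tuned social cost, and all names (Pedestrian, social_cost, choose_waypoint, comfort_radius) are hypothetical and chosen only for this example.

    # Minimal sketch of the perception-to-planning loop: predict where nearby
    # pedestrians intend to walk, then pick the candidate waypoint that reaches
    # the goal without cutting through their personal space.
    # All names and parameters here are illustrative, not the project's code.
    from dataclasses import dataclass
    from typing import List, Tuple
    import math

    @dataclass
    class Pedestrian:
        position: Tuple[float, float]   # current (x, y) on the map
        velocity: Tuple[float, float]   # estimated (vx, vy)

    def predict_position(p: Pedestrian, horizon: float) -> Tuple[float, float]:
        """Constant-velocity guess at where the pedestrian intends to walk."""
        return (p.position[0] + p.velocity[0] * horizon,
                p.position[1] + p.velocity[1] * horizon)

    def social_cost(waypoint: Tuple[float, float],
                    goal: Tuple[float, float],
                    pedestrians: List[Pedestrian],
                    horizon: float = 2.0,
                    comfort_radius: float = 1.0) -> float:
        """Toy cost: distance to goal plus a penalty for passing too close to
        where each pedestrian is predicted to be."""
        cost = math.dist(waypoint, goal)
        for p in pedestrians:
            gap = math.dist(waypoint, predict_position(p, horizon))
            if gap < comfort_radius:
                cost += (comfort_radius - gap) * 10.0  # personal-space penalty
        return cost

    def choose_waypoint(candidates: List[Tuple[float, float]],
                        goal: Tuple[float, float],
                        pedestrians: List[Pedestrian]) -> Tuple[float, float]:
        """Pick the candidate waypoint with the lowest combined cost."""
        return min(candidates, key=lambda w: social_cost(w, goal, pedestrians))

    if __name__ == "__main__":
        # One pedestrian walking toward the robot; two candidate waypoints.
        peds = [Pedestrian(position=(2.0, 0.0), velocity=(-0.5, 0.0))]
        options = [(1.0, 0.0), (1.0, 1.5)]   # straight ahead vs. step aside
        print(choose_waypoint(options, goal=(5.0, 0.0), pedestrians=peds))

In this toy example the planner chooses to step aside, (1.0, 1.5), rather than walk straight through the spot the oncoming pedestrian is predicted to occupy.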

The robot is often tested in MIT's Building 32 during peak lunchtime traffic. Possible applications of this technology include delivery of goods, mobility-on-demand (MOD) systems, and human assistance scenarios. Future work will consider novel path planning strategies and the utility of new sensors for this application.


Videos



Related publications

Conference Papers