Coexistence of humans and robots is a key concept in robot technology for improving quality of life. Both explicit and implicit coexistence are important; for example, cooperative work with a robot is explicit, while work performed unnoticed, out of the person's sight, is implicit. This demonstration realizes a human-following robot with a sneaking behavior. The robot tracks and follows a person using a 2D LiDAR. While the person is stopped, the robot hides in the person's blind area. This algorithm could be useful for cleaning robots that work without disturbing people, and for security robots that track a suspicious person.
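As a rough illustration of the tracking step only, the following sketch clusters a 2D LiDAR scan into point groups and associates the person with the cluster centroid nearest to the previous estimate. The gap threshold, minimum range, and nearest-centroid association are simplifying assumptions for illustration, not the demo's actual implementation.

```python
import math

def clusters_from_scan(ranges, angle_min, angle_inc, gap=0.3):
    """Split a 2D LiDAR scan into point clusters.

    Consecutive points farther apart than `gap` metres start a new
    cluster.  The gap value and the 0.05 m minimum-range filter are
    illustrative assumptions.
    """
    points = [(r * math.cos(angle_min + i * angle_inc),
               r * math.sin(angle_min + i * angle_inc))
              for i, r in enumerate(ranges) if r > 0.05]
    if not points:
        return []
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) > gap:
            clusters.append(current)
            current = []
        current.append(q)
    clusters.append(current)
    return clusters

def track_person(clusters, last_pos, max_jump=0.5):
    """Pick the cluster centroid closest to the person's last position,
    rejecting implausibly large jumps between scans."""
    centroids = [(sum(x for x, _ in c) / len(c),
                  sum(y for _, y in c) / len(c)) for c in clusters]
    best = min(centroids, key=lambda c: math.dist(c, last_pos))
    return best if math.dist(best, last_pos) < max_jump else last_pos
```

In practice the person's cluster would also be validated by size and shape (e.g. leg patterns), and the hiding behavior would additionally need the person's facing direction to compute the blind area.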
For autonomous mobile robots, removing an obstacle that blocks the road is a serious problem. In this demonstration, the robot examines whether an obstacle can be moved. When it is movable, the robot removes it from the way.
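The abstract does not say how movability is judged; one plausible heuristic, sketched below purely as an assumption, is a gentle trial push: the obstacle counts as movable if it actually displaced a sufficient fraction of the commanded push distance.

```python
def classify_obstacle(cmd_dist, measured_disp, ratio=0.5):
    """Classify a blocking obstacle after a gentle trial push.

    Illustrative assumption only: the obstacle is "movable" if it moved
    at least `ratio` of the commanded push distance `cmd_dist` (metres),
    as measured by odometry or an external sensor.
    """
    return "movable" if measured_disp >= ratio * cmd_dist else "fixed"
```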
This robot navigates users to their destination just like a guide dog, but it is not designed specifically for the blind. Rather, it is for people who want to be freed from constantly planning their path to the goal in their head. Transit time is one of the biggest wastes of time in the world (at least in my opinion). This system helps users spend that time more productively, such as talking with other people or thinking up good ideas, while simply walking to their destination. To use the system, users register the location of their destination and the time they want to arrive on Google Calendar. When the time comes, the robot urges its owner to start moving. The user places a smartphone, with a dedicated app running, in the handlebar mounted on the robot. When the handlebar is tilted forward, the robot moves toward the destination, automatically avoiding obstacles along the way. Thus users are safely guided to their destination without even noticing.
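The tilt-to-move interface can be pictured as a simple mapping from handlebar pitch to forward speed. The deadband, maximum tilt, and speed limit below are illustrative assumptions, not values from the actual demo.

```python
def tilt_to_velocity(pitch_deg, deadband=5.0, max_tilt=25.0, max_speed=1.2):
    """Map handlebar pitch (degrees, forward positive) to forward speed (m/s).

    Tilts inside the deadband command zero speed, so small unintentional
    motions do not move the robot; beyond the deadband the speed grows
    linearly and saturates at `max_speed`.  All thresholds are
    hypothetical for this sketch.
    """
    if pitch_deg <= deadband:
        return 0.0
    scale = min(pitch_deg - deadband, max_tilt - deadband) / (max_tilt - deadband)
    return max_speed * scale
```

A linear ramp with a deadband is a common choice for hand-held interfaces because it feels proportional while filtering out sensor noise near neutral.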
For autonomous navigation, a mobile robot must be given a map of the environment and instructed on its route. This requirement makes it difficult to set up an autonomous navigation robot quickly. However, as a mobile robot researcher, I want to be able to demonstrate autonomous navigation anytime and anywhere. This demonstration shows mapping and route instruction completed within 5 minutes, after which the robot drives the route autonomously. The demonstration proceeds in three steps: first, the robot is driven manually to build the map; second, the autonomous navigation route is edited; third, the robot navigates autonomously. The map stores the observation probability of laser scan points for robust localization and is built automatically while the robot is driven manually. The route is edited easily with a GUI route editor. During autonomous navigation, the robot moves along the designated route while estimating its own position by matching laser scanner readings against the map.
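The map-matching localization step can be sketched minimally: transform the scan points into the map frame under a candidate pose, sum the map's observation probabilities at those points, and keep the best-scoring pose. The grid representation, brute-force candidate search, and cell size are simplifying assumptions; a real system would search locally around an odometry prediction.

```python
import math

def scan_score(grid, cell, pose, scan_points):
    """Sum map likelihoods of scan points transformed into the map frame.

    `grid[i][j]` holds the observation probability of a laser point in
    cell (i, j); `cell` is the cell size in metres; `pose` is (x, y, theta).
    """
    x, y, th = pose
    score = 0.0
    for px, py in scan_points:
        mx = x + px * math.cos(th) - py * math.sin(th)
        my = y + px * math.sin(th) + py * math.cos(th)
        i, j = math.floor(mx / cell), math.floor(my / cell)
        if 0 <= i < len(grid) and 0 <= j < len(grid[0]):
            score += grid[i][j]
    return score

def localize(grid, cell, scan_points, candidates):
    """Return the candidate pose whose transformed scan best matches the map."""
    return max(candidates, key=lambda p: scan_score(grid, cell, p, scan_points))
```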
This robot wanders while avoiding obstacles, and speeds up when a chaser approaches from behind. I used a web camera and a laser scanner as external sensors. The robot determines its route with the potential method, using the laser scanner to measure distances to surrounding objects. It captures the rear view with the web camera and sets its translational velocity according to the estimated distance to the chaser. I wrote only about 300 lines for this program myself, but it works fine. If you like robots, you may like its program too.
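The two behaviors above can be pictured as a classic potential field for steering plus a distance-based speed schedule for escaping. The gains, influence radius, and speed thresholds are illustrative assumptions, not the demo's actual parameters.

```python
import math

def potential_direction(scan_points, goal, repulse_gain=0.5, influence=1.5):
    """Choose a heading by summing a unit attractive pull toward `goal`
    with repulsive pushes away from nearby laser points (potential
    field method; gains are assumed for this sketch).

    Points and goal are in the robot frame; returns a heading in radians.
    """
    gx, gy = goal
    gd = math.hypot(gx, gy)
    fx, fy = gx / gd, gy / gd          # unit attraction toward the goal
    for ox, oy in scan_points:
        d = math.hypot(ox, oy)
        if 0.0 < d < influence:        # only nearby obstacles repel
            w = repulse_gain * (1.0 / d - 1.0 / influence) / d
            fx -= w * ox / d
            fy -= w * oy / d
    return math.atan2(fy, fx)

def escape_speed(chaser_dist, base=0.3, max_speed=1.0, alarm=3.0):
    """Speed up linearly as the chaser (estimated from the rear camera)
    closes within `alarm` metres; cruise at `base` otherwise."""
    if chaser_dist >= alarm:
        return base
    return base + (max_speed - base) * (1.0 - chaser_dist / alarm)
```

With no obstacles the robot heads straight at the goal; an obstacle to the left-front bends the heading to the right, and a closing chaser raises the commanded speed toward its maximum.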