Real-Time Appearance-Based Mapping (RTAB-Map) for robot SLAM and localization, implemented with ROS. It is used together with AMCL localization and move_base path planning.
After the robot has visited all accessible areas of the world, the generated RTAB-Map looks like this: the generated map can be downloaded from here (332 MB!).
The image shows that 5 loop closures were achieved (G=5, shown in the bottom-left corner of the screen).
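If you want to inspect the loop closures yourself, the map database can be opened with the standalone database viewer that ships with RTAB-Map. This is only a sketch: it assumes the database is at RTAB-Map's default location ~/.ros/rtabmap.db (replace the path with the downloaded file if you fetched the map instead):

rtabmap-databaseViewer ~/.ros/rtabmap.db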
Dependency packages must be installed on your computer. If you are using Lubuntu, you can install them as follows:
For other systems, please check the official installation instructions.
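One way to install the RTAB-Map ROS package is shown below. This assumes a Debian-based system with ROS Kinetic; substitute your ROS distribution name (e.g. melodic) if it differs:

sudo apt-get install ros-kinetic-rtabmap-ros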
To run the code, you need to build the project as follows:
catkin_make
source devel/setup.bash
roslaunch my_robot world.launch
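Once the world is up, you can check that the sensor topics RTAB-Map needs (laser scan, odometry, camera) are being published. This is only a quick sanity check; the exact topic names depend on the robot's URDF and launch files, and /scan is used here as the typical laser topic name:

rostopic list
rostopic hz /scan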
By default, the robot is launched with a static map loaded and is able to localize itself using this map. Path planning is also activated by default. To make the robot actually use the RTAB-Map capability, you need to run the robot in mapping or location state, as described below.
Mapping state makes the robot generate a new map each time the code is run. To launch the robot in mapping state, run the following:
roslaunch my_robot world.launch
roslaunch my_robot mapping.launch
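While mapping, the robot has to be driven around the world so RTAB-Map can cover all areas and detect loop closures. One way to do this manually, assuming the teleop_twist_keyboard package is installed and the robot listens on the standard cmd_vel topic, is:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py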
Location state allows the robot to use the previously generated map to localize itself and move around the world. To launch the robot in location state, run the following:
roslaunch my_robot world.launch
roslaunch my_robot location.launch
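With move_base running, a navigation goal can also be sent from the command line instead of RViz. This is only an illustration: it assumes move_base subscribes to the default /move_base_simple/goal topic and that the fixed frame is named map; the pose values are arbitrary:

rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}'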
Used packages:
- RTAB-Map
  - 2D Laser, providing sensor_msgs/LaserScan messages
  - Odometry sensors, providing nav_msgs/Odometry messages
  - 3D Camera, compatible with the openni_launch, openni2_launch or freenect_launch ROS packages
- Udacity Robotics Software Engineer Nanodegree Program