Package Summary
RTAB-Map's standalone library. RTAB-Map is a RGB-D SLAM approach with real-time constraints.
- Maintainer status: maintained
- Maintainer: Mathieu Labbe <matlabbe AT gmail DOT com>
- Author: Mathieu Labbe
- License: BSD
- External website: http://introlab.github.io/rtabmap
- Bug / feature tracker: https://github.com/introlab/rtabmap/issues
- Source: git https://github.com/introlab/rtabmap.git (branches: hydro-devel, indigo-devel, jade-devel, kinetic-devel, lunar-devel, melodic-devel, noetic-devel)
Overview
Under construction! This wiki is being moved from the googlecode wiki.
Author(s): Mathieu Labbé
Maintainer(s): Mathieu Labbé
License: BSD
Web page: http://introlab.github.io/rtabmap
Source: https://github.com/introlab/rtabmap-ros-pkg
Dependencies: RTAB-Map's standalone library (https://github.com/introlab/rtabmap)
Description: RTAB-Map (Real-Time Appearance-Based Mapping) is an RGB-D Graph SLAM approach based on a global Bayesian loop closure detector. The loop closure detector uses a bag-of-words approach to determine how likely a new image comes from a previous location or a new location. When a loop closure hypothesis is accepted, a new constraint is added to the map's graph, then a graph optimizer minimizes the errors in the map. A memory management approach is used to limit the number of locations used for loop closure detection and graph optimization, so that real-time constraints on large-scale environments are always respected. RTAB-Map can be used alone with a hand-held Kinect or stereo camera for 6DoF RGB-D mapping, or on a robot equipped with a laser rangefinder for 3DoF mapping.
Tutorials
Demos
Robot mapping
For this demo, you will need the ROS bag demo_mapping.bag:
Launch: demo_robot_mapping.launch
$ roslaunch rtabmap demo_robot_mapping.launch
$ rosbag play --clock demo_mapping.bag
Robot mapping with RVIZ
For this demo, you will also need the ROS bag demo_mapping.bag:
Launch: demo_robot_mapping_rviz.launch
$ roslaunch rtabmap demo_robot_mapping_rviz.launch
$ rosbag play --clock demo_mapping.bag
Multi-session mapping
For this demo, you will need the ROS bags of five mapping sessions (map1.bag to map5.bag):
For the first launch, you can do "Edit->Delete memory" to make sure that you start from a clean memory.
Launch: demo_multi-session_mapping.launch
$ roslaunch rtabmap demo_multi-session_mapping.launch
$ rosbag play --clock map1.bag
$ (...)
$ rosbag play --clock map2.bag
$ (...)
$ rosbag play --clock map3.bag
$ (...)
$ rosbag play --clock map4.bag
$ (...)
$ rosbag play --clock map5.bag
Robot mapping with Find-Object
Find-Object's ros-pkg find_object_2d should be installed.
ROS Bag: demo_find_object.bag.zip
Launch: demo_find_object.launch
$ roslaunch rtabmap demo_find_object.launch
$ rosbag play --clock demo_find_object.bag
Stereo
Visit the StereoOutdoorDemo page for more detailed information. It also shows how to create a 2D occupancy grid map for navigation.
Launch: demo_stereo_outdoor.launch
$ roslaunch rtabmap demo_stereo_outdoor.launch
$ rosbag play --clock stereo_outdoorA.bag
[...]
$ rosbag play --clock stereo_outdoorB.bag
Appearance-based loop closure detection-only
Launch: demo_appearance_mapping.launch
$ roslaunch rtabmap demo_appearance_mapping.launch
The GUI shows plenty of information about the detected loop closures. If you only need the ID of the matched past image of a loop closure, you can echo the corresponding topic:
$ rostopic echo /rtabmap/info/loopClosureId
6
---
0
---
7
---
A "0" means no loop closure detected. This can be also used in localization mode:
$ roslaunch rtabmap demo_appearance_localization.launch
Nodes
rtabmap
This is the main node of this package. It is a wrapper of the RTAB-Map library. This is where the graph of the map is incrementally built and optimized when a loop closure is detected. The online output of the node is the local graph with the latest data added to the map (along with RTAB-Map statistics). The default location of the RTAB-Map database is "~/.ros/rtabmap.db" and the workspace is also set to "~/.ros" (use the RTAB-Map parameters Rtabmap/DatabasePath and Rtabmap/WorkingDirectory to change the defaults). The argument "--delete_db_on_start" makes rtabmap delete the database before starting; otherwise, the previous mapping session is loaded.
There are two sets of parameters: ROS parameters and RTAB-Map parameters. The ROS parameters handle the connections that interface the RTAB-Map library with ROS. The RTAB-Map parameters are those of the RTAB-Map library itself.
One way to see the RTAB-Map library parameters is to look at the file Parameters.h, where each parameter is described. Here is an example (use the string type for all RTAB-Map parameters):
<launch>
  <node name="rtabmap" pkg="rtabmap" type="rtabmap">
    <param name="RGBD/ToroIterations" type="string" value="50"/>
  </node>
</launch>
Another way to list the available parameters from the terminal is to call the node with the "--params" argument:
$ rosrun rtabmap rtabmap --params
Subscribed Topics
- odom (nav_msgs/Odometry): Odometry stream. Required if parameter subscribe_depth or subscribe_stereo is true.
- rgb/image (sensor_msgs/Image): RGB/Mono rectified image.
- rgb/camera_info (sensor_msgs/CameraInfo): RGB camera metadata.
- depth/image (sensor_msgs/Image): Registered depth image.
- scan (sensor_msgs/LaserScan): Laser scan stream. Required if parameter subscribe_laserScan is true.
Published Topics
- info (rtabmap/Info): RTAB-Map's basic info.
- infoEx (rtabmap/InfoEx): RTAB-Map's extended info.
- mapData (rtabmap/MapData): RTAB-Map's graph and latest node data.
Services
- get_map (rtabmap/GetMap): Call this service to get the map data.
- update_parameters (std_srvs/Empty): The node will update with the current parameters of the rosparam server.
- reset (std_srvs/Empty): Delete the map.
- pause (std_srvs/Empty): Pause mapping.
- resume (std_srvs/Empty): Resume mapping.
- trigger_new_map (std_srvs/Empty): The node will begin a new map.
- set_mode_localization (std_srvs/Empty): Set localization mode.
- set_mode_mapping (std_srvs/Empty): Set mapping mode.
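Except get_map, these services are of type std_srvs/Empty, so they can be called without arguments. A minimal sketch, assuming the node runs in the rtabmap namespace as in the demo launch files:
$ rosservice call /rtabmap/pause
$ rosservice call /rtabmap/resume
$ rosservice call /rtabmap/set_mode_localization
$ rosservice call /rtabmap/trigger_new_map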
Parameters
- ~subscribe_depth (bool, default: "true"): Subscribe to depth image.
- ~subscribe_laserScan (bool, default: "false"): Subscribe to laser scan.
- ~subscribe_stereo (bool, default: "false"): Subscribe to stereo images.
- ~frame_id (string, default: "base_link"): The frame attached to the mobile base.
- ~map_frame_id (string, default: "map"): The frame attached to the map.
- ~queue_size (int, default: 10): Size of message queue for each synchronized topic.
- ~tf_delay (double, default: 0.05): Rate at which the TF from /map to /odom is published (20 Hz).
- ~rgb/image_transport (string, default: "raw"): Image transport format.
- ~depth/image_transport (string, default: "raw"): Depth image transport format.
Required tf Transforms
- base_link → <the frame attached to sensors of incoming data>: usually a fixed value, broadcast periodically by a robot_state_publisher, or a tf static_transform_publisher.
- odom → base_link: usually provided by the odometry system (e.g., the driver for the mobile base).
Provided tf Transforms
- map → odom: the current estimate of the robot's pose within the map frame.
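To show how the topics, parameters and TF requirements fit together, here is a hedged launch sketch: the camera topic names assume an openni_launch-style driver, and the static base_link → camera_link transform values are placeholders to adapt to your robot. Odometry is expected from another node (e.g., rgbd_odometry below).
<launch>
  <!-- Fixed transform between the robot base and the camera (values are placeholders) -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_camera_tf"
        args="0 0 0.5 0 0 0 base_link camera_link 100"/>

  <node ns="rtabmap" name="rtabmap" pkg="rtabmap" type="rtabmap" output="screen" args="--delete_db_on_start">
    <param name="frame_id" type="string" value="base_link"/>
    <param name="subscribe_depth" type="bool" value="true"/>

    <!-- Topic names below assume an openni_launch-style driver; adapt to your setup -->
    <remap from="odom" to="/odom"/>
    <remap from="rgb/image" to="/camera/rgb/image_rect_color"/>
    <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
    <remap from="depth/image" to="/camera/depth_registered/image_raw"/>
  </node>
</launch>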
rtabmapviz
This node starts the visualization interface of RTAB-Map. It is a wrapper of the RTAB-Map GUI library. It has the same purpose as rviz but with specific options for RTAB-Map.
Arguments
- *-d "rtabmap.ini"*
- Set a RTAB-Map ini file for GUI interface parameters.
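For example (a sketch; the ini file path is an assumption and the executable is assumed to be named like this section):
$ rosrun rtabmap rtabmapviz -d ~/.ros/rtabmap.ini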
Subscribed Topics
- odom (nav_msgs/Odometry): Odometry stream. Required if parameter subscribe_depth, subscribe_laserScan or subscribe_stereo is true.
- rgb/image (sensor_msgs/Image): RGB/Mono rectified image.
- rgb/camera_info (sensor_msgs/CameraInfo): RGB camera metadata. Required if parameter subscribe_depth is true.
- depth/image (sensor_msgs/Image): Registered depth image. Required if parameter subscribe_depth is true.
- scan (sensor_msgs/LaserScan): Laser scan stream. Required if parameter subscribe_laserScan is true.
- infoEx (rtabmap/InfoEx): RTAB-Map's extended info.
- mapData (rtabmap/MapData): RTAB-Map's graph and latest node data.
Published Topics
- info (rtabmap/Info): RTAB-Map's basic info.
- infoEx (rtabmap/InfoEx): RTAB-Map's extended info.
- mapData (rtabmap/MapData): RTAB-Map's graph and latest node data.
Parameters
- ~subscribe_depth (bool, default: "true"): Subscribe to depth image.
- ~subscribe_laserScan (bool, default: "false"): Subscribe to laser scan.
- ~frame_id (string, default: "base_link"): The frame attached to the mobile base.
- ~queue_size (int, default: 10): Size of message queue for each synchronized topic.
- ~rgb/image_transport (string, default: "raw"): Image transport format.
- ~depth/image_transport (string, default: "raw"): Depth image transport format.
Required tf Transforms
- base_link → <the frame attached to sensors of incoming data>: usually a fixed value, broadcast periodically by a robot_state_publisher, or a tf static_transform_publisher.
- odom → base_link: usually provided by the odometry system (e.g., the driver for the mobile base).
- map → odom: the current estimate of the robot's pose within the map frame.
rgbd_odometry
This node wraps the visual odometry approach of RTAB-Map. Using RGB-D images, odometry is computed from FAST/SIFT/SURF features extracted from the RGB images, with their depth taken from the depth images. Using the feature correspondences between consecutive images, a RANSAC approach computes the transformation between them. The approach is vision-based, so enough visual features must be present to compute the odometry. When the odometry cannot be updated, a null transformation is sent to notify the receiver that odometry is not updated.
There are two sets of parameters: ROS parameters and RTAB-Map parameters. The ROS parameters handle the connections that interface the RTAB-Map library with ROS. The RTAB-Map parameters are those of the RTAB-Map library itself.
See these examples in the sample launch files: test_odometry_SURF.launch, test_odometry_SIFT.launch, test_odometry_FAST-BRIEF.launch, test_odometry_FAST-FREAK.launch and test_odometry_ORB.launch.
Parameters available for odometry can be shown from the terminal by using the "--params" argument:
$ rosrun rtabmap visual_odometry --params
Subscribed Topics
- rgb/image (sensor_msgs/Image): RGB/Mono rectified image.
- rgb/camera_info (sensor_msgs/CameraInfo): RGB camera metadata.
- depth/image (sensor_msgs/Image): Registered depth image.
Published Topics
- odom (nav_msgs/Odometry): Odometry stream. Null odometry is sent if it cannot be computed.
Services
- reset_odom (std_srvs/Empty): Restart odometry to identity transformation.
Parameters
- ~subscribe_depth (bool, default: "true"): Subscribe to depth image.
- ~subscribe_laserScan (bool, default: "false"): Subscribe to laser scan.
- ~frame_id (string, default: "base_link"): The frame attached to the mobile base.
- ~queue_size (int, default: 10): Size of message queue for each synchronized topic.
- ~rgb/image_transport (string, default: "raw"): Image transport format.
- ~depth/image_transport (string, default: "raw"): Depth image transport format.
Required tf Transforms
- base_link → <the frame attached to sensors of incoming data>: usually a fixed value, broadcast periodically by a robot_state_publisher, or a tf static_transform_publisher.
Provided tf Transforms
- odom → base_link: the current estimate of the robot's pose within the odometry frame.
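A minimal launch sketch that runs visual odometry in front of rtabmap. The camera topic names are assumptions for an openni_launch-style driver, and the executable name may be rgbd_odometry or visual_odometry depending on the version (the "--params" example above uses visual_odometry):
<launch>
  <!-- Visual odometry from RGB-D images; publishes odom and the odom -> base_link TF -->
  <node pkg="rtabmap" type="rgbd_odometry" name="rgbd_odometry" output="screen">
    <param name="frame_id" type="string" value="base_link"/>
    <remap from="rgb/image" to="/camera/rgb/image_rect_color"/>
    <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
    <remap from="depth/image" to="/camera/depth_registered/image_raw"/>
  </node>
</launch>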
camera
A node for image acquisition from a USB camera (using OpenCV). A special option of this node is that it can be configured to read images from a directory or a video file. Parameters can be changed with the dynamic_reconfigure GUI from ROS.
Output topics
- image (sensor_msgs/Image)
Parameters
- frame_id (string, default: "camera")
For dynamic parameters, see Camera.cfg.
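For example, the dynamic parameters can be adjusted at runtime with the standard reconfigure GUI:
$ rosrun rqt_reconfigure rqt_reconfigure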
data_recorder
This node can be used to record data from an OpenNI camera to the RTAB-Map database format (".db"), which can then be read by the standalone application (Detection->Select Source->Database). It is like a ROS bag, but timing information is not saved.
Input topics
- odom (nav_msgs/Odometry): required if parameter subscribe_odometry is true
- scan (sensor_msgs/LaserScan): required if parameter subscribe_laserScan is true
- rgb/image (sensor_msgs/Image)
- depth/image (sensor_msgs/Image): required if parameter subscribe_depth is true
- rgb/camera_info (sensor_msgs/CameraInfo): required if parameter subscribe_depth is true
Parameters
- frame_id (string, default: "base_link")
- subscribe_odometry (bool, default: false)
- subscribe_depth (bool, default: false)
- subscribe_laserScan (bool, default: false)
- queue_size (int, default: 10): synchronization queue size
- output_file_name (string, default: "output.db")
- rgb/image_transport (string, default: "raw")
- depth/image_transport (string, default: "raw")
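A minimal recording sketch, assuming an openni_launch-style driver and that the executable is named data_recorder like this section:
<launch>
  <node pkg="rtabmap" type="data_recorder" name="data_recorder" output="screen">
    <param name="subscribe_depth" type="bool" value="true"/>
    <param name="output_file_name" type="string" value="output.db"/>
    <!-- Camera topic names are assumptions; adapt to your driver -->
    <remap from="rgb/image" to="/camera/rgb/image_rect_color"/>
    <remap from="depth/image" to="/camera/depth_registered/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
  </node>
</launch>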
map_assembler
This node subscribes to the rtabmap output topic mapData and assembles the 3D map incrementally, then publishes the assembled map.
Input topics
- mapData (rtabmap/MapData)
Output topics
- assembled_clouds (sensor_msgs/PointCloud2)
- assembled_scans (sensor_msgs/PointCloud2)
Parameters
- cloud_decimation (int, default: 4)
- cloud_max_depth (double, default: 4.0)
- cloud_voxel_size (double, default: 0.02)
- scan_voxel_size (double, default: 0.01)
grid_map_assembler
This node subscribes to the rtabmap output topic mapData and assembles the map incrementally using the laser scans, then publishes the occupancy grid map.
Input topics
- mapData (rtabmap/MapData)
Output topics
- grid_map (nav_msgs/OccupancyGrid)
Parameters
- scan_voxel_size (double, default: 0.01)
Services
- get_grid_map (nav_msgs/GetMap)
Nodelets
rtabmap/data_odom_sync
Synchronizes odometry with RGB-D images. Useful to correctly show clouds in RVIZ when the odometry refresh rate is low compared to the rate of the clouds to show.
Subscribed Topics
- odom_in (nav_msgs/Odometry): Odometry stream.
- rgb/image_in (sensor_msgs/Image): RGB image stream.
- depth/image_in (sensor_msgs/Image): Registered depth image stream.
- rgb/camera_info_in (sensor_msgs/CameraInfo): RGB camera metadata.
Published Topics
- odom_out (nav_msgs/Odometry): Odometry stream.
- rgb/image_out (sensor_msgs/Image): RGB image stream.
- depth/image_out (sensor_msgs/Image): Registered depth image stream.
- rgb/camera_info_out (sensor_msgs/CameraInfo): RGB camera metadata.
Parameters
- ~queue_size (int, default: 10): Size of message queue for each synchronized topic.
- ~rgb/image_transport (string, default: "raw"): Image transport format.
- ~depth/image_transport (string, default: "raw"): Depth image transport format.
data_throttle
Throttles the OpenNI data at a specified rate.
Input topics
- rgb/image_in (sensor_msgs/Image)
- depth/image_in (sensor_msgs/Image)
- rgb/camera_info_in (sensor_msgs/CameraInfo)
Output topics
- rgb/image_out (sensor_msgs/Image)
- depth/image_out (sensor_msgs/Image)
- rgb/camera_info_out (sensor_msgs/CameraInfo)
Parameters
- queue_size (int, default: 10)
- max_rate (double, default: 0 Hz)
- rgb/image_transport (string, default: "raw")
- depth/image_transport (string, default: "raw")
point_cloud_xyzrgb
Constructs a point cloud from RGB and depth images. Optional voxel filtering can be applied.
Input topics
- rgb/image (sensor_msgs/Image)
- depth/image (sensor_msgs/Image)
- rgb/camera_info (sensor_msgs/CameraInfo)
Output topics
- cloud (sensor_msgs/PointCloud2)
Parameters
- queue_size (int, default: 10)
- voxel_size (double, default: 0)
- rgb/image_transport (string, default: "raw")
- depth/image_transport (string, default: "raw")
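Like the other nodelets, point_cloud_xyzrgb is loaded into a nodelet manager. A hedged sketch, assuming the nodelet is registered as rtabmap/point_cloud_xyzrgb (following the rtabmap/data_odom_sync naming above) and openni_launch-style topic names:
<launch>
  <!-- Nodelet manager -->
  <node pkg="nodelet" type="nodelet" name="points_manager" args="manager"/>

  <!-- Build a voxel-filtered colored cloud from the RGB-D images -->
  <node pkg="nodelet" type="nodelet" name="point_cloud_xyzrgb"
        args="load rtabmap/point_cloud_xyzrgb points_manager">
    <param name="voxel_size" type="double" value="0.01"/>
    <remap from="rgb/image" to="/camera/rgb/image_rect_color"/>
    <remap from="depth/image" to="/camera/depth_registered/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
    <remap from="cloud" to="voxel_cloud"/>
  </node>
</launch>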
disparity_to_depth
Convert a disparity image to a depth image.
Input topics
- disparity (stereo_msgs/DisparityImage)
Output topics
- depth (sensor_msgs/Image): 32F depth image (m)
- depth_raw (sensor_msgs/Image): 16U depth image (mm)
RVIZ plugins
Map Cloud Display
http://rtabmap.googlecode.com/svn/trunk/doc/rgbd/MapCloudType.png
This rviz plugin subscribes to the /mapData (rtabmap/MapData) topic. A 3D map cloud is created incrementally in RVIZ. When the graph changes, all point clouds already added in RVIZ are transformed to the new poses.
Properties
http://rtabmap.googlecode.com/svn/trunk/doc/rgbd/MapCloudPlugin.png
It has the same properties as PointCloud2 with these new ones:
- Cloud decimation: How the input depth and RGB images are decimated before creating the point cloud (default 4).
- Cloud max depth (m): Maximum depth of each point cloud added to the map (default 4, 0 means no maximum).
- Cloud voxel size (m): Voxel size of the generated point clouds (default 0.01, 0 means no voxel filtering).
- Node filtering radius (m): (Disabled=0) Only keep one node in the specified radius.
- Node filtering angle (degrees): (Disabled=0) Only keep one node in the specified angle in the filtering radius.
- Download Map: Download the map from the rtabmap node. This may take a while if the map is big, be patient!
Info Display
This rviz plugin subscribes to the /info (rtabmap/Info) topic. Information about the detected loop closures is shown in the "Status" display.