Tutorial 5: Autonomous Mapping
samaahitabelavadi edited this page Jul 5, 2017
-
Required ROS Packages
- For using the Realsense R200 camera in Gazebo
  - realsense_gazebo_plugin - available in the project repository (see Appendix)
- For registering the depth image to the colour image frame
  - depth_image_proc - available in the project repository (see Appendix)
  - image_common - available in the project repository (see Appendix)
- For generating point clouds from the colour and registered depth images
  - rtabmap_ros - sudo apt-get install ros-indigo-rtabmap-ros
- For frontier-based exploration
  - autonomous_exploration - available in the project repository (see Appendix)
- For path planning
  - move_it - sudo apt-get install ros-indigo-moveit
- For keyboard control of the ARDrone
  - cvg_sim_gazebo - available in the project repository (see Appendix)
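The point-cloud generation step above (handled internally by rtabmap_ros) boils down to back-projecting each registered depth pixel into a 3D point with the pinhole camera model. A minimal sketch of that idea, using made-up illustrative intrinsics rather than the Realsense R200's real calibration:

```python
# Conceptual sketch of depth-to-point-cloud conversion, as performed
# internally by rtabmap_ros. The intrinsics (fx, fy, cx, cy) below are
# illustrative values, not the Realsense R200's actual calibration.

def depth_to_points(depth, fx, fy, cx, cy):
    """depth: 2D list of metres; returns a list of (x, y, z) camera-frame points."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0.0:               # skip invalid / missing depth readings
                continue
            x = (u - cx) * z / fx      # pinhole back-projection
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Tiny 2x2 depth image; one pixel (value 0.0) has no valid depth
pts = depth_to_points([[1.0, 2.0], [0.0, 4.0]], fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(pts)  # 3 valid points, e.g. (-0.5, -0.5, 1.0) for the first pixel
```

In the real pipeline the colour value of the corresponding colour pixel is attached to each point, which is why the depth image must first be registered to the colour frame.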
-
Required Files
- ardrone_realsense.launch (realsense_gazebo_plugin/launch) - Launch Gazebo7 and load the simulated world
- register.launch (realsense_gazebo_plugin/launch) - Launch a nodelet to register the depth image to the colour image frame
- pub_camera_info.py (realsense_gazebo_plugin/scripts) - Run a script to publish fake depth and color camera info for the simulated cameras
- rtabmap_pcl.launch (realsense_gazebo_plugin/launch) - Launch nodelet to convert depth and color image to pointcloud
- ardrone_get_odometry.py (cvg_sim_gazebo/scripts) - Run a script to fetch pose of the ardrone in Gazebo and publish the tf
- moveit.launch (move_it/launch) - Launch the MoveIt! path planner
- send_goal.py (realsense_gazebo_plugin/scripts) - Start the script to send goals to MoveIt!
- server.py (realsense_gazebo_plugin/scripts) - Start the actionlib server to execute the waypoints from MoveIt!
- client.py (realsense_gazebo_plugin/scripts) - Start the actionlib client to execute the waypoints from MoveIt!
- keyboard.py (cvg_sim_gazebo/scripts) - Start the keyboard tele-op to control the drone in Gazebo
- fbet.launch (realsense_gazebo_plugin/launch) - Launch the node to generate goals for autonomous mapping
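The registration performed by register.launch (via depth_image_proc) back-projects each depth pixel, transforms it into the colour camera frame, and re-projects it onto the colour image plane. The sketch below illustrates that idea under a simplifying assumption of a pure-translation extrinsic between the two cameras; the real nodelet uses the full tf transform:

```python
# Conceptual sketch of depth-to-colour registration (depth_image_proc's
# register nodelet). A pure translation between the depth and colour
# cameras is assumed here for simplicity; all numbers are illustrative.

def register_depth(depth, K_d, K_c, t):
    """depth: HxW list of metres; K_*: (fx, fy, cx, cy) intrinsics;
    t: (tx, ty, tz) depth->colour translation.
    Returns a depth image expressed in the colour camera's frame."""
    fx_d, fy_d, cx_d, cy_d = K_d
    fx_c, fy_c, cx_c, cy_c = K_c
    h, w = len(depth), len(depth[0])
    out = [[0.0] * w for _ in range(h)]
    for v in range(h):
        for u in range(w):
            z = depth[v][u]
            if z <= 0.0:
                continue
            # back-project in the depth camera frame
            x = (u - cx_d) * z / fx_d
            y = (v - cy_d) * z / fy_d
            # transform into the colour camera frame
            x, y, z = x + t[0], y + t[1], z + t[2]
            # re-project onto the colour image plane
            uc = int(round(fx_c * x / z + cx_c))
            vc = int(round(fy_c * y / z + cy_c))
            if 0 <= uc < w and 0 <= vc < h:
                out[vc][uc] = z  # real code keeps the closest z per pixel
    return out

# Sanity check: identical intrinsics and zero baseline leave the image unchanged
same = register_depth([[1.0, 1.0]], (1.0, 1.0, 0.5, 0.5),
                      (1.0, 1.0, 0.5, 0.5), (0.0, 0.0, 0.0))
print(same)
```

With a non-zero baseline, depth values shift to different colour pixels, which is exactly the misalignment the registration step corrects before point-cloud generation.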
-
Procedure
- Launch ardrone_realsense.launch present in realsense_gazebo_plugin package to start the simulated world, which contains the ARDrone with a Realsense R200 camera mounted on it
- Launch register.launch present in realsense_gazebo_plugin package to register the depth image stream to the colour image stream
- Run the script pub_camera_info.py to publish fake camera metadata for the simulated Realsense R200 camera
- Launch rtabmap_pcl.launch present in realsense_gazebo_plugin package to generate point clouds from the depth and colour images
- Run the script ardrone_get_odometry.py present in cvg_sim_gazebo package to fetch the pose of the ARDrone in Gazebo and to publish the corresponding transform (tf)
- Launch moveit.launch present in move_it package to start the MoveIt! path planner
- Run the scripts server.py and client.py present in realsense_gazebo_plugin package to start the actionlib server and client to execute the waypoints from MoveIt!
- Run the script send_goal.py present in realsense_gazebo_plugin package to send goals to MoveIt!
- Launch fbet.launch present in realsense_gazebo_plugin package to generate goals for autonomous mapping
- Start the ARDrone in simulation using the keyboard tele-op script (keyboard.py) present in cvg_sim_gazebo package
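The goal generation launched by fbet.launch is frontier-based: a frontier is a known-free cell adjacent to unknown space, so repeatedly sending the drone toward frontiers keeps expanding the map until none remain. A minimal sketch of frontier detection on an occupancy grid; the cell values (0 = free, 1 = occupied, -1 = unknown) are an illustrative convention, not fbet's actual internal representation:

```python
# Sketch of the frontier idea behind frontier-based exploration (fbet).
# Grid convention assumed here: 0 = free, 1 = occupied, -1 = unknown.

def find_frontiers(grid):
    """grid: 2D list; returns (row, col) of free cells bordering unknown cells."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:          # only free cells can be frontiers
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1
                   for nr, nc in neighbours):
                frontiers.append((r, c))
    return frontiers

grid = [[0, 0, -1],
        [0, 1, -1],
        [0, 0,  0]]
print(find_frontiers(grid))  # free cells touching the unknown right column
```

In the full pipeline, a detected frontier becomes a goal pose for MoveIt!, whose planned waypoints are then executed through the actionlib server and client started earlier.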
-
Appendix
- GitHub link for packages - https://github.com/eYSIP-2017/eYSIP-2017_Indoor-Environments-Mapping-using-UAV
- Bash script to install Gazebo7 - https://github.com/eYSIP-2017/eYSIP-2017_Indoor-Environments-Mapping-using-UAV/blob/master/bash_scripts/install_gazebo7.sh
- Bash Script to launch all nodes and scripts for autonomous mapping - https://github.com/eYSIP-2017/eYSIP-2017_Indoor-Environments-Mapping-using-UAV/blob/master/bash_scripts/autonomous_mapping.sh
- Full video link - https://youtu.be/kXyV3OpbWo8
- Video for explanation of the autonomous mapping algorithm - https://youtu.be/Ow4pZlDPhkY