How to use the Demo
To demonstrate the operation of the rhem_planner stack, a set of demo configuration and launch files are provided under the rhem_demo folder.
- icra17_visensor.info : configuration settings for the Exploration & the Propagation pipelines of the rovio_bsp package.
- icra17_visensor_cam[0/1].yaml : [left/right] camera calibration files.
- icra17_bsp_settings.yaml : settings for the planner.
- rhem_demo_icra17.launch : launch file for the rhem_planner stack demo.
These settings correspond to the experimental setup used for the ICRA 2017 paper "Uncertainty-aware Receding Horizon Exploration and Mapping Using Aerial Robots".
Also, a demo dataset file with camera images, inertial data, and stereo-based pointclouds is provided for use with the rhem_planner demo.
The dataset file corresponds to just the visual and inertial data of the full ICRA 2017 dataset (which you can download here); the full dataset also contains the planning and mapping results as they occurred during the original experiment.
To launch the demo, first download icra17_visensor_data.bag and place it under the rhem_demo folder of the stack. Navigate to the source folder of your ROS workspace and:

```
cd rhem_planner
wget -P rhem_demo https://www.cse.unr.edu/%7Ekalexis/datasets/icra2017-datasets/icra17_visensor_data.bag
```

Then launch the rhem_demo_icra17.launch file:

```
roslaunch rhem_demo/rhem_demo_icra17.launch
```

The rhem_demo_icra17.launch file will automatically perform the following:
- Launch rviz with the provided rhem.rviz visualization options, and set up ROS parameters such as use_sim_time.
- Play back the provided visual-inertial icra17_visensor_data.bag dataset.
- Pass the respective experimental setup configuration settings and launch rhem_exploration.launch.
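The launch steps above can also be sketched manually from a terminal. The following is a rough, simplified equivalent, assuming a sourced ROS environment and the demo's file layout (the rhem_demo/rhem.rviz path is an assumption; the real launch file additionally passes the experimental configuration files):

```shell
# Rough manual equivalent of rhem_demo_icra17.launch (a sketch only; the
# real launch file also forwards the configuration settings).
BAG=rhem_demo/icra17_visensor_data.bag
RVIZ_CFG=rhem_demo/rhem.rviz            # path assumed for illustration
if command -v roslaunch >/dev/null 2>&1; then
  rosparam set use_sim_time true        # bag playback provides the clock
  rosbag info "$BAG"                    # optional: inspect topics and duration
  rviz -d "$RVIZ_CFG" &                 # visualization
  rosbag play --clock "$BAG"            # replay the visual-inertial dataset
else
  echo "ROS tools not found; source your ROS environment first"
fi
```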
At the beginning you will see the results of the Estimation pipeline, namely the basic transforms, the camera frustum (a single stereo-depth camera is considered for simplicity), and bearing arrows to the 3D-landmark positions (expected values derived from the features).
You will also see the octomap result of occupancy mapping based on the pointclouds. Alternatively, you may also check the stereo_pointcloud topic to directly see the resulting world-registered pointclouds.
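To probe the mapping input directly, you can inspect the pointcloud stream from another terminal while the demo is running. A small sketch, where the absolute topic name /stereo_pointcloud is an assumption based on the topic name above:

```shell
# Probe the world-registered pointcloud stream (requires the running demo).
PC_TOPIC=/stereo_pointcloud             # absolute name assumed for illustration
if command -v rostopic >/dev/null 2>&1; then
  timeout 10 rostopic hz "$PC_TOPIC" || true            # publish rate
  timeout 10 rostopic echo -n 1 --noarr "$PC_TOPIC" || true  # one message, arrays elided
else
  echo "rostopic unavailable; run inside a sourced ROS environment"
fi
```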
During this demo, real visual-inertial data (camera images, IMU measurements, and pointclouds) are replayed exactly as they occurred during the recorded experiment. Therefore, offline planning cannot affect the evolution of the exploration and mapping process.
However, the RHEM planning pipeline can be triggered at any point. While the rhem_demo_icra17.launch file is running, you can do this by calling the respective ROS service in a terminal:

```
rosservice call /bsp_planner '{header: {stamp: now, frame_id: world}}'
```

This will invoke the 2-layer planning process as described in this wiki:
The first layer is the Next-Best-View volumetric exploration & probabilistic reobservation layer. The respective topics are:
- nbv_rrt : The full set of RRT-sampled trajectories.
- nbv_best : The best cumulative information gain trajectory that is selected.
- nbv_stats : Annotation of each viewpoint-vertex with statistics (black: cumulative trajectory gain up to this node, blue: unmapped-volume exploration gain of this node, green: mapped-volume re-observation gain of this node; for other extra fields, see rrt.hpp).
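Once the service has been called, the first layer's outputs can be examined from a terminal as well as in rviz. A sketch, assuming the topics above are published under absolute names and the demo is running:

```shell
# Peek at the first planning layer's outputs (requires the running demo
# and a prior call to /bsp_planner).
NBV_TOPICS="/nbv_rrt /nbv_best /nbv_stats"       # names per the list above
if command -v rostopic >/dev/null 2>&1; then
  rosservice type /bsp_planner || true           # inspect the service type first
  for t in $NBV_TOPICS; do
    timeout 10 rostopic echo -n 1 --noarr "$t" || true  # one marker message each
  done
else
  echo "ROS tools unavailable; run inside a sourced ROS environment"
fi
```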
The second layer is the Uncertainty-aware Planning layer. It takes the first edge of nbv_best and resamples random trajectories in its vicinity, as visualized by:
- bsp_workspace : The replanning workspace, a 3D-ellipsoid around and along the first edge of nbv_best.
The replanning layer topics are:
- bsp_rrt : The full set of RRT-sampled trajectories.
- bsp_best : The best uncertainty-optimality gain trajectory that is selected.
- bsp_stats : Annotation of each viewpoint-vertex with statistics (black: D-optimality metric gain up to this node; other extra fields are not displayed, see rrt.hpp for details).
Also, for every edge a Propagation pipeline call is triggered, in order to calculate the resulting D-optimality metric. The respective topics are:
- bsp_imu : The forward-simulated transform (corresponds to the /imu transform of the Estimation pipeline).
- bsp_frustum : The respective camera frustum.
- bsp_bearing_arrows : The respective bearing arrows to the locally tracked 3D-landmarks. Color represents expected Line-of-Sight visibility (green: visible, red: occluded by octomap structure, blue: ray crosses unknown voxels - might be occluded, purple: both occluded and crosses unknown voxels).
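The second layer's topics can be probed the same way. A sketch under the same assumptions (absolute topic names, demo running, /bsp_planner triggered):

```shell
# Peek at the replanning layer and Propagation pipeline outputs.
BSP_TOPICS="/bsp_workspace /bsp_rrt /bsp_best /bsp_stats /bsp_imu /bsp_frustum /bsp_bearing_arrows"
if command -v rostopic >/dev/null 2>&1; then
  for t in $BSP_TOPICS; do
    timeout 10 rostopic echo -n 1 --noarr "$t" || true  # one message per topic
  done
else
  echo "ROS tools unavailable; run inside a sourced ROS environment"
fi
```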
Contact details: Christos Papachristos, Shehryar Khattak, Kostas Alexis, Autonomous Robots Lab