Simulation lets us test our methods on robots safely and helps with debugging before real-world experiments.
Here we use PX4 with Gazebo to test autonomous flights. PX4 provides an Offboard mode for autonomous flight that allows an onboard computer to take control of the drone.
This first simulation case uses GPS and IMU as the source of the drone's navigation information.
First, read the pre-arm, arm, and disarm behaviour specified by PX4 (the official documentation on this is, frankly, not easy to read).
The arming and disarming logic determines the conditions under which a drone will be stopped. One such condition is whether a drone should be stopped, i.e. disarmed, if it receives no RC signal for a certain amount of time.
- Allow arming without an RC signal. We allow the drone to be armed in Offboard mode by setting
COM_RCL_EXCEPT = 4
- Change timeout parameters. Other parameters can be set to allow more time before failsafe actions are taken (see the sketch after this list), for example
COM_OF_LOSS_T = 10
COM_DISARM_LAND = 10
Setting COM_DISARM_LAND = -1 disables automatic disarming after landing.
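As a sketch, these parameters can be set from the PX4 SITL console (the pxh> prompt in the terminal running the simulation), or equivalently through QGroundControl:
param set COM_RCL_EXCEPT 4
param set COM_OF_LOSS_T 10
param set COM_DISARM_LAND 10
param save
Here param save persists the values so they survive a restart of the autopilot.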
There are several libraries developed for PX4 drones. Here are some good repositories:
- Jaeyoung-Lim/mavros_controllers, https://github.com/Jaeyoung-Lim/mavros_controllers
- uzh-rpg/rpg_quadrotor_control, https://github.com/uzh-rpg/rpg_quadrotor_control
- Start by creating a PX4 drone model in a Gazebo simulation:
roslaunch px4 posix_sitl.launch
- Run MAVROS to enable communication between PX4 and the PC through ROS:
roslaunch mavros px4.launch fcu_url:="udp://:[email protected]:14557"
- Apply a controller, for instance geometric_controller, to control the drone's flight:
roslaunch geometric_controller sitl_trajectory_track_circle.launch
Note: check whether the controller you run already launches MAVROS itself, so that MAVROS is not started twice.
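One way to check, assuming the standard MAVROS node and topic names, is to query ROS from another terminal:
rosnode list | grep mavros
rostopic echo -n 1 /mavros/state
If a mavros node is already listed, the controller's launch file starts MAVROS itself; connected: True in /mavros/state confirms that MAVROS is communicating with PX4.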
This second simulation case simulates a setting where the drone's navigation information comes from an external vision system, such as Vicon or Qualisys.
Let us take Vicon as an example.
We therefore need to configure PX4 so that it takes navigation information from Vicon instead of GPS and IMU. Tutorials are provided by PX4, i.e. EKF2 Tuning/Configuration and Using Vision or Motion Capture Systems for Position Estimation.
- Setting
EKF2_AID_MASK = bit 3 (vision position fusion) + bit 4 (vision yaw fusion)
tells the estimator that position and yaw information come from Vicon.
- Setting
EKF2_HGT_MODE = 3 (Vision)
tells the estimator that height information also comes from Vicon.
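As with the failsafe parameters above, a minimal sketch of applying this configuration from the PX4 SITL console (pxh>) is:
param set EKF2_AID_MASK 24
param set EKF2_HGT_MODE 3
param save
where 24 corresponds to enabling bits 3 and 4 of the bitmask (8 + 16), i.e. vision position fusion plus vision yaw fusion.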
If you don't want to read the details, I have provided the packages for you in drone_simulation_tools.
What you need to do is:
- Clone the packages into your catkin workspace's src directory:
git clone [email protected]:EEEManchester/drone_simulation_tools.git
- Build the packages:
catkin build
- Run the launch file:
roslaunch drone_simulation_tools drone_sim_vision_map_mavros.launch
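Putting these steps together, a minimal sketch (assuming a catkin workspace at ~/catkin_ws; adjust the path to your own setup) is:
cd ~/catkin_ws/src
git clone [email protected]:EEEManchester/drone_simulation_tools.git
cd ~/catkin_ws
catkin build
source devel/setup.bash
roslaunch drone_simulation_tools drone_sim_vision_map_mavros.launch
The source devel/setup.bash step makes the newly built packages visible to roslaunch.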
Finally, two nodes besides PX4 and MAVROS will be running:
- Package
fake_qualisys
fakes a Vicon system: it takes the drone's pose from Gazebo and publishes it to topics in the same way a Vicon system would.
- Package
mocap_to_mavros_sim
takes the drone's pose from Vicon (through a topic) and publishes it to /mavros/vision_pose/pose, feeding the Vicon information to the drone.
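To verify that the vision pipeline is feeding the estimator, assuming the standard MAVROS topic names, check that vision poses are streaming and that the EKF2 estimate follows them:
rostopic hz /mavros/vision_pose/pose
rostopic echo -n 1 /mavros/local_position/pose
A steady rate on /mavros/vision_pose/pose together with a sensible /mavros/local_position/pose indicates that PX4 is fusing the (fake) Vicon data.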