Description
Hi there,
For my project, I want to deploy this on a mobile base. For now, I am assuming the camera is fixed at the end-effector of a mobile manipulator, and I am using Gazebo + move_base to control a 2D mobile base. I am currently changing the code so that the conversion from Eigen to MultiDOFJointTrajectoryPoint becomes a conversion from Eigen to MoveBaseGoal.
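To be concrete, this is roughly the kind of conversion I have in mind (just a sketch, assuming the planner hands back an Eigen::Isometry3d per waypoint in the map frame; the function name and frame id are placeholders):

```cpp
#include <cmath>
#include <Eigen/Geometry>
#include <move_base_msgs/MoveBaseGoal.h>
#include <ros/time.h>
#include <tf/transform_datatypes.h>

// Sketch: collapse one Eigen waypoint from the planner into a 2D MoveBaseGoal.
// Only x, y and yaw are kept; z, roll and pitch are dropped for the planar base.
move_base_msgs::MoveBaseGoal toMoveBaseGoal(const Eigen::Isometry3d& waypoint,
                                            const ros::Time& stamp)
{
  move_base_msgs::MoveBaseGoal goal;
  goal.target_pose.header.frame_id = "map";  // placeholder frame id
  goal.target_pose.header.stamp = stamp;

  goal.target_pose.pose.position.x = waypoint.translation().x();
  goal.target_pose.pose.position.y = waypoint.translation().y();

  // Yaw from the rotation matrix (assumes the rotation is mostly about z).
  const double yaw = std::atan2(waypoint.rotation()(1, 0),
                                waypoint.rotation()(0, 0));
  goal.target_pose.pose.orientation = tf::createQuaternionMsgFromYaw(yaw);
  return goal;
}
```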
Next, I want to make sure that the Eigen output from the planner is essentially a trajectory of x, y, and a heading for the mobile base. I can also make use of ROS tf to transform to/from the camera frame. At this point I don't particularly care about velocity, acceleration, or any other information, in the hope that move_base will take care of it. For example, can I restrict sampling to 2D so that the trajectory is generated only in 2D? Before I dig too deep into this, I wanted to ask whether it is feasible, and whether you have any tips for continuing down this path.
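For the tf part, this is a minimal sketch of what I was picturing, assuming the planner's poses come out in a camera-related frame and need to be re-expressed in the map frame before being packed into a MoveBaseGoal as above (frame names are placeholders for whatever the TF tree actually contains):

```cpp
#include <geometry_msgs/PoseStamped.h>
#include <tf2_ros/buffer.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.h>

// Sketch: re-express a pose given in the camera frame in the map frame.
geometry_msgs::PoseStamped cameraPoseToMap(const geometry_msgs::PoseStamped& pose_in_camera,
                                           const tf2_ros::Buffer& tf_buffer)
{
  // Uses the transform at the pose's timestamp, waiting up to 0.1 s for it.
  return tf_buffer.transform(pose_in_camera, "map", ros::Duration(0.1));
}
```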
Thank you