```
📦tbai
 ┣ 📂tbai_ros_static  # Static (high-gain PD) controller
 ┣ 📂tbai_ros_mpc     # NMPC controller (both perceptive and blind versions) [1]
 ┣ 📂tbai_ros_bob     # RL walking controller, based on the wild-Anymal paper (both perceptive and blind versions) [2]
 ┣ 📂tbai_ros_dtc     # DTC controller (perceptive) [3]
 ┣ 📂tbai_ros_joe     # Perceptive NMPC controller with NN-based tracking controller [1], [3]
```
- [1] Perceptive Locomotion through Nonlinear Model Predictive Control, https://arxiv.org/abs/2208.08373
- [2] Learning robust perceptive locomotion for quadrupedal robots in the wild, https://arxiv.org/abs/2201.08117
- [3] DTC: Deep Tracking Control, https://arxiv.org/abs/2309.15462
To install tbai_ros, we recommend using pixi, though tbai_ros is a full-fledged ROS package and can be integrated into your projects using conventional tools and methods. We use pixi for reproducibility. Don't worry that ROS is past its end of life: pixi (or micromamba) will install everything for you, even on the newest Ubuntu release 😮
```bash
# Install pixi
curl -fsSL https://pixi.sh/install.sh | sh  # You might have to source your config again

# Install tbai_ros
mkdir -p ros/src && cd ros/src
git clone https://github.com/lnotspotl/tbai_ros.git --recursive && cd tbai_ros
pixi install && pixi shell --environment all-gpu-free
just fresh-install-all-gpu-free
```

Alternatively, with micromamba:

```bash
# Install micromamba
"${SHELL}" <(curl -L micro.mamba.pm/install.sh)  # You might have to source your config again

# Clone tbai_ros
mkdir -p ros/src && cd ros/src
git clone https://github.com/lnotspotl/tbai_ros.git --recursive && cd tbai_ros

# Create conda environment
micromamba env create -f .conda/all-gpu-free.yaml
micromamba activate all-gpu-free

# Install tbai_ros
just fresh-install-all-gpu-free
```

Once the installation is complete, you can run one of our many examples, for instance:
```bash
# Activate pixi environment
pixi shell --environment all-gpu-free

# Run NP3O example
source $(catkin locate)/devel/setup.bash && roslaunch tbai_ros_np3o simple_go2.launch gui:=true

# Try out other examples located under tbai_ros_mpc, tbai_ros_bob, tbai_ros_dtc, tbai_ros_joe and tbai_ros_np3o
```

Check out the tbai_ros_deploy_go2_rl folder for deployment-related documentation, pictures and videos 🤗
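For convenience, the pattern above can be wrapped in a tiny helper that prints the launch command for a chosen controller. This is only a sketch: the `simple_go2.launch` file name is confirmed here only for `tbai_ros_np3o`, so assuming the other packages follow the same naming is a hypothesis — check each package's `launch/` directory before relying on it.

```bash
#!/usr/bin/env bash
# Hypothetical helper: print the roslaunch command for a given tbai_ros controller.
# Only the np3o example is taken verbatim from the README; the launch-file name
# for the other packages is an assumption based on that example.
launch_cmd() {
  case "$1" in
    np3o|mpc|bob|dtc|joe)
      echo "roslaunch tbai_ros_$1 simple_go2.launch gui:=true" ;;
    *)
      echo "unknown controller: $1" >&2
      return 1 ;;
  esac
}

launch_cmd np3o  # → roslaunch tbai_ros_np3o simple_go2.launch gui:=true
```

You could then run the printed command inside the activated pixi environment, e.g. `eval "$(launch_cmd np3o)"`.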
Demo videos: `mpc_perceptive_f.mp4`, `mpc_go2_blind.webm`, `rl_perceptive_fe.mp4`, `rl_blind_fe.mp4`, `dtc_f.mp4`, `joe_f.mp4`
This project stands on the shoulders of giants. None of this would have been possible were it not for many amazing open-source projects. Here are a few from which most inspiration was drawn and that were instrumental during development:
- https://github.com/leggedrobotics/ocs2
- https://github.com/qiayuanl/legged_control
- https://github.com/leggedrobotics/legged_gym
- https://github.com/leggedrobotics/rsl_rl
- https://github.com/ANYbotics/elevation_mapping
- https://github.com/leggedrobotics/elevation_mapping_cupy
- https://github.com/bernhardpg/quadruped_locomotion
- https://github.com/stack-of-tasks/pinocchio
- https://x-io.co.uk/open-source-imu-and-ahrs-algorithms/
- https://github.com/mayataka/robotoc
- https://github.com/mayataka/legged_state_estimator
- https://github.com/RossHartley/invariant-ekf
- https://github.com/dfki-ric-underactuated-lab/dfki-quad
- https://github.com/iit-DLSLab/muse
- https://github.com/zeonsunlightyu/LocomotionWithNP3O
- http://www.michaelsebek.cz/cs
- hundreds of others ...
Thank you all 🤗