This tutorial introduces the OnePose demo running on data captured with our OnePose Cap application for iOS devices. The app is still being prepared for release, but you can try the demo with the sample data and skip the first step.
- Export the collected mapping sequence and the test sequence to the PC.
- Rename the annotate and test sequence directories to `your_obj_name-annotate` and `your_obj_name-test` respectively, and organize the data in the following structure (refer to the sample data as an example; a shell sketch follows this list):
  ```
  |--- /your/path/to/scanned_data
  |    |--- your_obj_name
  |    |    |--- your_obj_name-annotate
  |    |    |--- your_obj_name-test
  ```
- Link the collected data to the project directory:
  ```bash
  REPO_ROOT=/path/to/OnePose
  ln -s /path/to/scanned_data $REPO_ROOT/data/demo
  ```
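If your exported directories carry different names, a minimal sketch like the one below can put them into the expected layout. The source paths are placeholders for whatever your export produced, not names the app guarantees:

```bash
OBJ_NAME=your_obj_name
DATA_ROOT=/your/path/to/scanned_data

# Create the per-object directory and move the exported sequences into it,
# renaming them to the <obj>-annotate / <obj>-test convention expected above.
# The two source paths are hypothetical; substitute your actual export paths.
mkdir -p $DATA_ROOT/$OBJ_NAME
mv /path/to/exported_annotate_sequence $DATA_ROOT/$OBJ_NAME/$OBJ_NAME-annotate
mv /path/to/exported_test_sequence $DATA_ROOT/$OBJ_NAME/$OBJ_NAME-test
```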
Now the data is prepared!
Download the pretrained OnePose model and move it to `${REPO_ROOT}/data/model/checkpoints/onepose/GATsSPG.ckpt`.
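For example, a minimal sketch of the move (the download location `~/Downloads/GATsSPG.ckpt` is an assumption; adjust it to wherever you saved the checkpoint):

```bash
# Create the checkpoint directory and move the downloaded model into place.
# The source path below is a placeholder, not part of the OnePose release.
mkdir -p $REPO_ROOT/data/model/checkpoints/onepose
mv ~/Downloads/GATsSPG.ckpt $REPO_ROOT/data/model/checkpoints/onepose/GATsSPG.ckpt
```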
[Optional] To run OnePose with the tracking module, please install DeepLM. Make sure the sample program in DeepLM executes correctly to confirm a successful installation.
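A hedged sketch of that check, assuming DeepLM is cloned from its public repository (the clone URL and the `example.sh` script name are assumptions taken from the DeepLM project; follow its own README if they differ):

```bash
# Clone DeepLM (URL assumed; see the DeepLM README for authoritative steps).
git clone https://github.com/hjwdzh/DeepLM.git /path/to/DeepLM
cd /path/to/DeepLM
# Run the bundled sample; if it completes without errors, the installation
# (including the compiled extension under build/) is working.
sh example.sh
```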
Execute the following commands, and a demo video named `demo_video.mp4` will be saved in the folder of the test sequence.
```bash
REPO_ROOT=/path/to/OnePose
OBJ_NAME=your_obj_name

cd $REPO_ROOT
conda activate OnePose
bash scripts/demo_pipeline.sh $OBJ_NAME

# [Optional] run OnePose with tracking
export PYTHONPATH=$PYTHONPATH:/path/to/DeepLM/build
export TORCH_USE_RTLD_GLOBAL=YES
bash scripts/demo_pipeline.sh $OBJ_NAME --WITH_TRACKING
```
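To confirm the run succeeded, a quick check like the following lists the rendered video (the path follows the data layout and the symlink set up above):

```bash
# demo_video.mp4 is written into the folder of the test sequence.
ls -lh $REPO_ROOT/data/demo/$OBJ_NAME/$OBJ_NAME-test/demo_video.mp4
```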