
Multi-TBP

Codebase for CHI 2025 Late-Breaking Work Paper:

Exploring the Impact of Drivers' Emotion and Multi-task Learning on Takeover Behavior Prediction in Multimodal Environment

Model Architecture:

(Figure: Multi-TBP model architecture)

The code was refactored to integrate all experiments and baselines; please contact me if you find any bugs. Thanks.

Contents

Data Preparation

  • This study uses the EmoTake dataset to train Multi-TBP; the dataset can be downloaded through this link.
  • The EmoTake dataset should be placed in the "data" folder; the dataset description (README) is also included there.

Environment

The results in the paper were produced with Python 3.8 and PyTorch 1.9.0 on a single NVIDIA RTX 3090. Note that different hardware and software environments can cause the results to fluctuate.
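
A minimal sketch of setting up such an environment is shown below, assuming conda is available and using the CUDA 11.1 build of PyTorch 1.9.0 (the build suited to the RTX 3090); check the repository for the full dependency list, which this sketch does not cover:

    # create and activate a Python 3.8 environment
    conda create -n multi-tbp python=3.8
    conda activate multi-tbp
    # install PyTorch 1.9.0 built against CUDA 11.1
    pip install torch==1.9.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html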

Running

  1. For the dataset or task you want to run, adjust the necessary parameters in "opts.py", such as "--datasetName", "--labelType", "--num_class", or "--data_path" (an example invocation is shown after this list).

  2. After adjusting the parameters, you can run this project with the following command:

    python ./src/train.py
  3. The output results will be saved in the "log" folder.
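
For illustration, a full invocation with explicit parameters might look like the following; apart from "EmoTake", the flag values are placeholders, so consult "opts.py" for the actual defaults and admissible choices:

    # hypothetical example values; see opts.py for the real defaults
    python ./src/train.py --datasetName EmoTake --labelType takeover --num_class 2 --data_path ./data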

Note

  1. Multi-TBP is not tied to a fixed number of modalities, input data dimensions, or fusion data dimensions; these can all be changed uniformly. Therefore, Multi-TBP can be extended to other datasets or applied to new scenarios as needed (see the sketch after this list).

  2. If you want to change the dataset or usage scenario, please update the parameters in "opts.py".

  3. We gratefully acknowledge the open-source projects that helped this work 🎉🎉, including EmoTake, etc. 😄
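
As a sketch of point 1, adapting the project to a new scenario could amount to overriding the dataset-related parameters; the dataset name, path, and class count below are purely hypothetical placeholders, not options shipped with the repository:

    # hypothetical new dataset; update the corresponding parameters in opts.py
    python ./src/train.py --datasetName MyDataset --data_path ./data/my_dataset --num_class 3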

Citation

Paper publication link:

Exploring the Impact of Drivers' Emotion and Multi-task Learning on Takeover Behavior Prediction in Multimodal Environment

Please cite our paper if you find it valuable for your research (humbly begging for citations T^T):

@inproceedings{feng2025exploring,
  title={Exploring the Impact of Drivers' Emotion and Multi-task Learning on Takeover Behavior Prediction in Multimodal Environment},
  author={Feng, Xinyu and Gu, Yu and Lin, Yuming and Cai, Yaojun},
  booktitle={Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems},
  pages={1--8},
  year={2025}
}
