Python 3.8

Code for TriSAT: Trimodal Representation Learning for Multimodal Sentiment Analysis (accepted by IEEE/ACM Transactions on Audio, Speech, and Language Processing).

Usage

Clone the repository

git clone https://github.com/gw-zhong/TriSAT.git

Download the datasets and BERT models

Alternatively, you can download these datasets from:

For convenience, we also provide the pre-trained BERT model that we fine-tuned:

Preparation

Create (empty) folders for data, results, and models:

cd TriSAT
mkdir input results models

and put the downloaded data in `input/`.

Run the code

python main_[DatasetName].py [--FLAGS]

Citation

Please cite our paper if you find it useful for your research:

@article{huan2024trisat,
  title={TriSAT: Trimodal Representation Learning for Multimodal Sentiment Analysis},
  author={Huan, Ruohong and Zhong, Guowei and Chen, Peng and Liang, Ronghua},
  journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
  year={2024},
  publisher={IEEE}
}

Contact

If you have any questions, feel free to contact me at [email protected].
