PyTorch implementation of Multimodal Sentiment Analysis with Preferential Fusion and Distance-aware Contrastive Learning (ICME 2023 Oral).
We provide an Anaconda environment file to help you build a runnable environment.
conda env create -f environments.yml
conda activate prisa
Please download the MOSEI dataset into ./data for training and evaluation.
Please download bert-base-uncased from Hugging Face into ./pretrained_models.
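Before training, it may help to verify that the dataset and pretrained weights landed where the scripts expect them. A minimal sketch of such a check (this helper is ours, not part of the repo, and the exact subdirectory name under ./pretrained_models is an assumption):

```python
from pathlib import Path

# Expected layout from the setup steps above; the bert-base-uncased
# subdirectory name is an assumption about how the weights are stored.
REQUIRED = [
    Path("data"),                                    # MOSEI dataset
    Path("pretrained_models") / "bert-base-uncased", # BERT weights
]

def missing_paths(required=REQUIRED):
    """Return the required paths that do not exist yet."""
    return [str(p) for p in required if not p.exists()]

if __name__ == "__main__":
    missing = missing_paths()
    if missing:
        print("Missing before training:", ", ".join(missing))
    else:
        print("All required data and pretrained models are in place.")
```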
You can train the model with the following command. The output will be saved at /tmp/log/; modify .runx to change the path of the training log.
python -m runx.runx mosei.yml -i
Or you can run the following command directly without runx; the output will be saved at ./log
python main.py
Feel free to contact us ([email protected]) if you have any problems.
@inproceedings{ma2023multimodal,
title={Multimodal Sentiment Analysis with Preferential Fusion and Distance-aware Contrastive Learning},
author={Ma, Feipeng and Zhang, Yueyi and Sun, Xiaoyan},
booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
pages={1367--1372},
year={2023},
organization={IEEE}
}