
GEMEL: Generative Multimodal Entity Linking

✨ Overview

This repository contains the official implementation of our LREC-COLING 2024 paper, Generative Multimodal Entity Linking.

GEMEL is a simple yet effective Generative Multimodal Entity Linking framework based on Large Language Models (LLMs), which directly generates target entity names. We keep the vision and language model frozen and only train a feature mapper to enable cross-modality interactions. Extensive experiments show that, with only ~0.3% of the model parameters fine-tuned, GEMEL achieves state-of-the-art results on two well-established MEL datasets, namely WikiDiverse and WikiMEL. The performance gain stems from mitigating the popularity bias of LLM predictions and disambiguating less common entities effectively. Our framework is compatible with any off-the-shelf language model, paving the way towards an efficient and general solution for utilizing LLMs in the MEL task.
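The feature mapper is the only trained component; the CLIP vision encoder and the LLM stay frozen. As a rough illustration (a minimal sketch, not the exact code in model.py; the dimensions and names are assumptions based on the ViT-L/14 features and the "linear_4token" setting visible in the checkpoint names):

import torch
import torch.nn as nn

class FeatureMapper(nn.Module):
    # Minimal sketch: project frozen CLIP image features into the LLM embedding
    # space as a short sequence of visual prefix tokens. Dimensions are assumptions
    # (CLIP ViT-L/14 -> 1024-d features, OPT-6.7b -> 4096-d embeddings).
    def __init__(self, clip_dim=1024, llm_dim=4096, num_visual_tokens=4):
        super().__init__()
        self.llm_dim = llm_dim
        self.num_visual_tokens = num_visual_tokens
        self.proj = nn.Linear(clip_dim, llm_dim * num_visual_tokens)

    def forward(self, clip_features):  # clip_features: (batch, clip_dim)
        visual_prefix = self.proj(clip_features)
        # Reshape into num_visual_tokens pseudo-token embeddings per image,
        # which are prepended to the text embeddings fed to the frozen LLM.
        return visual_prefix.view(-1, self.num_visual_tokens, self.llm_dim)

During training, only a mapper of this kind would receive gradients (requires_grad stays False for the vision encoder and the LLM), which is what keeps the trainable fraction at roughly 0.3% of the parameters.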

Checkpoints and preprocessed data can be accessed here.

If you have any questions, please feel free to contact me via email at [email protected] or open an issue in the repository.

🔥 News

[23.07.14] We released the code and checkpoints of GEMEL.

[24.03.19] We have updated our paper.

🚀 Architecture

The figure below shows the detailed architecture of GEMEL, together with some experimental analyses.

[Figure: GEMEL architecture]

🚨 Usage

Environment

conda create -n GEMEL python=3.7
conda activate GEMEL
pip install -r requirements.txt

Different CUDA versions require the corresponding PyTorch build; find the appropriate installation command on the PyTorch website. We install PyTorch with the following command:

pip install torch==1.12.0+cu116 torchvision==0.13.0+cu116 torchaudio==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu116
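If PyTorch was installed against the right CUDA version, the GPU should be visible:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"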

Data

We have preprocessed the text, image, and knowledge base data. Download the data from here and move it to the ./data folder. A sketch of how to build and use a prefix tree for constrained decoding follows the file list below.

train.json, dev.json, test.json         ->      textual data files
clip_vit_large_patch14_1024.hdf5        ->      visual data file
prefix_tree_opt.pkl                     ->      prefix tree of entity name
SimCSE_train_mention_embeddings.pkl     ->      training set mention embeddings
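As a minimal sketch of constrained decoding (prefix_tree_opt.pkl is the precomputed tree for the OPT tokenizer; the helper names below are hypothetical, not the ones used in this repository), a prefix tree over tokenized entity names can be plugged into Hugging Face's generate via prefix_allowed_tokens_fn so that the model can only emit valid entity names:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b")

def build_prefix_tree(entity_names):
    # Nested-dict trie over the token-id sequences of all entity names.
    root = {}
    for name in entity_names:
        node = root
        for tok in tokenizer(name, add_special_tokens=False)["input_ids"]:
            node = node.setdefault(tok, {})
        node[tokenizer.eos_token_id] = {}  # mark the end of a valid entity name
    return root

def make_prefix_allowed_tokens_fn(trie, prompt_len):
    # At each decoding step, allow only the children of the trie node reached
    # by the tokens generated so far (everything after the prompt).
    def fn(batch_id, input_ids):
        node = trie
        for tok in input_ids[prompt_len:].tolist():
            node = node.get(tok, {})
        allowed = list(node.keys())
        return allowed if allowed else [tokenizer.eos_token_id]
    return fn

# Usage (model loading omitted):
# fn = make_prefix_allowed_tokens_fn(trie, input_ids.shape[1])
# outputs = model.generate(input_ids, prefix_allowed_tokens_fn=fn)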

Train

Running main.py directly uses the WikiDiverse dataset and the opt-6.7b model by default:

python main.py

The model structure is in model.py, the default parameters are in params.py, and most of the data processing is in utils.py.

You can customize parameter settings; see params.py for details. Here are some examples of how to train GEMEL:

For training with the WikiDiverse dataset:

python main.py --dataset wikidiverse --model_name opt-6.7b --ICL_examples_num 16

For training with the WikiMEL dataset:

python main.py --dataset wikimel --model_name opt-6.7b --ICL_examples_num 16
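--ICL_examples_num sets how many in-context demonstrations are prepended to each prompt; in the released data, SimCSE_train_mention_embeddings.pkl stores the training-set mention embeddings, presumably used to select those demonstrations. A hedged sketch of such similarity-based retrieval (the exact logic lives in utils.py; the names here are assumptions):

import numpy as np

def retrieve_icl_examples(query_emb, train_embs, k=16):
    # Return the indices of the k training mentions whose SimCSE embeddings
    # are most similar (by cosine) to the query mention embedding.
    query = query_emb / np.linalg.norm(query_emb)
    train = train_embs / np.linalg.norm(train_embs, axis=1, keepdims=True)
    scores = train @ query
    return np.argsort(-scores)[:k]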

Test

Download the checkpoint from here and move it to the ./checkpoint folder.

For testing on the WikiDiverse test set:

python infe.py --dataset wikidiverse --model_name opt-6.7b --best_ckpt opt-6.7b_wikidiverse_linear_4token_16examples_82_77.pkl

For testing on the WikiMEL test set:

python infe.py --dataset wikimel --model_name opt-6.7b --best_ckpt opt-6.7b_wikimel_linear_4token_16examples_75_53.pkl

Citation

@article{shi2023generative,
  title={Generative Multimodal Entity Linking},
  author={Shi, Senbao and Xu, Zhenran and Hu, Baotian and Zhang, Min},
  journal={arXiv preprint arXiv:2306.12725},
  year={2023}
}
