Efficient LoFTR: Semi-Dense Local Feature Matching with Sparse-Like Speed
Yifan Wang*, Xingyi He*, Sida Peng, Dongli Tan, Xiaowei Zhou
CVPR 2024
Real-time demo video: realtime_demo.mp4
- Inference code and pretrained models
- Code for reproducing the test-set results
- Options for flash-attention and torch.compile for better performance
- Jupyter notebook demo for matching a pair of images
- Training code
conda env create -f environment.yaml
conda activate eloftr
pip install torch==2.0.0+cu118 --index-url https://download.pytorch.org/whl/cu118
pip install -r requirements.txt
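After installation, a quick sanity check can confirm that the pinned PyTorch build sees your CUDA device. This is a generic PyTorch snippet, not specific to this repo:

```python
# Quick environment sanity check (plain PyTorch; nothing here is EfficientLoFTR-specific).
import torch

print("torch:", torch.__version__)             # expected: 2.0.0+cu118 with the pinned install
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```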
The test and training datasets can be downloaded via the download links provided by LoFTR.
We provide our pretrained model at the download link.
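As a preview of the notebook demo, the sketch below loads the pretrained weights and matches one image pair. The module path, config name, checkpoint filename, and output keys are assumptions modeled on the original LoFTR interface; check the repo's demo notebook for the exact API.

```python
# Minimal matching sketch. Imports, config names, the checkpoint path, and the
# output keys are assumptions based on the original LoFTR interface; adapt as needed.
import cv2
import torch

from src.loftr import LoFTR, full_default_cfg  # assumed module layout

matcher = LoFTR(config=full_default_cfg)
matcher.load_state_dict(torch.load("weights/eloftr_outdoor.ckpt")["state_dict"])  # assumed filename
matcher = matcher.eval().cuda()
# matcher = torch.compile(matcher)  # optional: PyTorch 2.0 compilation for extra speed

def load_gray(path):
    """Load an image as a normalized grayscale tensor of shape (1, 1, H, W)."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return torch.from_numpy(img)[None, None].float().cuda() / 255.0

batch = {"image0": load_gray("img0.png"), "image1": load_gray("img1.png")}
with torch.no_grad():
    matcher(batch)  # LoFTR-style matchers write their results back into the batch dict

mkpts0 = batch["mkpts0_f"].cpu().numpy()  # matched keypoints in image0 (assumed key)
mkpts1 = batch["mkpts1_f"].cpu().numpy()  # matched keypoints in image1 (assumed key)
print(f"{len(mkpts0)} matches")
```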
You need to set up the testing subsets of ScanNet and MegaDepth first. Create symlinks from the previously downloaded datasets to data/{{dataset}}/test.
# set up symlinks
ln -s /path/to/scannet-1500-testset/* /path/to/EfficientLoFTR/data/scannet/test
ln -s /path/to/megadepth-1500-testset/* /path/to/EfficientLoFTR/data/megadepth/test
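A small check (a hedged sketch; adjust the paths if your layout differs) that the symlinked test sets resolve to non-empty directories where the evaluation scripts expect them:

```python
# Verify that the symlinked test sets exist and are non-empty.
from pathlib import Path

for d in ["data/scannet/test", "data/megadepth/test"]:
    p = Path(d)
    ok = p.exists() and any(p.iterdir())
    print(f"{d}: {'ok' if ok else 'missing or empty'}")
```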
conda activate eloftr
# inference-time evaluation on ScanNet (indoor): full and optimized model variants
bash scripts/reproduce_test/indoor_full_time.sh
bash scripts/reproduce_test/indoor_opt_time.sh
conda activate eloftr
# pose-estimation AUC on MegaDepth (outdoor): full and optimized model variants
bash scripts/reproduce_test/outdoor_full_auc.sh
bash scripts/reproduce_test/outdoor_opt_auc.sh
# pose-estimation AUC on ScanNet (indoor): full and optimized model variants
bash scripts/reproduce_test/indoor_full_auc.sh
bash scripts/reproduce_test/indoor_opt_auc.sh
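The *_auc.sh scripts report pose-estimation AUC at the 5/10/20 degree thresholds. For reference, here is a sketch of how that metric is commonly computed from per-pair pose errors; it mirrors the standard LoFTR/SuperGlue-style evaluation, not necessarily the exact code inside these scripts:

```python
# Pose AUC sketch: area under the recall-vs-error curve up to each threshold,
# following the common LoFTR/SuperGlue-style evaluation protocol.
import numpy as np

def pose_auc(errors, thresholds=(5, 10, 20)):
    errors = np.sort(np.asarray(errors, dtype=float))
    recall = (np.arange(len(errors)) + 1) / len(errors)
    errors = np.concatenate(([0.0], errors))
    recall = np.concatenate(([0.0], recall))
    aucs = []
    for t in thresholds:
        idx = np.searchsorted(errors, t)
        e = np.concatenate((errors[:idx], [t]))
        r = np.concatenate((recall[:idx], [recall[idx - 1] if idx > 0 else 0.0]))
        aucs.append(np.trapz(r, x=e) / t)
    return aucs

# Example: per-pair pose errors in degrees (max of rotation and translation error).
print(pose_auc([1.2, 3.5, 7.0, 15.0, 30.0]))
```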
The training code is coming soon. Please stay tuned!
If you find this code useful for your research, please use the following BibTeX entry.
@inproceedings{wang2024eloftr,
title={{Efficient LoFTR}: Semi-Dense Local Feature Matching with Sparse-Like Speed},
author={Wang, Yifan and He, Xingyi and Peng, Sida and Tan, Dongli and Zhou, Xiaowei},
booktitle={CVPR},
year={2024}
}