Yuzhang Shang, Dan Xu, Bin Duan, Ziliang Zong, Liqiang Nie, and Yan Yan
This repository contains the code for Lipschitz Continuity Retained Binary Neural Network (LCR-BNN), accepted to ECCV 2022.
Figure (a): An overview of our Lipschitz regularization for a binary convolutional layer. The goal of our work is to regularize the BNN by aligning the Lipschitz constants of the binary network and its latent full-precision counterpart, using the input and output activations of the binary layer.
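To give a rough feel for the idea, below is a minimal PyTorch-style sketch, not the implementation used in this repo: it assumes a weight-based spectral-norm estimate obtained by power iteration and a plain sign binarizer with a straight-through estimator, and all function names (spectral_norm_estimate, lipschitz_alignment_penalty) are illustrative. The actual method in the paper builds its Lipschitz estimate from the layer's activations.

import torch
import torch.nn.functional as F

def spectral_norm_estimate(weight, n_iters=5):
    # Flatten a conv kernel (out_ch, in_ch, kH, kW) into a 2-D matrix and
    # approximate its largest singular value by power iteration.
    w = weight.reshape(weight.shape[0], -1)
    v = torch.randn(w.shape[1], device=w.device)
    for _ in range(n_iters):
        u = F.normalize(w @ v, dim=0)
        v = F.normalize(w.t() @ u, dim=0)
    return torch.dot(u, w @ v)

def lipschitz_alignment_penalty(fp_weight, alpha=8.0):
    # Hypothetical regularizer: binarize with sign() in the forward pass and a
    # straight-through estimator in the backward pass, so gradients reach the
    # latent full-precision weights.
    bin_weight = fp_weight + (torch.sign(fp_weight) - fp_weight).detach()
    sigma_bin = spectral_norm_estimate(bin_weight)
    sigma_fp = spectral_norm_estimate(fp_weight.detach())  # full-precision reference
    # Penalize the gap between the two Lipschitz estimates, weighted by alpha.
    return alpha * (sigma_bin - sigma_fp) ** 2

In such a sketch, the penalty would simply be added to the task loss during training, with alpha controlling its strength (loosely mirroring the --alpha flag used below).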
First, clone our repo:
git clone https://github.com/42Shawn/LCR_BNN.git
cd LCR_BNN/imagenet
Then, run the training script:
python main.py --save='v0' --data_path='path-to-imagenet' --gpus='gpu-id' --batch_size=128 --alpha=8
Note that alpha can be changed to conduct ablation studies; setting alpha=0 is equivalent to training IR-Net itself.
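For example, an ablation run with the regularizer disabled (reducing to the IR-Net baseline) could look like the following, where the save name is just a hypothetical label:
python main.py --save='v0_alpha0' --data_path='path-to-imagenet' --gpus='gpu-id' --batch_size=128 --alpha=0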
If you find our code useful for your research, please cite our paper.
@inproceedings{shang2022lcr,
title={Lipschitz Continuity Retained Binary Neural Network},
author={Yuzhang Shang and Dan Xu and Bin Duan and Ziliang Zong and Liqiang Nie and Yan Yan},
booktitle={ECCV},
year={2022}
}
Related Work
Our repo builds on the PyTorch implementations of Forward and Backward Information Retention for Accurate Binary Neural Networks (IR-Net, CVPR 2020) and Rotated Binary Neural Network (RBNN, NeurIPS 2020). Thanks to the authors for releasing their codebases!