Code for the paper "Translation as Cross-Domain Knowledge: Attention Augmentation for Unsupervised Cross-Domain Segmenting and Labeling Tasks" (Findings of EMNLP 2021).
To train masked Attention-Augmentation with the default settings, run:
python3 main.py --raw
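
The --raw flag is handled inside main.py. As a minimal, hypothetical sketch only (the actual argument parsing, defaults, and training code in this repository may differ), the entry point could be wired up along these lines:

    # Hypothetical sketch: the real main.py defines its own arguments and
    # training loop; this only illustrates the shape of a --raw entry point.
    import argparse

    def train(raw: bool) -> None:
        # Placeholder for the masked Attention-Augmentation training routine.
        mode = "--raw" if raw else "default preprocessing"
        print(f"Starting training with default settings ({mode}).")

    def main() -> None:
        parser = argparse.ArgumentParser(
            description="Masked Attention-Augmentation training (sketch)")
        parser.add_argument("--raw", action="store_true",
                            help="flag used by the default training command in this README")
        args = parser.parse_args()
        train(raw=args.raw)

    if __name__ == "__main__":
        main()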