GitHub page for the paper CoMave: Contrastive Pre-training with Multi-scale Masking for Attribute Value Extraction, accepted to Findings of ACL 2023.
We openly release CoMave in both a Large-Chinese version and a Large-English version, continually pre-trained from RoBERTa on tens of millions of examples.
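A minimal loading sketch, assuming the released checkpoints follow the standard Hugging Face RoBERTa format; the checkpoint name `CoMave-large-en` below is a placeholder, not the actual release path:

```python
# Sketch: load a CoMave checkpoint, assuming it is distributed in the
# standard Hugging Face RoBERTa format.
from transformers import AutoTokenizer, AutoModel

checkpoint = "CoMave-large-en"  # hypothetical name; replace with the released path
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

# Encode a product title together with an attribute query for value extraction.
inputs = tokenizer("Apple iPhone 14 Pro 256GB", "storage capacity", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```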
This paper conducts experiments on four benchmarks: INS, MEPAVE, AE-Pub, and MAE.
- You can download AE-Pub and MAE here.
- MEPAVE was built by the team of the JD platform; please contact its authors to obtain the dataset.
- INS is collected from real business product data on the Alipay platform. For the sake of data privacy, please contact us by email to obtain the dataset.