Paper | 19 October 2022
A survey of contrastive learning in NLP
Haolin Sun, Jie Liu, Jing Zhang
Proceedings Volume 12294, 7th International Symposium on Advances in Electrical, Electronics, and Computer Engineering; 122944C (2022) https://doi.org/10.1117/12.2639685
Event: 7th International Symposium on Advances in Electrical, Electronics and Computer Engineering (ISAEECE 2022), 2022, Xishuangbanna, China
Abstract
Contrastive learning (CL) builds example pairs and computes a loss that makes models more robust by reducing the distance between positive samples and amplifying the distance between negative samples. Recently, CL has attracted interest in natural language processing (NLP), where it has performed well, mainly in sentence embedding and text classification tasks. To our knowledge, no prior study has reviewed the application of CL in NLP. In this paper, we describe two types of contexts in which CL is applied and present the methods used to compute different losses. We then introduce several significant classic models. Finally, we discuss the current challenges and possible future directions of CL.
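To make the loss behavior the abstract describes concrete, below is a minimal sketch of one common contrastive objective, an InfoNCE-style loss with in-batch negatives, written in PyTorch. This is an illustration of the general technique only; the function name, temperature value, and embedding dimensions are assumptions for the example and are not taken from the paper, which surveys many loss variants.

    # Minimal sketch of an InfoNCE-style contrastive loss (illustrative only;
    # names and hyperparameters are assumptions, not from the surveyed paper).
    import torch
    import torch.nn.functional as F

    def info_nce_loss(anchors: torch.Tensor, positives: torch.Tensor,
                      temperature: float = 0.07) -> torch.Tensor:
        """anchors, positives: (batch, dim) embeddings. Row i of `positives`
        is the positive pair for row i of `anchors`; every other row in the
        batch serves as a negative (in-batch negatives)."""
        a = F.normalize(anchors, dim=-1)   # unit vectors -> cosine similarity
        p = F.normalize(positives, dim=-1)
        logits = a @ p.T / temperature     # (batch, batch) similarity matrix
        labels = torch.arange(a.size(0), device=a.device)  # diagonal = positives
        # Cross-entropy over the similarity rows pulls each positive pair
        # together and pushes the negatives apart, matching the behavior
        # described in the abstract.
        return F.cross_entropy(logits, labels)

    # Usage example: 8 pairs of 128-dimensional sentence embeddings
    loss = info_nce_loss(torch.randn(8, 128), torch.randn(8, 128))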
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Haolin Sun, Jie Liu, and Jing Zhang "A survey of contrastive learning in NLP", Proc. SPIE 12294, 7th International Symposium on Advances in Electrical, Electronics, and Computer Engineering, 122944C (19 October 2022); https://doi.org/10.1117/12.2639685
KEYWORDS: Data modeling, Statistical modeling, Machine learning, Computer programming, Statistical analysis, Data processing