Sibylvariant Transformations for Robust Text Classification. (arXiv:2205.05137v1 [cs.CL])
Web: http://arxiv.org/abs/2205.05137
May 12, 2022, 1:10 a.m. | Fabrice Harel-Canada, Muhammad Ali Gulzar, Nanyun Peng, Miryung Kim
cs.CL updates on arXiv.org
The vast majority of text transformation techniques in NLP are inherently
limited in their ability to expand input space coverage due to an implicit
constraint to preserve the original class label. In this work, we propose the
notion of sibylvariance (SIB) to describe the broader set of transforms that
relax the label-preserving constraint, knowably vary the expected class, and
lead to significantly more diverse input distributions. We offer a unified
framework to organize all data transformations, including two types of …
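To make the label-relaxing idea concrete, below is a minimal, hypothetical sketch of a mixture-style sibylvariant transform: it blends two labeled texts and emits a soft label proportional to each source's contribution, so the expected class varies in a known way instead of being preserved. This is an illustration in the spirit of the abstract, not the authors' implementation; the function name `mix_texts`, the mixing ratio, and the soft-label format are assumptions.

```python
# Illustrative sketch only (not the paper's code): a label-varying text
# mixture in the spirit of sibylvariance. Two (text, label) examples are
# blended, and the output label is a known mixture of the source labels.
import random
from typing import Dict, Optional, Tuple

def mix_texts(a: Tuple[str, str], b: Tuple[str, str],
              lam: Optional[float] = None,
              rng: Optional[random.Random] = None) -> Tuple[str, Dict[str, float]]:
    """Blend two (text, label) pairs into one example with a soft label."""
    rng = rng or random.Random()
    lam = lam if lam is not None else rng.uniform(0.2, 0.8)  # mixing ratio

    words_a, words_b = a[0].split(), b[0].split()
    n_a = max(1, round(lam * len(words_a)))        # words taken from example a
    n_b = max(1, round((1 - lam) * len(words_b)))  # words taken from example b

    mixed_text = " ".join(words_a[:n_a] + words_b[:n_b])
    total = n_a + n_b
    # The new class distribution is known by construction, not preserved.
    soft_label = {a[1]: n_a / total, b[1]: n_b / total}
    return mixed_text, soft_label

if __name__ == "__main__":
    ex_pos = ("the plot was gripping and the acting superb", "positive")
    ex_neg = ("a dull lifeless film with no redeeming qualities", "negative")
    text, label = mix_texts(ex_pos, ex_neg, lam=0.5, rng=random.Random(0))
    print(text)
    print(label)  # soft label reflecting each source's word contribution
```

A trained classifier can consume the soft label directly (e.g., with a KL or soft cross-entropy loss), which is one way such label-varying transforms can expand input coverage beyond label-preserving augmentation.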