Web: http://arxiv.org/abs/2205.02979

May 9, 2022, 1:10 a.m. | Arijit Sehanobish, McCullen Sandora, Nabila Abraham, Jayashri Pawar, Danielle Torres, Anasuya Das, Murray Becker, Richard Herzog, Benjamin Odry, Ron V

cs.CL updates on arXiv.org

Pretrained Transformer-based models fine-tuned on domain-specific corpora have
changed the landscape of NLP. However, training or fine-tuning these models for
individual tasks can be time-consuming and resource-intensive. Thus, much
current research focuses on using transformers for multi-task learning
(Raffel et al., 2020) and on how to group tasks so that a multi-task model
learns effective representations that can be shared across tasks (Standley et
al., 2020; Fifty et al., 2021). In this work, …
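The shared-representation idea the abstract refers to can be sketched in a few lines: a single shared encoder (standing in for the pretrained transformer body) feeds lightweight task-specific heads, so most parameters are reused across tasks. This is a minimal illustrative sketch, not the paper's actual architecture; all dimensions and names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
d_model, d_task_a, d_task_b = 8, 3, 2

# Shared encoder weights: reused by every task, analogous to a
# pretrained transformer body in multi-task fine-tuning.
W_shared = rng.normal(size=(d_model, d_model))

# One small head per task; only these parameters are task-specific.
W_head_a = rng.normal(size=(d_model, d_task_a))
W_head_b = rng.normal(size=(d_model, d_task_b))

def encode(x):
    # Shared representation consumed by all task heads.
    return np.tanh(x @ W_shared)

def predict(x, task):
    h = encode(x)
    return h @ (W_head_a if task == "a" else W_head_b)

x = rng.normal(size=(4, d_model))  # a batch of 4 input vectors
out_a = predict(x, "a")            # shape (4, 3)
out_b = predict(x, "b")            # shape (4, 2)
```

Because both heads read the same `encode(x)`, gradients from every task update the shared weights, which is why task grouping matters: tasks that conflict can degrade the shared representation.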

