Web: http://arxiv.org/abs/2209.10106

Sept. 22, 2022, 1:11 a.m. | Adebayo Oshingbesan, Courage Ekoh, Germann Atakpa, Yonah Byaruagaba

cs.LG updates on arXiv.org

Text-to-text transformers have shown remarkable success in multi-task transfer learning, especially in natural language processing (NLP). However, while there have been several attempts to train transformers on different domains, there is usually a clear relationship between these domains, e.g., code summarization, where the natural language summary describes the code. There have been very few attempts to study how multi-task transfer learning works on tasks in significantly different domains. In this project, we investigated the behavior of multi-domain, …
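To make the setup concrete, below is a minimal sketch of the text-to-text multi-task recipe the abstract refers to, assuming a T5-style model from the Hugging Face transformers library. The task prefixes, example pairs, and hyperparameters are illustrative placeholders, not taken from the paper.

```python
# Minimal sketch: multi-task fine-tuning of a text-to-text transformer.
# Every task is cast as string-to-string by prepending a task prefix,
# so one model can be trained on several (possibly unrelated) domains.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Hypothetical mixed-task batch; real training would interleave
# examples sampled from each domain's dataset.
examples = [
    ("summarize: The quick brown fox jumps over the lazy dog.",
     "A fox jumps over a dog."),
    ("translate English to German: Hello, world.",
     "Hallo, Welt."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
model.train()
for source, target in examples:
    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    # Standard seq2seq cross-entropy loss, shared across all tasks.
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The key design choice this illustrates is that the model architecture and loss are identical for every task; only the input prefix distinguishes domains, which is what makes transfer across very different domains possible to study.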

Tags: arxiv, multi-task learning, text transfer, transformers
