Domain-aware Self-supervised Pre-training for Label-Efficient Meme Analysis. (arXiv:2209.14667v1 [cs.CL])
Sept. 30, 2022, 1:16 a.m. | Shivam Sharma, Mohd Khizir Siddiqui, Md. Shad Akhtar, Tanmoy Chakraborty
cs.CL updates on arXiv.org
Existing self-supervised learning strategies are constrained to either a limited set of objectives or to generic downstream tasks that predominantly target uni-modal applications. This has isolated progress for imperative multi-modal applications that are diverse in terms of complexity and domain-affinity, such as meme analysis. Here, we introduce two self-supervised pre-training methods, namely Ext-PIE-Net and MM-SimCLR, that (i) employ off-the-shelf multi-modal hate-speech data during pre-training and (ii) perform self-supervised learning by incorporating multiple specialized pretext tasks, effectively catering to the required complex …
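For readers unfamiliar with the SimCLR-style objective the paper builds on, below is a minimal sketch of a multi-modal contrastive (NT-Xent) loss that pairs image and text embeddings across a batch. The temperature, embedding size, and projection heads are illustrative assumptions, not the authors' actual MM-SimCLR configuration.

# A minimal sketch of a SimCLR-style multi-modal contrastive objective:
# each image embedding is contrasted against its paired text embedding
# within a batch. Dimensions and temperature are assumptions for illustration.
import torch
import torch.nn.functional as F

def nt_xent_multimodal(img_emb: torch.Tensor, txt_emb: torch.Tensor,
                       temperature: float = 0.1) -> torch.Tensor:
    """Symmetric contrastive loss over paired image/text embeddings."""
    img = F.normalize(img_emb, dim=-1)           # (B, D)
    txt = F.normalize(txt_emb, dim=-1)           # (B, D)
    logits = img @ txt.t() / temperature         # (B, B) cosine-similarity matrix
    targets = torch.arange(img.size(0))          # matched pairs lie on the diagonal
    # Cross-entropy in both directions: image-to-text and text-to-image.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_i2t + loss_t2i)

# Usage with placeholder embeddings from hypothetical image/text encoders:
batch_img = torch.randn(16, 256)   # e.g. outputs of a vision projection head
batch_txt = torch.randn(16, 256)   # e.g. outputs of a text projection head
print(nt_xent_multimodal(batch_img, batch_txt).item())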