Aug. 10, 2022, 1:11 a.m. | Mughilan Muthupari, Samrat Halder, Asad Sayeed, Yuval Marton

cs.CL updates on arXiv.org arxiv.org

Observing that for certain NLP tasks, such as semantic role prediction or
thematic fit estimation, random embeddings perform as well as pretrained
embeddings, we explore what settings allow for this and examine where most of
the learning is encoded: the word embeddings, the semantic role embeddings, or
``the network''. We find nuanced answers, depending on the task and its
relation to the training objective. We examine these representation learning
aspects in multi-task learning, where role prediction and role-filling are
supervised …
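The comparison at the heart of the abstract, random versus pretrained word embeddings feeding the same downstream network, can be sketched minimally. The setup below is a hypothetical illustration, not the paper's actual architecture: both embedding tables are frozen lookup matrices, and in a real experiment the "pretrained" table would be loaded from a resource such as GloVe rather than sampled.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 1000, 50

# Condition 1: random embeddings, sampled once and kept fixed.
random_emb = rng.normal(0.0, 1.0 / np.sqrt(dim), size=(vocab_size, dim))

# Condition 2: stand-in for pretrained embeddings (e.g. GloVe).
# Here sampled for self-containment; in practice, loaded from disk.
pretrained_emb = rng.normal(0.0, 1.0 / np.sqrt(dim), size=(vocab_size, dim))

def encode(token_ids, table):
    """Mean-pool the (frozen) embedding rows for a token sequence.

    Any task performance difference between the two conditions must
    then come from the trainable network on top of this encoding.
    """
    return table[token_ids].mean(axis=0)

ids = np.array([3, 17, 42])
v_rand = encode(ids, random_emb)
v_pre = encode(ids, pretrained_emb)
assert v_rand.shape == (dim,) == v_pre.shape
```

If a classifier trained on `v_rand` matches one trained on `v_pre`, that is evidence the learning is encoded in the network rather than in the word embeddings, which is the question the abstract probes.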

