Where's the Learning in Representation Learning for Compositional Semantics and the Case of Thematic Fit. (arXiv:2208.04749v1 [cs.CL])
Aug. 10, 2022, 1:11 a.m. | Mughilan Muthupari, Samrat Halder, Asad Sayeed, Yuval Marton
cs.CL updates on arXiv.org arxiv.org
Observing that for certain NLP tasks, such as semantic role prediction or
thematic fit estimation, random embeddings perform as well as pretrained
embeddings, we explore what settings allow for this and examine where most of
the learning is encoded: the word embeddings, the semantic role embeddings, or
"the network". We find nuanced answers, depending on the task and its
relation to the training objective. We examine these representation learning
aspects in multi-task learning, where role prediction and role-filling are
supervised …
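The setup the abstract compares can be illustrated with a minimal sketch: word vectors are drawn once at random and then frozen, so any task performance must come from the layers trained on top of them rather than from the embeddings themselves. All names, the toy vocabulary, and the dimensions below are illustrative assumptions, not code or data from the paper.

```python
import random

random.seed(0)

VOCAB = ["eat", "pizza", "knife", "cut"]  # toy vocabulary (illustrative)
DIM = 8                                   # embedding dimension (illustrative)

def random_embeddings(vocab, dim):
    """Fixed random word vectors, one per vocabulary item.
    These are never updated during training; any task signal
    must be learned by the network layers above them."""
    return {w: [random.gauss(0.0, dim ** -0.5) for _ in range(dim)]
            for w in vocab}

def embed(tokens, table):
    """Look up the (frozen) vector for each token in a sequence."""
    return [table[t] for t in tokens]

table = random_embeddings(VOCAB, DIM)
vecs = embed(["eat", "pizza"], table)
print(len(vecs), len(vecs[0]))  # 2 8
```

In a pretrained-embedding condition, the lookup table would instead be loaded from learned vectors (e.g. word2vec or BERT outputs); the paper's observation is that for tasks like thematic fit estimation, swapping in the random table above can perform comparably.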