Web: http://arxiv.org/abs/2205.02090

May 5, 2022, 1:11 a.m. | Yifei Zhou, Yansong Feng

cs.CL updates on arXiv.org arxiv.org

Recent work shows that discourse analysis benefits from modeling the intra- and inter-sentential levels separately, where suitable representations for text units of different granularities are needed to capture both the meaning of the units and their relations to the context. In this paper, we propose to take advantage of transformers to encode contextualized representations of units at different levels, dynamically capturing the information required for discourse dependency analysis at both the intra- and inter-sentential levels. Motivated by the observation of writing patterns …
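The abstract does not spell out the scoring model, but discourse dependency analysis of the kind described typically scores head–dependent arcs between encoded text units. The sketch below is a hypothetical, minimal illustration of that pattern: random vectors stand in for the transformer-derived unit representations, and a biaffine scorer (a common choice in dependency parsing, not necessarily the paper's) picks a head for each unit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for contextualized unit representations: in the setting the
# abstract describes, these would come from a transformer encoder; here we
# use random vectors purely for illustration (assumption).
num_units, dim = 4, 8
H = rng.normal(size=(num_units, dim))  # one vector per text unit (EDU / sentence)

# Biaffine arc scoring (hypothetical choice): score(i -> j) = h_i^T W h_j,
# interpreted as how well unit i serves as the head of unit j.
W = rng.normal(size=(dim, dim))
scores = H @ W @ H.T              # (num_units, num_units) head-dependent scores
np.fill_diagonal(scores, -np.inf)  # forbid self-attachment

# Greedy decoding: each dependent attaches to its highest-scoring head.
heads = scores.argmax(axis=0)      # heads[j] = predicted head index of unit j
print(heads)
```

In practice the greedy step would be replaced by a tree-constrained decoder (e.g. maximum spanning tree) so the predicted arcs form a valid discourse dependency tree.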

