Rethinking and Improving Natural Language Generation with Layer-Wise Multi-View Decoding. (arXiv:2005.08081v7 [cs.CL] UPDATED)
cs.LG updates on arXiv.org
In sequence-to-sequence learning, e.g., natural language generation, the decoder relies on the attention mechanism to extract information from the encoder efficiently. While it is common practice to draw information from only the last encoder layer, recent work has proposed using representations from different encoder layers to capture diverse levels of information. Nonetheless, the decoder still obtains only a single view of the source sequences, which might lead to insufficient training of the encoder layer stack due to the hierarchy bypassing …
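
For intuition, here is a minimal PyTorch sketch of the multi-view idea the abstract describes: a decoder cross-attention block that attends to two encoder layers instead of only the last one. The module name, the choice of views (last plus first encoder layer), and the linear fusion are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class MultiViewCrossAttention(nn.Module):
    """Hypothetical sketch: decoder cross-attention over two encoder
    views, the last encoder layer (primary) plus an earlier layer
    (auxiliary), rather than the last layer alone."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn_last = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_aux = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Simple learned fusion of the two views; one of many possible choices.
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, tgt, enc_last, enc_aux):
        # Primary view: attend to the final encoder layer, as in a
        # standard Transformer decoder.
        primary, _ = self.attn_last(tgt, enc_last, enc_last)
        # Auxiliary view: attend to an intermediate encoder layer,
        # exposing lower-level source information to the decoder.
        auxiliary, _ = self.attn_aux(tgt, enc_aux, enc_aux)
        # Combine both views into a single decoder representation.
        return self.fuse(torch.cat([primary, auxiliary], dim=-1))

# Usage: fuse views from the last and the first encoder layers.
d_model, n_heads = 512, 8
mva = MultiViewCrossAttention(d_model, n_heads)
tgt = torch.randn(2, 7, d_model)                               # decoder states
enc_layers = [torch.randn(2, 11, d_model) for _ in range(6)]   # per-layer encoder outputs
out = mva(tgt, enc_layers[-1], enc_layers[0])
print(out.shape)  # torch.Size([2, 7, 512])
```

Because every encoder layer now receives gradient signal through its own attention path, this kind of design addresses the training-signal concern the abstract raises about drawing from the last layer only.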