Long Range Language Modeling via Gated State Spaces. (arXiv:2206.13947v1 [cs.LG])
June 29, 2022, 1:12 a.m. | Harsh Mehta, Ankit Gupta, Ashok Cutkosky, Behnam Neyshabur
cs.CL updates on arXiv.org arxiv.org
State space models have been shown to be effective at modeling long-range
dependencies, especially on sequence classification tasks. In this work we focus
on autoregressive sequence modeling over English books, GitHub source code and
arXiv mathematics articles. Based on recent developments around the
effectiveness of gated activation functions, we propose a new layer named Gated
State Space (GSS) and show that it trains significantly faster than the
diagonal version of S4 (i.e. DSS) on TPUs, is fairly competitive with several …
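The abstract describes GSS only at a high level, as gating applied around a state-space-style sequence branch. As a rough, hypothetical illustration (not the paper's actual parameterization: the weight matrices, GELU gate, and the causal long-convolution stand-in for the SSM kernel are all assumptions here), multiplicative gating over a sequence branch might look like:

```python
import numpy as np

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def gated_layer(x, W_gate, W_in, W_out, kernel):
    """Hypothetical gated sequence layer: a gating branch multiplicatively
    modulates the output of a long-convolution (state-space-style) branch.

    x:      (seq_len, d_model) input sequence
    kernel: (seq_len,) per-position filter standing in for the SSM kernel
    """
    gate = gelu(x @ W_gate)          # gating branch
    u = x @ W_in                     # main branch
    # Causal convolution: each position mixes only current and past inputs,
    # a crude stand-in for the state-space recurrence.
    L = u.shape[0]
    y = np.zeros_like(u)
    for t in range(L):
        for k in range(t + 1):
            y[t] += kernel[k] * u[t - k]
    return (gate * y) @ W_out        # elementwise gating, then output projection
```

This preserves autoregressive structure (position `t` never sees inputs beyond `t`), which is what makes such a layer usable for the language-modeling setting the paper studies.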