Web: http://arxiv.org/abs/2201.11624

Jan. 28, 2022, 2:11 a.m. | Nelly Elsayed, Zag ElSayed, Anthony S. Maida

cs.LG updates on arXiv.org

Long short-term memory (LSTM) is a robust recurrent neural network
architecture for learning spatiotemporal sequential data. However, it requires
significant computational power for training and implementation, from both
software and hardware perspectives. This paper proposes a novel LiteLSTM
architecture that reduces the LSTM's computational components through a
weight-sharing concept, lowering the overall architecture cost while
maintaining performance. The proposed LiteLSTM can be significant for
learning big data where time consumption is crucial, such as …
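To illustrate the weight-sharing idea in general terms, the sketch below compares parameter counts of a standard LSTM cell (separate input and recurrent matrices per gate) against a hypothetical shared-projection variant in which all four gates reuse one pair of matrices and keep only per-gate biases. This is an illustrative assumption about the approach, not the paper's exact LiteLSTM formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
input_size, hidden_size = 8, 16

# Standard LSTM: each of the four gates (input, forget, output, candidate)
# has its own input matrix W, recurrent matrix U, and bias.
standard_params = 4 * (hidden_size * input_size      # W per gate
                       + hidden_size * hidden_size   # U per gate
                       + hidden_size)                # bias per gate

# Shared-projection sketch (hypothetical): one W and one U serve all
# four gates; only the biases remain gate-specific.
W = rng.standard_normal((hidden_size, input_size)) * 0.1
U = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b = rng.standard_normal((4, hidden_size)) * 0.1
shared_params = W.size + U.size + b.size

def shared_lstm_step(x, h, c):
    """One recurrent step using a single shared projection for all gates."""
    z = W @ x + U @ h              # shared projection, computed once
    i = sigmoid(z + b[0])          # input gate
    f = sigmoid(z + b[1])          # forget gate
    o = sigmoid(z + b[2])          # output gate
    g = np.tanh(z + b[3])          # candidate cell state
    c_new = f * c + i * g          # standard LSTM cell update
    h_new = o * np.tanh(c_new)
    return h_new, c_new

h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for _ in range(5):
    h, c = shared_lstm_step(rng.standard_normal(input_size), h, c)

print(standard_params, shared_params)  # the shared variant is ~4x smaller
```

Under this sharing scheme the per-step matrix multiplications drop from eight to two, which is the kind of computational saving the abstract describes; the trade-off is reduced per-gate expressiveness, which the paper's performance claims address.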
