July 11, 2022, 1:11 a.m. | Ali Abbasi, Parsa Nooralinejad, Vladimir Braverman, Hamed Pirsiavash, Soheil Kolouri

cs.LG updates on arXiv.org

Continual/lifelong learning from a non-stationary input data stream is a cornerstone of intelligence. Despite their phenomenal performance in a wide variety of applications, deep neural networks are prone to forgetting previously learned information when they are trained on new tasks. This phenomenon is called "catastrophic forgetting" and is deeply rooted in the stability-plasticity dilemma. Overcoming catastrophic forgetting in deep neural networks has become an active field of research in recent years. In particular, gradient projection-based methods have recently shown exceptional performance at …

arxiv continual dropout learning lg null space sparsity
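The truncated abstract points to gradient projection methods, which constrain updates for a new task to directions that interfere as little as possible with what the network already encodes. As a rough illustration only (this is not the paper's own method, and the function and parameter names below are hypothetical), the following sketch projects a linear layer's gradient onto the approximate null space of activations stored from earlier tasks:

import torch

def project_to_null_space(grad, past_activations, rank_threshold=0.99):
    # grad:             (out_dim, in_dim) gradient of a linear layer's weight
    # past_activations: (num_samples, in_dim) layer inputs gathered on earlier tasks
    # The leading right-singular vectors of the stored activations span the
    # input subspace that previous tasks actually use.
    _, s, vT = torch.linalg.svd(past_activations, full_matrices=False)
    # Keep enough directions to cover `rank_threshold` of the spectral energy.
    energy = torch.cumsum(s**2, dim=0) / torch.sum(s**2)
    k = int((energy < rank_threshold).sum().item()) + 1
    basis = vT[:k]  # (k, in_dim) protected directions
    # Remove the gradient component lying in that subspace, so the update
    # approximately leaves old-task responses unchanged.
    return grad - grad @ basis.T @ basis

In a training loop, a layer's weight.grad would be replaced by this projected gradient before each optimizer step; rank_threshold trades plasticity for stability by controlling how much of the old-task subspace is protected.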
