April 25, 2022, 1:11 a.m. | Xi Chen, Christos Papadimitriou, Binghui Peng

cs.LG updates on arXiv.org

Continual learning, or lifelong learning, is a formidable current challenge
to machine learning. It requires the learner to solve a sequence of $k$
different learning tasks, one after the other, while retaining its aptitude for
earlier tasks; the continual learner should scale better than the obvious
solution of developing and maintaining a separate learner for each of the $k$
tasks. We embark on a complexity-theoretic study of continual learning in the
PAC framework. We make novel uses of communication complexity …

arxiv continual learning memory
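Below is a minimal, purely illustrative sketch of the setting the abstract describes, not the paper's construction: it contrasts the "obvious solution" of keeping a separate learner per task (memory growing with $k$) with a single learner updated sequentially across tasks. The task generator, the perceptron update, and all names (`make_task`, `perceptron`) are assumptions introduced here for illustration.

```python
# Illustrative sketch only -- not the paper's construction or bounds.
# Contrasts k separate per-task learners with one sequentially updated learner.
import numpy as np

rng = np.random.default_rng(0)
k, d, n = 5, 20, 200  # number of tasks, input dimension, examples per task

def make_task():
    """A random linear binary-classification task (hypothetical setup)."""
    w_star = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = np.sign(X @ w_star)
    return X, y

def perceptron(w, X, y, epochs=5):
    """A few perceptron passes; returns the updated weight vector."""
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:
                w = w + y_i * x_i
    return w

tasks = [make_task() for _ in range(k)]

# Obvious baseline: one independent learner per task -> memory grows with k.
separate = [perceptron(np.zeros(d), X, y) for X, y in tasks]

# Continual learner: a single shared parameter vector updated task by task;
# it may forget earlier tasks, which is the tension the paper studies.
w = np.zeros(d)
for X, y in tasks:
    w = perceptron(w, X, y)

def accuracy(w, X, y):
    return float(np.mean(np.sign(X @ w) == y))

for t, (X, y) in enumerate(tasks):
    print(f"task {t}: separate={accuracy(separate[t], X, y):.2f}  "
          f"continual={accuracy(w, X, y):.2f}")
```

Running the sketch typically shows the sequentially trained learner doing well on later tasks while degrading on earlier ones, which is the retention-versus-memory trade-off the paper formalizes in the PAC framework.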
