Web: http://arxiv.org/abs/2205.01893

May 5, 2022, 1:12 a.m. | Rishikesh Magar, Yuyang Wang, Amir Barati Farimani

cs.LG updates on arXiv.org

Machine learning (ML) models have been widely successful in the prediction of
material properties. However, large labeled datasets required for training
accurate ML models are elusive and computationally expensive to generate.
Recent advances in Self-Supervised Learning (SSL) frameworks capable of
training ML models on unlabeled data have mitigated this problem and
demonstrated superior performance in computer vision and natural language
processing tasks. Drawing inspiration from the developments in SSL, we
introduce Crystal Twins (CT): an SSL method for crystalline materials …
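The abstract does not spell out the training objective, but twin-network SSL frameworks of the kind it draws on (e.g. Barlow Twins) pretrain an encoder by pushing the embeddings of two augmented views of the same sample toward agreement while decorrelating embedding dimensions. As a hedged illustration only — the function name, shapes, and the `lam` weight below are assumptions, not details from the paper — such a redundancy-reduction loss can be sketched as:

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """Illustrative redundancy-reduction loss between embeddings of two
    augmented views; z_a, z_b have shape (batch, dim). Hypothetical sketch,
    not the paper's actual implementation."""
    # Standardize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-9)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-9)
    n, _ = z_a.shape
    # Cross-correlation matrix between the two views.
    c = (z_a.T @ z_b) / n
    # Invariance term: pull the diagonal toward 1 (views agree).
    on_diag = ((np.diagonal(c) - 1.0) ** 2).sum()
    # Redundancy term: push off-diagonal entries toward 0 (dims decorrelate).
    off_diag = (c ** 2).sum() - (np.diagonal(c) ** 2).sum()
    return on_diag + lam * off_diag
```

Because the loss needs no labels, an encoder pretrained this way on unlabeled crystal structures can then be fine-tuned on the (much smaller) labeled property-prediction datasets the abstract describes.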

