Reducing Down(stream)time: Pretraining Molecular GNNs using Heterogeneous AI Accelerators. (arXiv:2211.04598v1 [cs.LG])
Nov. 10, 2022, 2:11 a.m. | Jenna A. Bilbrey, Kristina M. Herman, Henry Sprueill, Sotiris S. Xantheas, Payel Das, Manuel Lopez Roldan, Mike Kraus, Hatem Helal, Sutanay Choudhury
cs.LG updates on arXiv.org
The demonstrated success of transfer learning has popularized approaches that
involve pretraining models from massive data sources and subsequent finetuning
towards a specific task. While such approaches have become the norm in fields
such as natural language processing, implementation and evaluation of transfer
learning approaches for chemistry are in the early stages. In this work, we
demonstrate finetuning for downstream tasks on a graph neural network (GNN)
trained over a molecular database containing 2.7 million water clusters. The
use of …
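The pretrain-then-finetune pattern the abstract describes can be sketched in miniature: train an encoder on a large source dataset, freeze it, and fit only a small head on the downstream data. The sketch below uses NumPy with synthetic linear data as a hypothetical stand-in for the molecular GNN encoder and water-cluster database; it illustrates the transfer-learning workflow, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Pretraining stage (stand-in for the GNN encoder) ---
# Hypothetical "encoder": a single linear map fit by least squares
# on a large synthetic source dataset.
X_pre = rng.normal(size=(1000, 16))           # source-task inputs
w_true = rng.normal(size=(16, 8))
H_pre = X_pre @ w_true                        # source-task targets
W_enc, *_ = np.linalg.lstsq(X_pre, H_pre, rcond=None)

# --- Finetuning stage ---
# Freeze the encoder and fit only a small task head on the much
# smaller downstream dataset.
X_down = rng.normal(size=(50, 16))
v = rng.normal(size=(8,))
y_down = (X_down @ w_true) @ v                # downstream labels
Z = X_down @ W_enc                            # frozen encoder features
w_head, *_ = np.linalg.lstsq(Z, y_down, rcond=None)

pred = Z @ w_head
mse = float(np.mean((pred - y_down) ** 2))
print("downstream MSE:", mse)
```

Because only the head is trained downstream, the 50-example task benefits from structure learned on the 1000-example source task, which is the economy the abstract's pretraining strategy aims for.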