Oct. 14, 2022, 1:11 a.m. | Yi Zeng, Minzhou Pan, Himanshu Jahagirdar, Ming Jin, Lingjuan Lyu, Ruoxi Jia

cs.LG updates on arXiv.org

Given the volume of data needed to train modern machine learning models,
external data suppliers are increasingly relied upon. However, incorporating
external data poses data poisoning risks, wherein attackers manipulate their
data to degrade model utility or integrity. Most poisoning defenses presume
access to a set of clean data (or base set). This assumption has largely been
taken for granted, but given the fast-growing research on stealthy poisoning
attacks, a question arises: can defenders really identify a clean subset
within a contaminated dataset …

