Oct. 7, 2022, 1:11 a.m. | Alexandre Gilotte, Ahmed Ben Yahmed, David Rohde

cs.LG updates on arXiv.org

Aggregating a dataset, then injecting some noise, is a simple and common way
to release differentially private data. However, aggregated data -- even without
noise -- is not an appropriate input for machine learning classifiers. In this
work, we show how a new model, similar to a logistic regression, may be learned
from aggregated data only by approximating the unobserved feature distribution
with a maximum entropy hypothesis. The resulting model is a Markov Random Field
(MRF), and we detail how to apply, …
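As background for the first sentence of the abstract, here is a minimal sketch of the standard "aggregate, then add noise" release via the Laplace mechanism. This is not the paper's method (the maximum-entropy MRF fitting is not shown); the column names, the epsilon value, and the helper `noisy_count_release` are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def noisy_count_release(df, group_cols, epsilon=1.0):
    """Aggregate a dataset into per-group counts, then inject Laplace noise.

    Under add/remove neighboring datasets, one individual changes one group
    count by at most 1, so the L1 sensitivity is 1 and Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy for the counts.
    """
    counts = df.groupby(group_cols).size().rename("count").reset_index()
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon, size=len(counts))
    counts["noisy_count"] = counts["count"] + noise
    return counts

# Hypothetical usage: counts per combination of two binary features.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "feature_a": rng.integers(0, 2, size=1000),
    "feature_b": rng.integers(0, 2, size=1000),
})
print(noisy_count_release(df, ["feature_a", "feature_b"], epsilon=0.5))
```

Releases of this form (noisy marginal or cross-tabulated counts) are exactly the kind of aggregated data the abstract argues is an awkward input for standard classifiers.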

aggregated data arxiv data entropy
