Feb. 22, 2024, 5:43 a.m. | Aaron Lou, Chenlin Meng, Stefano Ermon

cs.LG updates on arXiv.org

arXiv:2310.16834v2 Announce Type: replace-cross
Abstract: Despite their groundbreaking performance for many generative modeling tasks, diffusion models have fallen short on discrete data domains such as natural language. Crucially, standard diffusion models rely on the well-established theory of score matching, but efforts to generalize this to discrete structures have not yielded the same empirical gains. In this work, we bridge this gap by proposing score entropy, a novel loss that naturally extends score matching to discrete spaces, integrates seamlessly to build …
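As a toy illustration of the idea behind the abstract (not the paper's implementation): for discrete data, the analogue of the continuous score is the vector of probability ratios p(y)/p(x) between a state x and its neighbors y, which a model can be trained to estimate. A minimal sketch, with a made-up three-state distribution:

```python
import numpy as np

# Toy distribution over 3 discrete states (illustrative values only).
p = np.array([0.5, 0.3, 0.2])

def concrete_score(p, x):
    """Ratios p(y)/p(x) for all states y != x -- the discrete
    analogue of the score that the model would estimate."""
    return np.array([p[y] / p[x] for y in range(len(p)) if y != x])

print(concrete_score(p, 0))  # ratios relative to state 0: [0.6 0.4]
```

Knowing these ratios for all neighboring pairs determines the distribution up to normalization, which is what lets a loss over ratio estimates stand in for score matching on discrete spaces.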
