April 17, 2024, 4:41 a.m. | Kafeng Wang, Jianfei Chen, He Li, Zhenpeng Mi, Jun Zhu

cs.LG updates on arXiv.org

arXiv:2404.10445v1 Announce Type: new
Abstract: Diffusion models have been extensively used in data generation tasks and are recognized as among the best generative models. However, their time-consuming deployment, long inference time, and large memory requirements limit their application on mobile devices. In this paper, we propose a method based on an improved Straight-Through Estimator to improve the deployment efficiency of diffusion models. Specifically, we add sparse masks to the Convolution and Linear layers in a pre-trained diffusion model, …
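For intuition, here is a minimal sketch (not the paper's implementation) of how a binary sparse mask trained with a straight-through estimator (STE) can be attached to a pre-trained Linear layer. The threshold-based binarization and the names `BinaryMaskSTE` and `MaskedLinear` are illustrative assumptions, not taken from the paper.

```python
# Sketch: learn a sparse 0/1 mask over a pre-trained layer's weights,
# using a straight-through estimator so the hard binarization stays
# differentiable with respect to the mask scores. Hypothetical names.
import torch
import torch.nn as nn


class BinaryMaskSTE(torch.autograd.Function):
    """Forward: hard 0/1 mask from real-valued scores.
    Backward: pass the gradient straight through to the scores."""

    @staticmethod
    def forward(ctx, scores, threshold):
        return (scores > threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Identity gradient w.r.t. scores; no gradient for the threshold.
        return grad_output, None


class MaskedLinear(nn.Module):
    """Wraps a pre-trained Linear layer and learns a sparse weight mask."""

    def __init__(self, linear: nn.Linear, threshold: float = 0.0):
        super().__init__()
        self.linear = linear
        self.threshold = threshold
        # Learnable real-valued scores, one per weight entry (assumed init).
        self.scores = nn.Parameter(torch.randn_like(linear.weight) * 0.01)

    def forward(self, x):
        mask = BinaryMaskSTE.apply(self.scores, self.threshold)
        return nn.functional.linear(x, self.linear.weight * mask, self.linear.bias)


# Usage: swap a Linear layer in a pre-trained model for its masked version.
layer = nn.Linear(64, 64)
masked = MaskedLinear(layer)
out = masked(torch.randn(8, 64))  # gradients flow to `scores` via the STE
```

The same wrapping idea extends to Convolution layers by masking the convolution kernel instead of the linear weight matrix.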

