April 18, 2024, 4:44 a.m. | Dingkun Zhang, Sijia Li, Chen Chen, Qingsong Xie, Haonan Lu

cs.CV updates on arXiv.org

arXiv:2404.11098v1 Announce Type: new
Abstract: In the era of AIGC, the demand for low-budget or even on-device applications of diffusion models has emerged. Several approaches have been proposed for compressing Stable Diffusion models (SDMs), and most of them leverage handcrafted layer-removal methods to obtain smaller U-Nets, along with knowledge distillation to recover network performance. However, such handcrafted layer removal is inefficient and lacks scalability and generalization, and the feature distillation employed in …
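To make the recipe the abstract describes concrete, here is a minimal, hypothetical sketch of handcrafted layer removal followed by knowledge distillation, written in PyTorch. It is not the paper's actual method or code: the block indices in `keep`, the names `prune_blocks` and `distill_step`, and the plain output-matching MSE loss are all illustrative assumptions standing in for a real U-Net and distillation objective.

```python
# Illustrative sketch only: handcrafted layer removal + output-level
# knowledge distillation, per the prior approaches the abstract summarizes.
# The toy conv stack stands in for a real Stable Diffusion U-Net.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


def prune_blocks(net: nn.Sequential, keep: list[int]) -> nn.Sequential:
    """Build a smaller student by keeping only hand-selected blocks."""
    return nn.Sequential(*[copy.deepcopy(net[i]) for i in keep])


def distill_step(teacher, student, x, optimizer):
    """One distillation step: train the student to match the frozen teacher."""
    with torch.no_grad():
        t_out = teacher(x)           # frozen teacher prediction
    s_out = student(x)               # pruned student prediction
    loss = F.mse_loss(s_out, t_out)  # output-matching distillation loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Toy usage: an 8-block stack; the student keeps every other block.
blocks = [nn.Sequential(nn.Conv2d(4, 4, 3, padding=1), nn.SiLU()) for _ in range(8)]
teacher = nn.Sequential(*blocks).eval()
student = prune_blocks(teacher, keep=[0, 2, 4, 6])  # handcrafted removal
opt = torch.optim.AdamW(student.parameters(), lr=1e-4)
x = torch.randn(2, 4, 32, 32)
print(distill_step(teacher, student, x, opt))
```

The abstract's critique applies exactly to the `keep` list above: which blocks to drop is decided by hand, so the choice does not transfer across architectures or compression ratios, which is the scalability and generalization gap the paper targets.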
