SDiT: Spiking Diffusion Model with Transformer
Feb. 20, 2024, 5:47 a.m. | Shu Yang, Hanzhi Ma, Chengting Yu, Aili Wang, Er-Ping Li
cs.CV updates on arXiv.org
Abstract: Spiking neural networks (SNNs) have low power consumption and bio-interpretable characteristics, and are considered to hold tremendous potential for energy-efficient computing. However, the exploration of SNNs on image generation tasks remains very limited, and a unified, effective structure for SNN-based generative models has yet to be proposed. In this paper, we explore a novel diffusion model architecture within spiking neural networks. We utilize a transformer to replace the commonly used U-Net structure in mainstream diffusion …
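The abstract's core idea is replacing a diffusion model's U-Net denoiser with a transformer built from spiking neurons. As a rough illustration of the spiking side, here is a minimal sketch of a rate-coded leaky integrate-and-fire (LIF) activation standing in for the usual GELU inside a transformer MLP block. All names (`lif_spikes`, `spiking_mlp_block`) and parameter choices are hypothetical; the paper's actual SDiT architecture may differ.

```python
import numpy as np

def lif_spikes(x, threshold=1.0, decay=0.5, timesteps=4):
    """Rate-coded LIF neuron: leak-and-integrate the input over T steps,
    emit a spike when the membrane potential crosses the threshold, then
    hard-reset. Returns the average spike rate in [0, 1]."""
    mem = np.zeros_like(x)
    spikes = np.zeros_like(x)
    for _ in range(timesteps):
        mem = decay * mem + x                     # leaky integration
        fired = (mem >= threshold).astype(x.dtype)
        spikes += fired
        mem = mem * (1.0 - fired)                 # hard reset on spike
    return spikes / timesteps

def spiking_mlp_block(x, w1, w2):
    """Hypothetical transformer MLP block with the spiking nonlinearity
    replacing GELU (illustration only, not the paper's exact design)."""
    return lif_spikes(x @ w1) @ w2

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 8))        # (tokens, d_model)
w1 = rng.normal(size=(8, 16))
w2 = rng.normal(size=(16, 8))
out = spiking_mlp_block(x, w1, w2)
print(out.shape)
```

Because spikes are binary events, the matrix multiplies downstream of `lif_spikes` become sparse accumulations in hardware, which is the source of the power savings the abstract alludes to.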