SDiT: Spiking Diffusion Model with Transformer
Feb. 20, 2024, 5:47 a.m. | Shu Yang, Hanzhi Ma, Chengting Yu, Aili Wang, Er-Ping Li
cs.CV updates on arXiv.org
Abstract: Spiking neural networks (SNNs) have low power consumption and bio-interpretable characteristics, and are considered to hold tremendous potential for energy-efficient computing. However, the exploration of SNNs on image generation tasks remains very limited, and a unified, effective structure for SNN-based generative models has yet to be proposed. In this paper, we explore a novel diffusion model architecture within spiking neural networks. We utilize a transformer to replace the U-Net structure commonly used in mainstream diffusion …
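The abstract centers on spiking neural networks. As a rough illustration only (this is the standard leaky integrate-and-fire neuron that most SNNs build on, not the paper's SDiT architecture, whose details are not given here), a minimal spiking update step can be sketched as:

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0, v_reset=0.0):
    """One leaky integrate-and-fire (LIF) update.

    v: membrane potential, x: input current.
    tau, v_th, v_reset are illustrative values, not from the paper.
    """
    v = v + (x - v) / tau                    # leaky integration toward input
    spike = (v >= v_th).astype(np.float32)   # emit a binary spike at threshold
    v = np.where(spike > 0, v_reset, v)      # hard reset after firing
    return spike, v

# Drive one neuron with a constant input over a few timesteps.
v = np.array(0.0)
for t in range(5):
    spike, v = lif_step(v, np.array(1.5))
```

The binary spike train is what gives SNNs their low-power appeal: downstream layers only do work when a spike arrives, in contrast to the dense activations of conventional networks.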