SGHormer: An Energy-Saving Graph Transformer Driven by Spikes
March 27, 2024, 4:42 a.m. | Huizhe Zhang, Jintang Li, Liang Chen, Zibin Zheng
cs.LG updates on arXiv.org arxiv.org
Abstract: Graph Transformers (GTs), with their powerful representation learning ability, have achieved great success in a wide range of graph tasks. However, the outstanding performance of GTs comes at the cost of higher energy consumption and computational overhead. The complex structure and the quadratic complexity of attention calculation in the vanilla Transformer seriously hinder its scalability on large-scale graph data. Though existing methods have made strides in simplifying combinations among blocks or the attention-learning paradigm to improve GTs' efficiency, a series of …
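The quadratic complexity the abstract refers to comes from dense self-attention: each of the N node representations attends to all N others, so the score matrix alone has N x N entries. A minimal pure-Python sketch of that vanilla attention step (a toy illustration, not the SGHormer method itself, which replaces this computation with spike-driven components):

```python
import math

def dense_attention(X, d):
    """Vanilla scaled dot-product self-attention over N node embeddings.

    X: list of N node embeddings, each of length d (queries, keys,
    and values are all taken as X here for simplicity).
    Returns (outputs, scores); scores is the N x N matrix whose size
    is the source of the O(N^2) cost on large graphs.
    """
    N = len(X)
    # scores[i][j] = <x_i, x_j> / sqrt(d): an N x N matrix, quadratic in N
    scores = [[sum(X[i][k] * X[j][k] for k in range(d)) / math.sqrt(d)
               for j in range(N)] for i in range(N)]
    # Row-wise softmax turns each row of scores into attention weights.
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # Each output is a weighted sum over ALL N value vectors.
    out = [[sum(weights[i][j] * X[j][k] for j in range(N)) for k in range(d)]
           for i in range(N)]
    return out, scores

# Four toy 2-d node embeddings: the score matrix is already 4 x 4.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
out, scores = dense_attention(X, d=2)
```

Doubling the number of nodes quadruples the score matrix, which is why dense GTs struggle on large-scale graphs.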