Less Is More - On the Importance of Sparsification for Transformers and Graph Neural Networks for TSP
March 27, 2024, 4:41 a.m. | Attila Lischka, Jiaming Wu, Rafael Basso, Morteza Haghir Chehreghani, Balázs Kulcsár
cs.LG updates on arXiv.org arxiv.org
Abstract: Most recent studies tackling routing problems like the Traveling Salesman Problem (TSP) with machine learning use a transformer- or Graph Neural Network (GNN)-based encoder architecture. However, many of them apply these encoders naively, allowing them to aggregate information over entire TSP instances. We, on the other hand, propose a data preprocessing method that allows the encoders to focus only on the most relevant parts of the TSP instances. In particular, …
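The abstract is truncated before the specifics of the preprocessing method, so the paper's exact procedure is not shown here. A common way to sparsify a TSP instance for a GNN encoder, however, is to keep for each city only the edges to its k nearest neighbours; the sketch below illustrates that general idea (the function `knn_sparsify` and the parameter `k` are illustrative, not taken from the paper):

```python
import math
import random

def knn_sparsify(coords, k=3):
    """For each node, keep only the directed edges to its k nearest
    neighbours (Euclidean distance). Returns a set of (i, j) pairs."""
    n = len(coords)
    edges = set()
    for i in range(n):
        # Sort all other nodes by distance from node i.
        dists = sorted(
            (math.dist(coords[i], coords[j]), j)
            for j in range(n) if j != i
        )
        # Retain edges to the k closest nodes only.
        for _, j in dists[:k]:
            edges.add((i, j))
    return edges

# Example: a random 10-city instance, sparsified to 3 neighbours per city.
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(10)]
sparse_edges = knn_sparsify(cities, k=3)
print(len(sparse_edges))  # 10 cities * 3 neighbours = 30 directed edges
```

The dense instance has n·(n−1) directed edges; k-NN sparsification reduces this to n·k, so an encoder aggregating over this graph attends only to each city's local neighbourhood rather than the whole instance.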