Sept. 28, 2022, 1:13 a.m. | A. Martina Neuman, Rongrong Wang, Yuying Xie

stat.ML updates on arXiv.org arxiv.org

We constructively show, via rigorous mathematical arguments, that GNN
architectures outperform NNs in approximating bandlimited functions on
compact $d$-dimensional Euclidean grids. We show that the former need only
$\mathcal{M}$ sampled functional values to achieve a uniform
approximation error of $O_{d}(2^{-\mathcal{M}^{1/d}})$, and that this error
rate is optimal, in the sense that NNs may perform worse.
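The claimed rate can be made concrete with a small numerical sketch. The function below (my own illustration, not code from the paper) evaluates the bound $2^{-\mathcal{M}^{1/d}}$ with the dimension-dependent constant omitted, showing how the sample budget $\mathcal{M}$ must grow with $d$ to keep the error fixed:

```python
import math

def error_bound(M: int, d: int) -> float:
    """Hypothetical illustration of the stated rate O_d(2^{-M^{1/d}}):
    uniform approximation error from M sampled functional values on a
    compact d-dimensional grid (constant factor omitted)."""
    return 2.0 ** (-(M ** (1.0 / d)))

# With M = n^d samples (n grid points per axis), the exponent M^(1/d) = n,
# so the error halves each time the per-axis resolution n grows by one --
# but the required number of samples M grows exponentially in d.
for d in (1, 2, 3):
    bounds = [error_bound(n ** d, d) for n in (2, 4, 8)]
    print(f"d={d}:", ["%.2e" % b for b in bounds])
```

Note how, for a fixed target error, the sample count $\mathcal{M} = n^d$ exhibits the usual curse of dimensionality, while the rate in $\mathcal{M}^{1/d}$ remains exponential in the per-axis resolution.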

