Sept. 28, 2022, 1:13 a.m. | Matteo Giordano, Kolyan Ray, Johannes Schmidt-Hieber

stat.ML updates on arXiv.org

We rigorously prove that deep Gaussian process priors can outperform Gaussian
process priors if the target function has a compositional structure. To this
end, we study information-theoretic lower bounds for posterior contraction
rates for Gaussian process regression in a continuous regression model. We show
that if the true function is a generalized additive function, then the
posterior based on any mean-zero Gaussian process can only recover the truth at
a rate that is strictly slower than the minimax rate by …
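To make the setting concrete, below is a minimal, illustrative sketch (not the paper's experiment) of the regression problem the result concerns: fitting a mean-zero Gaussian process to noisy observations of a compositional target, here an outer function applied to an additive inner stage. The specific target function, kernel, and sample sizes are hypothetical choices for illustration; the code uses scikit-learn's GaussianProcessRegressor.

```python
# Illustrative sketch only: a zero-mean GP posterior fit to data from a
# compositional target f(x) = g(h(x)), the kind of structure for which the
# paper proves GP posteriors contract at a suboptimal rate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def target(X):
    # Hypothetical compositional target: additive inner stage, nonlinear outer stage.
    inner = np.sin(np.pi * X[:, 0]) + np.abs(X[:, 1])
    return np.exp(-inner ** 2)

n, d = 200, 2
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = target(X) + 0.1 * rng.standard_normal(n)  # noisy regression observations

# Mean-zero GP prior with an RBF kernel; alpha is the assumed noise variance.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=0.1 ** 2)
gp.fit(X, y)

X_test = rng.uniform(-1.0, 1.0, size=(1000, d))
mse = np.mean((gp.predict(X_test) - target(X_test)) ** 2)
print(f"out-of-sample MSE of the GP posterior mean: {mse:.4f}")
```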

