Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence
April 24, 2024, 4:43 a.m. | Riccardo Bonalli, Alessandro Rudi
cs.LG updates on arXiv.org
Abstract: We propose a novel non-parametric learning paradigm for identifying the drift and diffusion coefficients of multi-dimensional non-linear stochastic differential equations, relying on discrete-time observations of the state. The key idea is to fit an RKHS-based approximation of the corresponding Fokker-Planck equation to these observations, yielding non-asymptotic learning-rate estimates which, unlike those in previous works, become tighter as the regularity of the unknown drift and diffusion coefficients increases. Our method …
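The listing carries only the abstract, so the paper's RKHS/Fokker-Planck estimator is not reproduced here. As a loose illustration of the general setting (non-parametric drift identification from discrete-time observations of an SDE), the sketch below uses a simple Nadaraya-Watson kernel regression on the increments, E[(X_{t+Δt} − X_t)/Δt | X_t = x] ≈ b(x), applied to a simulated Ornstein-Uhlenbeck process. All names, the process, and the bandwidth are assumptions for illustration, not the authors' method:

```python
# Hedged sketch: NOT the paper's RKHS/Fokker-Planck estimator, just a
# minimal non-parametric drift estimate from discrete-time observations.
import numpy as np

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW
# (true drift b(x) = -theta*x), observed on a grid of step dt.
theta, sigma, dt, n = 1.0, 0.5, 0.01, 20000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

def drift_estimate(query, states, increments, dt, bandwidth=0.1):
    """Nadaraya-Watson estimate of the drift b(x) at each query point:
    a Gaussian-kernel-weighted average of the scaled increments dX/dt."""
    w = np.exp(-0.5 * ((query[:, None] - states[None, :]) / bandwidth) ** 2)
    return (w @ (increments / dt)) / w.sum(axis=1)

states, increments = x[:-1], np.diff(x)
grid = np.linspace(-0.5, 0.5, 5)
b_hat = drift_estimate(grid, states, increments, dt)
print(b_hat)  # roughly follows the true drift -theta * grid
```

Unlike this kernel-smoothing sketch, whose rates degrade in high dimension, the paper's contribution is that its learning rates tighten as the regularity of the unknown coefficients grows.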