May 3, 2024, 4:53 a.m. | Jiaxi Li, John-Joseph Brady, Xiongjie Chen, Yunpeng Li

cs.LG updates on arXiv.org

arXiv:2405.01251v1 Announce Type: new
Abstract: Differentiable particle filters combine the flexibility of neural networks with the probabilistic nature of sequential Monte Carlo methods. However, traditional approaches rely on the availability of labelled data, i.e., the ground truth latent state information, which is often difficult to obtain in real-world applications. This paper compares the effectiveness of two semi-supervised training objectives for differentiable particle filters. We present results in two simulated environments where labelled data are scarce.
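
For orientation only, the sketch below shows a single step of a generic differentiable particle filter in PyTorch: a learned dynamics network propagates the particles, a learned measurement network reweights them against the observation, and soft resampling keeps the step differentiable. This is an illustrative assumption, not the authors' architecture or either of the semi-supervised objectives compared in the paper; the class name, network sizes, Gaussian noise model, and the mixture coefficient in the soft resampler are all hypothetical.

```python
import torch
import torch.nn as nn

class ToyDifferentiableParticleFilter(nn.Module):
    """Illustrative only: learned Gaussian dynamics/measurement models plus soft resampling."""

    def __init__(self, state_dim=2, obs_dim=2, hidden=32):
        super().__init__()
        # Learned transition model: proposes the next state from the current one.
        self.dynamics = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(), nn.Linear(hidden, state_dim))
        # Learned measurement model: maps a state to a predicted observation.
        self.measurement = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(), nn.Linear(hidden, obs_dim))
        # Log standard deviations of the process and observation noise (learned).
        self.log_noise = nn.Parameter(torch.zeros(2))

    def step(self, particles, log_weights, observation, alpha=0.5):
        # particles: (N, state_dim); log_weights: (N,); observation: (obs_dim,)
        proc_std, obs_std = self.log_noise.exp()

        # 1. Propagate every particle through the learned dynamics plus Gaussian noise.
        particles = self.dynamics(particles) + proc_std * torch.randn_like(particles)

        # 2. Reweight by the learned measurement likelihood and renormalise.
        pred_obs = self.measurement(particles)
        log_lik = torch.distributions.Normal(pred_obs, obs_std).log_prob(observation).sum(-1)
        log_weights = log_weights + log_lik
        log_weights = log_weights - torch.logsumexp(log_weights, dim=0)

        # 3. Soft resampling: draw indices from a mixture of the weights and a
        #    uniform distribution, then apply an importance correction so gradients
        #    can still flow from the new weights back to the network parameters.
        n = particles.shape[0]
        mix = alpha * log_weights.exp() + (1.0 - alpha) / n
        idx = torch.multinomial(mix, n, replacement=True)
        particles = particles[idx]
        log_weights = log_weights[idx] - torch.log(mix[idx])
        log_weights = log_weights - torch.logsumexp(log_weights, dim=0)
        return particles, log_weights

# Hypothetical usage: filter one observation with 100 particles.
dpf = ToyDifferentiableParticleFilter()
particles = torch.randn(100, 2)
log_weights = torch.log(torch.ones(100) / 100)  # uniform initial weights
particles, log_weights = dpf.step(particles, log_weights, torch.randn(2))
```

The soft-resampling correction is what keeps the whole step on the computation graph, so the filter's parameters can be trained end to end by gradient descent; a fully supervised objective would then compare the weighted particle estimate against ground-truth latent states, which is exactly the labelled information the abstract notes is hard to obtain in practice.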
