Efficient first-order algorithms for large-scale, non-smooth maximum entropy models with application to wildfire science
March 12, 2024, 4:43 a.m. | Gabriel P. Langlois, Jatan Buch, Jérôme Darbon
cs.LG updates on arXiv.org
Abstract: Maximum entropy (Maxent) models are a class of statistical models that use the maximum entropy principle to estimate probability distributions from data. Due to the size of modern data sets, Maxent models need efficient optimization algorithms to scale well for big data applications. State-of-the-art algorithms for Maxent models, however, were not originally designed to handle big data sets; these algorithms either rely on technical devices that may yield unreliable numerical results, scale poorly, or require …
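To make the maximum entropy principle concrete: over a finite support with feature constraints, the Maxent distribution takes a Gibbs (exponential-family) form, and fitting it reduces to solving a dual condition on the multipliers. The sketch below is purely illustrative and is not the paper's algorithm; the support, the single feature f(x) = x, and the target mean are all assumed for the example.

```python
import math

# Illustrative Maxent sketch (not the paper's method): on support
# {0,...,5} with one feature f(x) = x, the entropy-maximizing
# distribution matching a target feature mean is p(x) ∝ exp(lam * x).
support = list(range(6))
target_mean = 3.2  # assumed empirical feature expectation

def mean_under(lam):
    """Expectation of f(x) = x under the Gibbs distribution with multiplier lam."""
    weights = [math.exp(lam * x) for x in support]
    z = sum(weights)  # partition function
    return sum(x * w for x, w in zip(support, weights)) / z

# The dual optimality condition E_p[f] = target_mean has a unique root
# because mean_under is monotone in lam; solve it by bisection.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if mean_under(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2.0

z = sum(math.exp(lam * x) for x in support)
p = [math.exp(lam * x) / z for x in support]  # the fitted Maxent distribution
```

For a single moment constraint this bisection is exact and cheap; the scaling challenges the abstract refers to arise with many features and large data sets, where the dual becomes a high-dimensional (and, with sparsity-inducing regularization, non-smooth) optimization problem.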