Zero Grads: Learning Local Surrogate Losses for Non-Differentiable Graphics
May 8, 2024, 4:43 a.m. | Michael Fischer, Tobias Ritschel
cs.LG updates on arXiv.org
Abstract: Gradient-based optimization is now ubiquitous across graphics, but unfortunately cannot be applied to problems with undefined or zero gradients. To circumvent this issue, the loss function can be manually replaced by a "surrogate" that has similar minima but is differentiable. Our proposed framework, ZeroGrads, automates this process by learning a neural approximation of the objective function, which in turn can be used to differentiate through arbitrary black-box graphics pipelines. We train the surrogate on …
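The core idea — fit a local, differentiable surrogate of a black-box objective and step along the surrogate's gradient — can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it replaces the paper's neural surrogate with a local linear surrogate fit by least squares over random perturbations, and uses a toy quantized (hence zero-gradient) loss as the black box.

```python
import numpy as np

np.random.seed(0)
TARGET = np.array([1.5, -0.5])

def blackbox_loss(x):
    # Toy non-differentiable objective: a quantized distance to TARGET.
    # Its true gradient is zero almost everywhere (piecewise constant).
    return np.round(10.0 * np.sum((x - TARGET) ** 2)) / 10.0

def surrogate_gradient(f, x, sigma=0.1, n_samples=64):
    # Fit a local linear surrogate f(x + d) ~ f(x) + g @ d by least
    # squares over random perturbations d, and return g as a proxy
    # gradient (a stand-in for the paper's learned neural surrogate).
    d = np.random.randn(n_samples, x.size) * sigma
    y = np.array([f(x + di) for di in d]) - f(x)
    g, *_ = np.linalg.lstsq(d, y, rcond=None)
    return g

# Gradient descent through the black box via the surrogate's gradient.
x = np.zeros(2)
for _ in range(200):
    x = x - 0.05 * surrogate_gradient(blackbox_loss, x)

print(x, blackbox_loss(x))
```

Even though the quantized loss gives no usable gradient of its own, the surrogate's gradient steers `x` into the plateau around `TARGET`; ZeroGrads applies the same principle with a learned neural surrogate inside full graphics pipelines.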