NeRF-XL: Scaling NeRFs with Multiple GPUs
April 26, 2024, 4:44 a.m. | Ruilong Li, Sanja Fidler, Angjoo Kanazawa, Francis Williams
cs.CV updates on arXiv.org arxiv.org
Abstract: We present NeRF-XL, a principled method for distributing Neural Radiance Fields (NeRFs) across multiple GPUs, thus enabling the training and rendering of NeRFs with an arbitrarily large capacity. We begin by revisiting existing multi-GPU approaches, which decompose large scenes into multiple independently trained NeRFs, and identify several fundamental issues with these methods that hinder improvements in reconstruction quality as additional computational resources (GPUs) are used in training. NeRF-XL remedies these issues and enables the training …
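The abstract contrasts NeRF-XL's joint multi-GPU training with prior approaches that carve a scene into independently trained NeRFs. The excerpt does not detail NeRF-XL's actual partitioning scheme, so the following is only a minimal illustrative sketch of one naive spatial decomposition (slicing a scene bounding box into one sub-volume per GPU); all function names and the slab-along-longest-axis strategy are assumptions, not the paper's method.

```python
import numpy as np

def partition_scene(bbox_min, bbox_max, num_gpus):
    """Split an axis-aligned scene bounding box into `num_gpus`
    contiguous slabs along its longest axis (one sub-volume per GPU).
    Illustrative only -- not the decomposition used by NeRF-XL."""
    bbox_min = np.asarray(bbox_min, dtype=float)
    bbox_max = np.asarray(bbox_max, dtype=float)
    axis = int(np.argmax(bbox_max - bbox_min))  # longest extent
    edges = np.linspace(bbox_min[axis], bbox_max[axis], num_gpus + 1)
    parts = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        lo_corner, hi_corner = bbox_min.copy(), bbox_max.copy()
        lo_corner[axis], hi_corner[axis] = lo, hi
        parts.append((lo_corner, hi_corner))
    return axis, parts

def assign_points(points, axis, parts):
    """Map each 3D sample point to the index of the sub-volume
    (i.e., the GPU) whose slab contains it along the split axis."""
    edges = np.array([p[0][axis] for p in parts] + [parts[-1][1][axis]])
    idx = np.searchsorted(edges, points[:, axis], side="right") - 1
    return np.clip(idx, 0, len(parts) - 1)

# Example: split a 2x1x1 scene across 4 GPUs along x.
axis, parts = partition_scene([0, 0, 0], [2, 1, 1], num_gpus=4)
owners = assign_points(np.array([[0.1, 0.5, 0.5],
                                 [1.9, 0.5, 0.5]]), axis, parts)
```

A decomposition like this is exactly the kind of independent-subscene setup the paper critiques: points near slab boundaries are modeled by only one sub-NeRF, which is one reason quality can fail to improve as GPUs are added.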