March 4, 2024, 5:41 a.m. | Wei Niu, Gagan Agrawal, Bin Ren

cs.LG updates on arXiv.org

arXiv:2403.00176v1 Announce Type: new
Abstract: Though many compilation and runtime systems have been developed for DNNs in recent years, the focus has largely been on static DNNs. Dynamic DNNs, where tensor shapes and sizes, and even the set of operators used, depend on the input and/or execution, are becoming common. This paper presents SoD$^2$, a comprehensive framework for optimizing Dynamic DNNs. The basis of our approach is a classification of common operators that form DNNs, and the use of …
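To make the abstract's notion of dynamism concrete, the following is a minimal sketch (not part of the SoD$^2$ framework; the function name and NumPy-based toy layer are illustrative assumptions) of a layer whose output tensor shape depends on the input values, which is what prevents a static compiler from fixing buffer sizes ahead of time:

```python
import numpy as np

# Toy "dynamic" layer: the output shape depends on the input values.
# (Illustration only; not the SoD^2 framework's own operator classification.)
def dynamic_select_layer(x, weight):
    h = np.maximum(x @ weight, 0.0)  # static dense layer + ReLU
    mask = h > h.mean()              # input-dependent selection
    return h[mask]                   # output shape varies with the input

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8))
a = dynamic_select_layer(rng.standard_normal(4), w)
b = dynamic_select_layer(rng.standard_normal(4), w)
# a.shape and b.shape generally differ across inputs, so tensor sizes
# cannot be determined at compile time as they can for a static DNN.
```

A static DNN compiler can preallocate every intermediate buffer; here the size of the result is only known once `mask` is computed at runtime, which is the class of behavior the paper's framework targets.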

