Feb. 8, 2024, 5:43 a.m. | Ruiqi Xu, Tianchi Zhang (contributed equally to this work)

cs.LG updates on arXiv.org

Although the computing power of mobile devices is increasing, machine learning models are also growing in size. This trend creates problems for mobile devices, which are constrained by limited memory capacity and battery life. While many services, such as ChatGPT and Midjourney, run all inference in the cloud, we believe a flexible and fine-grained distribution of tasks is more desirable. In this work, we consider model segmentation as a way to improve the user experience, dividing the computation between mobile devices …
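
To make the idea of model segmentation concrete, here is a minimal sketch of splitting inference between a device and the cloud. The toy CNN, the chosen split point, and the names (`SmallCNN`, `device_head`, `cloud_tail`) are assumptions for illustration only, not the method proposed in the paper; in practice the split point would be chosen based on device memory, battery budget, and network bandwidth, and the intermediate activation would be serialized and sent over the network.

```python
import torch
import torch.nn as nn

# Hypothetical toy model used only to illustrate splitting a network
# into an on-device "head" and a cloud-side "tail".
class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCNN().eval()

# Split point chosen arbitrarily for this sketch.
device_head = model.features[:3]                                   # runs on the mobile device
cloud_tail = nn.Sequential(model.features[3:], model.classifier)   # runs in the cloud

with torch.no_grad():
    image = torch.randn(1, 3, 32, 32)     # local input on the device
    activation = device_head(image)       # on-device partial inference
    # In a real system, `activation` would be transmitted to the server here.
    logits = cloud_tail(activation)       # remaining inference in the cloud

print(logits.shape)  # torch.Size([1, 10])
```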
