Feb. 13, 2024, 5:41 a.m. | Ruiyang Qin Yuting Hu Zheyu Yan Jinjun Xiong Ahmed Abbasi Yiyu Shi

cs.LG updates on arXiv.org

Neural Architecture Search (NAS) has become the de facto tool in industry for automating the design of deep neural networks for various applications, especially those driven by mobile and edge devices with limited computing resources. The emerging large language models (LLMs), owing to their prowess, have also recently been incorporated into NAS and have shown some promising results. This paper conducts further exploration in this direction by considering three important design metrics simultaneously, i.e., model accuracy, fairness, and hardware deployment …
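The abstract is truncated above, so the paper's actual search procedure is not shown here. As a rough, hypothetical illustration of weighing the three metrics it names when comparing candidate architectures, the sketch below ranks candidates with a simple weighted score over accuracy, a fairness gap, and on-device latency. All names, weights, and the scoring rule are illustrative assumptions, not the paper's method.

```python
# Hypothetical multi-objective ranking of candidate architectures.
# The metrics, weights, and weighted-sum rule are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    accuracy: float       # validation accuracy in [0, 1] (higher is better)
    fairness_gap: float   # e.g., accuracy gap across demographic groups (lower is better)
    latency_ms: float     # measured or predicted on-device latency (lower is better)


def score(c: Candidate, w_acc: float = 1.0, w_fair: float = 0.5, w_lat: float = 0.01) -> float:
    """Higher is better: reward accuracy, penalize the fairness gap and latency."""
    return w_acc * c.accuracy - w_fair * c.fairness_gap - w_lat * c.latency_ms


candidates = [
    Candidate("arch_a", accuracy=0.91, fairness_gap=0.08, latency_ms=35.0),
    Candidate("arch_b", accuracy=0.89, fairness_gap=0.03, latency_ms=22.0),
    Candidate("arch_c", accuracy=0.93, fairness_gap=0.12, latency_ms=60.0),
]

best = max(candidates, key=score)
print(f"selected: {best.name} (score={score(best):.3f})")
```

In practice the trade-off could instead be handled with Pareto filtering or hard constraints (e.g., a latency budget); the weighted sum above is only the simplest way to make the three metrics comparable.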
