May 9, 2024, 4:42 a.m. | Mu Yuan, Lan Zhang, Xiang-Yang Li

cs.LG updates on arXiv.org

arXiv:2312.00025v2 Announce Type: replace-cross
Abstract: Security of model parameters and user data is critical for Transformer-based services, such as ChatGPT. While recent strides in secure two-party protocols have successfully addressed security concerns in serving Transformer models, their adoption is practically infeasible due to the prohibitive cryptographic overheads involved. Drawing insights from our hands-on experience in developing two real-world Transformer-based services, we identify the inherent efficiency bottleneck in the two-party assumption. To overcome this limitation, we propose a novel three-party threat …

