Web: http://arxiv.org/abs/2206.11024

June 23, 2022, 1:10 a.m. | Kassem Kallas, Teddy Furon

cs.LG updates on arXiv.org

Protecting the intellectual property rights of DNN models is of primary
importance prior to their deployment. So far, the proposed methods either
necessitate changes to the internal model parameters or to the machine learning
pipeline, or they fail to meet both the security and robustness requirements.
This paper proposes a lightweight, robust, and secure black-box DNN
watermarking protocol that takes advantage of cryptographic one-way functions
as well as the injection of in-task key image-label pairs during the training
process. These pairs are …

Tags: arxiv, dnn
