Deep Hash Distillation for Image Retrieval. (arXiv:2112.08816v2 [cs.CV] UPDATED)
July 14, 2022, 1:12 a.m. | Young Kyun Jang, Geonmo Gu, Byungsoo Ko, Isaac Kang, Nam Ik Cho
cs.CV updates on arXiv.org
In hash-based image retrieval systems, degraded or transformed inputs usually
generate codes that differ from the original's, degrading retrieval accuracy.
To mitigate this issue, data augmentation can be applied during training.
However, even if augmented samples of an image are similar in the real-valued
feature space, quantization can scatter them far apart in Hamming space.
This results in representation discrepancies that can impede training and
degrade performance. In this work, we propose a novel self-distilled hashing
scheme to minimize the …
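The quantization problem the abstract describes can be illustrated with a minimal sketch (not the paper's method): a common baseline hashes a real-valued feature vector by taking the sign of each dimension. Two augmented views that are close in Euclidean space can still land far apart in Hamming space when many feature dimensions hover near the quantization threshold. The feature dimensionality (64) and noise scales here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_hash(features):
    # Sign quantization: each real-valued dimension maps to one bit.
    return (features > 0).astype(np.uint8)

def hamming(a, b):
    # Number of bit positions where the two codes disagree.
    return int(np.sum(a != b))

# Two augmented "views" of the same image: nearly identical in real
# feature space, but many dimensions sit close to the quantization
# threshold (zero), so their signs flip easily under small perturbations.
base = rng.normal(scale=0.01, size=64)            # features hovering near 0
view_a = base + rng.normal(scale=0.01, size=64)   # augmentation noise
view_b = base + rng.normal(scale=0.01, size=64)

l2 = float(np.linalg.norm(view_a - view_b))
dist = hamming(binary_hash(view_a), binary_hash(view_b))
print(f"L2 distance: {l2:.3f}, Hamming distance: {dist}/64")
```

Despite a tiny L2 distance, a sizeable fraction of the 64 bits typically disagree, which is the representation discrepancy the proposed self-distilled hashing scheme aims to reduce.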