Web: http://arxiv.org/abs/2209.09841

Sept. 21, 2022, 1:13 a.m. | Jiawei Liang, Siyuan Liang, Aishan Liu, Mingli Zhu, Danni Yuan, Chenye Xu, Xiaochun Cao

cs.CV updates on arXiv.org

Knowledge distillation (KD) has proven effective for object detection, where it trains a compact object detector under the supervision of both AI knowledge (a teacher detector) and human knowledge (human expert annotations). However, existing studies treat the two kinds of knowledge identically and adopt a uniform data augmentation strategy during learning, which leads to biased learning of multi-scale objects and insufficient learning from the teacher detector, causing unsatisfactory distillation performance. To tackle these problems, we propose the sample-specific …
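The setup the abstract describes (a student detector supervised jointly by ground-truth annotations, i.e. human knowledge, and a teacher detector's predictions, i.e. AI knowledge) reduces, in its generic form, to a weighted sum of a detection loss and a distillation loss. Below is a minimal PyTorch sketch of that generic combined loss for the classification branch; the function name kd_detection_loss and the parameters alpha and temperature are illustrative assumptions, not the paper's method, and the truncated sample-specific technique is deliberately not reconstructed here.

```python
# A minimal sketch of generic KD for detection as described in the
# abstract: a student detector supervised jointly by ground-truth
# labels (human knowledge) and a frozen teacher detector's predictions
# (AI knowledge). All names are hypothetical illustrations; the paper's
# actual sample-specific augmentation scheme is not shown.
import torch
import torch.nn.functional as F

def kd_detection_loss(student_cls_logits: torch.Tensor,
                      teacher_cls_logits: torch.Tensor,
                      gt_labels: torch.Tensor,
                      alpha: float = 0.5,
                      temperature: float = 2.0) -> torch.Tensor:
    """Weighted sum of human supervision (cross-entropy on ground-truth
    labels) and AI supervision (KL divergence to softened teacher logits)."""
    # Human knowledge: standard cross-entropy against annotated labels.
    ce = F.cross_entropy(student_cls_logits, gt_labels)
    # AI knowledge: match the teacher's temperature-softened distribution.
    t = temperature
    kd = F.kl_div(
        F.log_softmax(student_cls_logits / t, dim=-1),
        F.softmax(teacher_cls_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)  # T^2 scaling keeps gradient magnitudes comparable
    return (1 - alpha) * ce + alpha * kd

# Toy usage: 8 region proposals, 21 classes (e.g. VOC + background).
student = torch.randn(8, 21, requires_grad=True)
teacher = torch.randn(8, 21)
labels = torch.randint(0, 21, (8,))
loss = kd_detection_loss(student, teacher, labels)
loss.backward()
```

The T^2 factor on the KL term is the standard scaling from Hinton et al.'s original distillation formulation; it keeps the distillation gradients comparable in magnitude to the cross-entropy gradients as the temperature grows.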

