Sept. 13, 2022, 1:15 a.m. | Thanh-Dat Truong, Chi Nhan Duong, Kha Gia Quach, Ngan Le, Tien D. Bui, Khoa Luu

cs.CV updates on arXiv.org

Disentangled representations have been commonly adopted for Age-invariant Face
Recognition (AiFR) tasks. However, these methods face several limitations:
(1) they require large-scale face recognition (FR) training data with age
labels, which is scarce in practice; (2) they rely on heavy deep network
architectures to reach high performance; and (3) they are usually evaluated
only on age-related face databases, neglecting the standard large-scale FR
databases needed to guarantee robustness. This work presents a novel
Lightweight Attentive Angular Distillation (LIAAD) approach to …
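For context, the sketch below shows a generic angular (cosine-based) knowledge distillation loss between a heavy teacher face-recognition model and a lightweight student. The function name, tensor shapes, and the plain cosine objective are illustrative assumptions only, not the paper's exact LIAAD formulation, whose attentive weighting and training details are described in the full text.

```python
# Illustrative sketch only: a generic angular distillation loss that pulls the
# student's embedding direction toward the teacher's. Not the authors' LIAAD
# objective; the attentive components of LIAAD are omitted here.
import torch
import torch.nn.functional as F

def angular_distillation_loss(student_emb: torch.Tensor,
                              teacher_emb: torch.Tensor) -> torch.Tensor:
    """Penalize the angular gap between L2-normalized face embeddings."""
    s = F.normalize(student_emb, dim=1)   # student embeddings, shape (B, D)
    t = F.normalize(teacher_emb, dim=1)   # teacher embeddings, shape (B, D)
    cos_sim = (s * t).sum(dim=1)          # cosine of the angle per sample
    return (1.0 - cos_sim).mean()         # zero when directions coincide

# Hypothetical usage: distill a frozen teacher encoder into a small student.
# with torch.no_grad():
#     t_emb = teacher(images)
# loss = angular_distillation_loss(student(images), t_emb)
```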

Tags: age, angular, arxiv, distillation, face, face recognition, scale
