April 1, 2024, 4:44 a.m. | Kevin Barkevich, Reynold Bailey, Gabriel J. Diaz

cs.CV updates on arXiv.org arxiv.org

arXiv:2403.19768v1 Announce Type: new
Abstract: Algorithms for the estimation of gaze direction from mobile and video-based eye trackers typically involve tracking a feature of the eye that moves through the eye camera image in a way that covaries with the shifting gaze direction, such as the center or boundaries of the pupil. Tracking these features using traditional computer vision techniques can be difficult due to partial occlusion and environmental reflections. Although recent efforts to use machine learning (ML) for pupil …
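The abstract contrasts ML-based pupil tracking with traditional computer vision techniques that locate the pupil directly in the eye-camera image. As a minimal illustrative sketch (not the paper's method), a classical approach treats the pupil as the darkest region of the frame, thresholds it, and takes the centroid of the resulting mask as the pupil center; the function and threshold value here are assumptions for illustration:

```python
import numpy as np

def pupil_center(gray: np.ndarray, thresh: int = 40):
    """Estimate the pupil center as the centroid of dark pixels.

    A minimal classical-CV sketch: the pupil is typically the darkest
    region of an eye-camera frame, so thresholding and averaging the
    coordinates of the dark pixels gives a rough center estimate.
    Partial occlusion and reflections (as the abstract notes) easily
    break this kind of heuristic.
    """
    mask = gray < thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame: bright background, dark "pupil" disk at (60, 40).
frame = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.mgrid[:100, :100]
frame[(xx - 60) ** 2 + (yy - 40) ** 2 <= 10 ** 2] = 10

cx, cy = pupil_center(frame)
```

On a clean frame like this synthetic one, the centroid recovers the disk center; it is precisely the occlusions and environmental reflections mentioned above that make such feature tracking unreliable in practice.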

