Head and eye egocentric gesture recognition for human-robot interaction using eyewear cameras. (arXiv:2201.11500v2 [cs.CV] UPDATED)
June 13, 2022, 1:13 a.m. | Javier Marina-Miranda, V. Javier Traver
cs.CV updates on arXiv.org arxiv.org
Non-verbal communication plays a particularly important role in a wide range
of scenarios in Human-Robot Interaction (HRI). Accordingly, this work addresses
the problem of human gesture recognition. In particular, we focus on head and
eye gestures, and adopt an egocentric (first-person) perspective using eyewear
cameras. We argue that this egocentric view may offer a number of conceptual
and technical benefits over scene- or robot-centric perspectives. A
motion-based recognition approach is proposed, which operates at two temporal
granularities. Locally, frame-to-frame homographies …