Radar-Based Recognition of Static Hand Gestures in American Sign Language
Feb. 21, 2024, 5:46 a.m. | Christian Schuessler, Wenxuan Zhang, Johanna Bräunig, Marcel Hoffmann, Michael Stelzig, Martin Vossiek
cs.CV updates on arXiv.org arxiv.org
Abstract: In the fast-paced field of human-computer interaction (HCI) and virtual reality (VR), automatic gesture recognition has become increasingly essential. This is particularly true for the recognition of hand signs, which provide an intuitive way to navigate and control VR and HCI applications. Given growing privacy requirements, radar sensors emerge as a compelling alternative to cameras: they operate effectively in low-light conditions without capturing identifiable human details, thanks to their lower resolution and distinct wavelength compared …