April 17, 2023, 8:02 p.m. | M. A. Simão, O. Gibaru, P. Neto

cs.LG updates on arXiv.org

Online recognition of gestures is critical for intuitive human-robot
interaction (HRI) and for further pushing collaborative robotics into the
market, making robots accessible to more people. The problem is that accurate
gesture recognition is difficult to achieve in real unstructured environments,
often with distorted and incomplete multisensory data. This paper introduces an
HRI framework to classify large vocabularies of interwoven static gestures
(SGs) and dynamic gestures (DGs) captured with wearable sensors. DG features
are obtained by applying data dimensionality reduction …
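The abstract is truncated before naming the specific dimensionality-reduction method, so as an illustrative sketch only: one common way to compress multichannel wearable-sensor frames into a compact DG feature vector is PCA. All shapes and names below (frame count, channel count, component count) are assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dynamic-gesture recording: 200 time frames, 60 raw sensor
# channels (e.g. IMU and data-glove readings). Purely synthetic data.
frames = rng.normal(size=(200, 60))

# PCA via SVD: center the data, decompose, keep the top-k components.
centered = frames - frames.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 10
features = centered @ Vt[:k].T  # reduced DG feature matrix

print(features.shape)  # (200, 10)
```

Each frame is thereby reduced from 60 raw channels to 10 features, which is the kind of compact representation a downstream gesture classifier would consume.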

