March 12, 2024, 4:43 a.m. | Francesco De Lellis, Marco Coraggio, Nathan C. Foster, Riccardo Villa, Cristina Becchio, Mario di Bernardo

cs.LG updates on arXiv.org arxiv.org

arXiv:2403.06557v1 Announce Type: cross
Abstract: We present a data-driven control architecture for modifying the kinematics of robots and artificial avatars so as to encode specific information, such as the presence or absence of an emotion, in the movements of an avatar or robot driven by a human operator. We validate our approach on an experimental dataset collected during the reach-to-grasp phase of a pick-and-place task.
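The abstract does not detail the control law, but the general idea of modulating a reach-to-grasp movement to carry extra "style" information can be sketched. The example below is purely illustrative and not the authors' method: it generates a standard minimum-jerk reach profile (Flash & Hogan) and then applies a hypothetical `encode_style` modulation that exaggerates or attenuates the path's deviation from a straight line by a `gain` parameter, leaving the start and end points of the grasp untouched.

```python
import numpy as np

def minimum_jerk(x0, x1, T, n=100):
    """Classic minimum-jerk reach profile between positions x0 and x1
    over duration T, sampled at n points."""
    t = np.linspace(0.0, T, n)
    s = t / T
    return x0 + (x1 - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)

def encode_style(traj, gain):
    """Hypothetical style modulation: scale the trajectory's deviation
    from the straight line joining its endpoints by `gain`, so the
    start and end of the reach are preserved exactly."""
    baseline = np.linspace(traj[0], traj[-1], len(traj))
    return baseline + gain * (traj - baseline)

# gain > 1 exaggerates the movement; gain < 1 flattens it toward
# a straight-line reach; gain == 1 leaves it unchanged.
traj = minimum_jerk(0.0, 0.3, 1.0)
styled = encode_style(traj, 1.5)
```

In the paper's setting, such a gain would presumably be chosen by a data-driven controller so that observers classify the resulting movement as, e.g., emotional or neutral; the sketch only shows the kinematic-modification step.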

