A Survey of IMU Based Cross-Modal Transfer Learning in Human Activity Recognition
March 26, 2024, 4:42 a.m. | Abhi Kamboj, Minh Do
cs.LG updates on arXiv.org arxiv.org
Abstract: Despite living in a multi-sensory world, most AI models are limited to textual and visual understanding of human motion and behavior. In fact, full situational awareness of human motion could best be achieved through a combination of sensors. In this survey we investigate how knowledge can be transferred and utilized among modalities for Human Activity/Action Recognition (HAR), i.e., cross-modality transfer learning. We motivate the importance and potential of inertial measurement unit (IMU) data and its applicability in cross-modality …
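A common building block behind the kind of cross-modal transfer the abstract describes is aligning IMU and video features in a shared embedding space with a contrastive objective. The sketch below is purely illustrative and not from the paper: the encoder shapes, feature dimensions, and the use of a symmetric InfoNCE loss are assumptions standing in for whatever architectures the surveyed works actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch: paired windows from two modalities observing the same activity.
# All shapes here are hypothetical placeholders.
batch, imu_dim, vid_dim, emb_dim = 8, 6, 32, 16
imu = rng.normal(size=(batch, imu_dim))    # e.g., accel/gyro window features
video = rng.normal(size=(batch, vid_dim))  # e.g., pooled frame features

# Linear "encoders" projecting each modality into a shared embedding space.
W_imu = rng.normal(size=(imu_dim, emb_dim))
W_vid = rng.normal(size=(vid_dim, emb_dim))

def embed(x, W):
    """Project and L2-normalize so dot products are cosine similarities."""
    z = x @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def infonce_loss(z_a, z_b, temperature=0.1):
    """Symmetric InfoNCE: pull paired (IMU, video) windows together,
    push apart mismatched pairs within the batch."""
    logits = (z_a @ z_b.T) / temperature  # (batch, batch) similarity matrix
    idx = np.arange(len(z_a))             # positives sit on the diagonal
    def ce(l):
        log_softmax = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_softmax[idx, idx].mean()
    return 0.5 * (ce(logits) + ce(logits.T))

z_imu, z_vid = embed(imu, W_imu), embed(video, W_vid)
loss = infonce_loss(z_imu, z_vid)
```

Minimizing such a loss over paired data lets a label-scarce modality (IMU) inherit structure from a label-rich one (video), which is one of the transfer settings the survey covers.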