Understanding Bird's-Eye View of Road Semantics using an Onboard Camera. (arXiv:2012.03040v2 [cs.CV] UPDATED)
Jan. 17, 2022, 2:10 a.m. | Yigit Baran Can, Alexander Liniger, Ozan Unal, Danda Paudel, Luc Van Gool
cs.LG updates on arXiv.org
Autonomous navigation requires scene understanding of the action-space to move or anticipate events. For planner agents moving on the ground plane, such as autonomous vehicles, this translates to scene understanding in the bird's-eye view (BEV). However, the onboard cameras of autonomous cars are customarily mounted horizontally for a better view of the surroundings. In this work, we study scene understanding in the form of online estimation of semantic BEV maps using the video input from a single onboard camera. We …
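The gap the abstract describes — a horizontally mounted camera versus a ground-plane (BEV) representation — is classically bridged by projective geometry: under a flat-ground assumption, each ground-plane point maps to a unique image pixel via the camera's intrinsics and pose. The sketch below illustrates that mapping only; it is not the paper's learned method, and the intrinsics, camera height, and pitch are illustrative assumptions.

```python
import numpy as np

# Hypothetical intrinsics for a 1280x720 forward-facing camera (assumed values).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def ground_to_pixel(x_fwd, y_left, cam_height=1.5, pitch_deg=5.0):
    """Project a ground-plane point (metres, vehicle frame) into the image.

    Vehicle frame: x forward, y left, ground at z = 0; the camera sits at
    (0, 0, cam_height) and is pitched down by pitch_deg. All pose values
    are illustrative assumptions, not taken from the paper.
    """
    p = np.radians(pitch_deg)
    # Axis change from vehicle frame to camera frame
    # (camera: x right, y down, z forward).
    R_axes = np.array([[0.0, -1.0,  0.0],   # cam x = -vehicle y
                       [0.0,  0.0, -1.0],   # cam y = -vehicle z
                       [1.0,  0.0,  0.0]])  # cam z =  vehicle x
    # Downward pitch of the camera by p radians.
    R_pitch = np.array([[1.0,       0.0,        0.0],
                        [0.0, np.cos(p), -np.sin(p)],
                        [0.0, np.sin(p),  np.cos(p)]])
    R = R_pitch @ R_axes
    t = -R @ np.array([0.0, 0.0, cam_height])  # camera centre in vehicle frame
    P = np.array([x_fwd, y_left, 0.0])         # point on the ground plane
    uvw = K @ (R @ P + t)                      # pinhole projection
    return uvw[:2] / uvw[2]                    # pixel coordinates (u, v)
```

Inverting this relation over a grid of ground-plane cells (inverse perspective mapping) yields a BEV image, but it breaks down for anything off the ground plane, which is part of why learned BEV estimation, as studied in the paper, is attractive.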