Oct. 27, 2022, 1:15 a.m. | Nitin Bansal, Pan Ji, Junsong Yuan, Yi Xu

cs.CV updates on arXiv.org

The multi-task learning (MTL) paradigm focuses on jointly learning two or more
tasks, aiming for significant improvements in a model's generalizability,
performance, and training/inference memory footprint. These benefits become
all the more important when jointly training vision-related dense prediction
tasks. In this work, we tackle the MTL problem of two dense tasks, i.e.,
semantic segmentation and depth estimation, and present a novel attention
module called the Cross-Channel Attention Module (CCAM), which facilitates
effective feature sharing along each channel …
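The abstract is truncated before the module's details, so the paper's exact CCAM design is not given here. As a rough illustration of the general idea of cross-task channel-wise feature sharing, below is a minimal PyTorch sketch; the squeeze-and-excitation-style gate, the class and parameter names (CrossChannelAttention, reduction), and the bidirectional wiring are all my assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class CrossChannelAttention(nn.Module):
    """Hypothetical sketch of cross-task channel attention between two
    dense-prediction branches (segmentation and depth). NOT the paper's
    CCAM; the abstract is truncated before the module is specified."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()

        def gate() -> nn.Sequential:
            # Squeeze-and-excitation-style gate: global average pool to a
            # per-channel descriptor, then a bottleneck MLP producing
            # per-channel weights in (0, 1).
            return nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // reduction, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1),
                nn.Sigmoid(),
            )

        self.gate_from_seg = gate()    # weights derived from the seg branch
        self.gate_from_depth = gate()  # weights derived from the depth branch

    def forward(self, feat_seg: torch.Tensor, feat_depth: torch.Tensor):
        # Each branch's features are re-weighted channel-wise using
        # attention computed from the *other* branch, so information is
        # shared along each channel across the two tasks.
        w_seg = self.gate_from_seg(feat_seg)        # (B, C, 1, 1)
        w_depth = self.gate_from_depth(feat_depth)  # (B, C, 1, 1)
        return feat_seg * w_depth, feat_depth * w_seg

# Usage: fuse intermediate features of the two task branches.
ccam = CrossChannelAttention(channels=256)
seg_feat = torch.randn(2, 256, 64, 64)
depth_feat = torch.randn(2, 256, 64, 64)
seg_out, depth_out = ccam(seg_feat, depth_feat)
print(seg_out.shape, depth_out.shape)  # both torch.Size([2, 256, 64, 64])
```

The bidirectional gating keeps each branch's spatial resolution intact and only modulates channel importance, which is one common way to share features between dense tasks without entangling their decoders.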
