Feb. 9, 2024, 5:44 a.m. | Soroosh Tayebi Arasteh, Leo Misera, Jakob Nikolas Kather, Daniel Truhn, Sven Nebelung

cs.LG updates on arXiv.org

Pre-training datasets such as ImageNet have become the gold standard in medical image analysis. However, the emergence of self-supervised learning (SSL), which leverages unlabeled data to learn robust features, presents an opportunity to bypass the labor-intensive labeling process. In this study, we explored whether SSL pre-training on non-medical images can be applied to chest radiographs and how it compares to supervised pre-training on non-medical images and on medical images. We utilized a vision transformer and initialized its weights based on …

