July 6, 2022, 1:10 a.m. | Yue Song, Nicu Sebe, Wei Wang

cs.LG updates on arXiv.org

Inserting an SVD meta-layer into neural networks tends to make the
covariance ill-conditioned, which can harm the model's training stability and
generalization ability. In this paper, we systematically study how to improve
the covariance conditioning by enforcing orthogonality on the Pre-SVD layer.
We first investigate existing orthogonal treatments of the weights; these
techniques improve the conditioning but hurt performance. To avoid this side
effect, we propose the Nearest Orthogonal Gradient (NOG) and …
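A minimal sketch of the two ideas the abstract touches on, not the authors' code: measuring the condition number of the covariance that feeds an SVD meta-layer, and projecting a weight gradient onto its nearest orthogonal matrix (the polar factor), which is one plausible reading of "Nearest Orthogonal Gradient". All function names and shapes here are illustrative assumptions.

```python
import torch

def covariance_condition_number(features: torch.Tensor) -> torch.Tensor:
    """Condition number of the feature covariance seen by an SVD meta-layer.

    features: (batch, dim) activations of the Pre-SVD layer (assumed shape).
    """
    centered = features - features.mean(dim=0, keepdim=True)
    cov = centered.T @ centered / (features.shape[0] - 1)
    eigvals = torch.linalg.eigvalsh(cov)              # real, ascending order
    return eigvals[-1] / eigvals[0].clamp_min(1e-12)  # lambda_max / lambda_min

def nearest_orthogonal(grad: torch.Tensor) -> torch.Tensor:
    """Nearest orthogonal matrix to `grad` in Frobenius norm (polar factor U V^T)."""
    u, _, vh = torch.linalg.svd(grad, full_matrices=False)
    return u @ vh

if __name__ == "__main__":
    torch.manual_seed(0)
    # Features with a strongly skewed spectrum -> ill-conditioned covariance.
    feats = torch.randn(64, 32) @ torch.diag(torch.logspace(0, -4, 32))
    print("covariance condition number:", covariance_condition_number(feats).item())

    g = torch.randn(32, 32)
    q = nearest_orthogonal(g)
    print("orthogonality error:", torch.linalg.norm(q.T @ q - torch.eye(32)).item())
```

Under these assumptions, the orthogonal replacement of the gradient has unit singular values, so the update it drives cannot further skew the spectrum of the Pre-SVD weights the way a raw gradient with a large dynamic range can.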

arxiv covariance cv meta svd
