April 8, 2024, 4:42 a.m. | Mohammed Ghaith Altarabichi, Sławomir Nowaczyk, Sepideh Pashami, Peyman Sheikholharam Mashhadi, Julia Handl

cs.LG updates on arXiv.org

arXiv:2404.03992v1 Announce Type: new
Abstract: This paper investigates how various randomization techniques impact Deep Neural Networks (DNNs). Randomization techniques such as weight noise and dropout help reduce overfitting and improve generalization, but their interactions are poorly understood. The study categorizes randomness techniques into four types and proposes two new methods: adding noise to the loss function and randomly masking gradient updates. Using a Particle Swarm Optimizer (PSO) for hyperparameter optimization, it explores optimal configurations across the MNIST, FASHION-MNIST, CIFAR10, and CIFAR100 datasets. Over …
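
The abstract does not specify how the two proposed techniques are applied; below is a minimal PyTorch sketch of one plausible reading. Multiplicative noise on the loss is assumed (an additive constant on the scalar loss would leave the gradients unchanged), and the gradient mask is assumed to be an element-wise Bernoulli mask. The values LOSS_NOISE_STD and GRAD_KEEP_PROB are illustrative placeholders, not values from the paper.

import torch
import torch.nn as nn

# Illustrative hyperparameters (assumptions; the paper tunes such values with PSO).
LOSS_NOISE_STD = 0.1   # std of the multiplicative Gaussian noise on the loss
GRAD_KEEP_PROB = 0.9   # probability of keeping each gradient entry

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128),
                      nn.ReLU(), nn.Linear(128, 10))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def train_step(x, y):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # (1) Loss-function noise: randomly rescale the loss before backprop,
    # which randomly rescales every gradient for this step.
    noisy_loss = loss * (1.0 + torch.randn(()) * LOSS_NOISE_STD)
    noisy_loss.backward()
    # (2) Random gradient masking: zero out a random subset of gradient
    # entries before the optimizer applies the update.
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                mask = (torch.rand_like(p.grad) < GRAD_KEEP_PROB).float()
                p.grad.mul_(mask)
    optimizer.step()
    return loss.item()

# Usage on a dummy MNIST-shaped batch:
# x = torch.randn(32, 1, 28, 28); y = torch.randint(0, 10, (32,))
# train_step(x, y)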

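For the PSO side, a bare-bones global-best particle swarm over two randomness hyperparameters might look like the sketch below. The swarm size, inertia and acceleration constants, search bounds, and the toy objective are all assumptions standing in for the paper's actual setup, where the objective would be the validation error of a network trained with the candidate configuration.

import numpy as np

rng = np.random.default_rng(0)
lo, hi = np.array([0.0, 0.5]), np.array([0.5, 1.0])  # [noise_std, keep_prob]
n_particles, n_iters = 10, 20
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.full(n_particles, np.inf)
gbest, gbest_val = pos[0].copy(), np.inf

def objective(params):
    # Toy surrogate; in the paper's setting this would train and
    # validate a DNN using the candidate hyperparameters.
    noise_std, keep_prob = params
    return (noise_std - 0.1) ** 2 + (keep_prob - 0.9) ** 2

for _ in range(n_iters):
    for i in range(n_particles):
        val = objective(pos[i])
        if val < pbest_val[i]:
            pbest_val[i], pbest[i] = val, pos[i].copy()
        if val < gbest_val:
            gbest_val, gbest = val, pos[i].copy()
    # Standard velocity update: inertia + cognitive + social terms.
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)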
