June 14, 2023, 10:04 a.m. | /u/ArlindKadra

Deep Learning | www.reddit.com

**Github:** [https://github.com/releaunifreiburg/DPL](https://github.com/releaunifreiburg/DPL)

**Paper:** [https://arxiv.org/abs/2302.00441](https://arxiv.org/abs/2302.00441)

**Abstract:**

>Hyperparameter optimization is an important subfield of machine learning that focuses on tuning the hyperparameters of a chosen algorithm to achieve peak performance. Recently, a stream of methods has tackled hyperparameter optimization; however, most of them do not exploit the scaling-law property of learning curves. In this work, we propose Deep Power Laws (DPL), an ensemble of neural network models conditioned to yield predictions that follow a …
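The paper's core observation is that learning curves tend to follow a power law, so a partially observed curve can be extrapolated to predict final performance. DPL itself does this with an ensemble of neural networks whose outputs are constrained to follow a power law (see the paper for the exact parameterization); the snippet below is only a minimal NumPy sketch of the underlying idea, fitting a simple power law to synthetic learning-curve points and extrapolating it. All names and the data are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical learning-curve points: validation error after t epochs.
# A power law error(t) = a * t^(-b) becomes linear in log-log space:
# log(error) = log(a) - b * log(t), so ordinary least squares recovers a and b.
t = np.array([1, 2, 4, 8, 16, 32], dtype=float)
err = 0.8 * t ** -0.5  # noise-free synthetic curve with a=0.8, b=0.5

coeffs = np.polyfit(np.log(t), np.log(err), deg=1)
b_hat = -coeffs[0]          # decay exponent
a_hat = np.exp(coeffs[1])   # scale factor

# Extrapolate the fitted law to a training budget not yet observed (t = 128),
# e.g. to decide early whether a hyperparameter configuration is worth the budget.
pred_128 = a_hat * 128.0 ** -b_hat
print(a_hat, b_hat, pred_128)
```

In a hyperparameter-optimization loop, such extrapolated final errors let the search discard unpromising configurations after only a few epochs; DPL replaces this single least-squares fit with a neural-network ensemble that also quantifies predictive uncertainty.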

