[D] performance of dropout in RNN.
April 18, 2023, 8:27 a.m. | /u/Mundane_Definition_8
Machine Learning www.reddit.com
I have run a variety of experiments applying dropout to RNN and Dense layers, but the most useful dropout rate was always 0, meaning that not using dropout at all was the best option.
It surely depends on the kind of time-series problem, but I am curious why the approach doesn't produce any good results in the …
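One common pitfall worth ruling out is where the dropout mask is applied: sampling a fresh mask on the recurrent state at every timestep tends to disrupt the hidden dynamics, whereas "variational" dropout reuses one mask across all timesteps of a sequence. A minimal NumPy sketch of the two variants for a simple tanh RNN (all function names and shapes here are illustrative, not from the post):

```python
import numpy as np

def inverted_dropout_mask(shape, rate, rng):
    # Inverted dropout: zero units with probability `rate`,
    # scale survivors by 1/(1-rate) so the expected value is unchanged.
    keep = 1.0 - rate
    return (rng.random(shape) < keep).astype(float) / keep

def run_rnn(x, W, U, rate=0.0, variational=False, rng=None):
    # x: (T, input_dim); simple tanh RNN with dropout on the hidden state.
    rng = rng or np.random.default_rng(0)
    T, _ = x.shape
    h = np.zeros(W.shape[1])
    if variational and rate > 0:
        # One mask shared across every timestep of the sequence.
        mask = inverted_dropout_mask(h.shape, rate, rng)
    for t in range(T):
        if rate > 0 and not variational:
            # Fresh mask per timestep: this is the variant that often
            # hurts recurrent models.
            mask = inverted_dropout_mask(h.shape, rate, rng)
        h_in = h * mask if rate > 0 else h
        h = np.tanh(x[t] @ W + h_in @ U)
    return h
```

In Keras terms, the analogous knobs on an LSTM/GRU layer are `dropout` (on inputs) versus `recurrent_dropout` (on the recurrent state); if only a per-timestep scheme was tried, the variational variant may behave differently.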
Tags: computer vision, dropout, GRU, LSTM, machine learning, performance, RNN, time series