April 21, 2023, 9:31 a.m. | /u/AggressiveCarrot4438

Deep Learning www.reddit.com

I think dropout is a good technique for avoiding overfitting: it encourages the kernels in a model to make similar contributions, which from one perspective is similar in behavior to channel attention.

But why is the current research hotspot focused more on channel attention? I remember dropout has even been proven to have some good theoretical properties.
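For concreteness, here is a minimal sketch (assuming PyTorch; the `ChannelAttention` module below is an illustrative squeeze-and-excitation style block, not the method of any specific paper) of how the two mechanisms differ: dropout randomly zeroes channels during training, while channel attention keeps every channel but rescales it by a learned weight.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Illustrative squeeze-and-excitation style channel attention."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # "Squeeze": global average pool to one value per channel.
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))              # (b, c)
        # "Excite": learn a weight in (0, 1) for each channel.
        w = self.fc(w).view(b, c, 1, 1)     # (b, c, 1, 1)
        return x * w                        # reweight, never hard-zero

x = torch.randn(2, 16, 8, 8)
dropped = nn.Dropout2d(p=0.5)(x)    # training mode: random channels zeroed
attended = ChannelAttention(16)(x)  # all channels kept, rescaled by learned weights
```

So both modulate per-channel contributions, but dropout does it with a fixed random mask (a regularizer) while channel attention does it with an input-dependent learned gate (extra capacity), which may be part of why the two get different research attention.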
