When/Where do we require non-learnable parameters in deep learning?
Jan. 19, 2022, 2:29 a.m. | /u/Conanobrain
Deep Learning www.reddit.com
While going through the PyTorch documentation, I came across the requires_grad attribute of tensors being set to True/False, where the author discusses scenarios in which we wouldn't want the network to learn certain parameters. As far as I know, only pooling involves no parameter updates, but that's because pooling layers have no parameters at all.
What are the scenarios when we have non-learnable parameters?
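One common scenario is freezing part of a network (e.g. a pretrained backbone) so its weights keep their values while the rest of the model trains. A minimal sketch of that idea, using a small hypothetical model (the layer sizes and structure here are illustrative, not from the post):

```python
import torch
import torch.nn as nn

# Tiny illustrative model: two Linear layers with a ReLU in between.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Freeze the first Linear layer: with requires_grad=False, autograd
# computes no gradients for these parameters, so an optimizer step
# would leave them unchanged.
for p in model[0].parameters():
    p.requires_grad = False

# One forward/backward pass on random input.
loss = model(torch.randn(3, 4)).sum()
loss.backward()

print(model[0].weight.grad)              # None: frozen layer got no gradient
print(model[2].weight.grad is not None)  # True: trainable layer did
```

In practice the frozen parameters are also excluded from the optimizer, e.g. by passing only `(p for p in model.parameters() if p.requires_grad)` to it.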