June 15, 2024, 3:46 p.m. | /u/Shenoxlenshin

Deep Learning www.reddit.com

I know that neural networks are universal approximators given a sufficient number of neurons, but other things can be universal approximators too, such as a Taylor series of high enough order.

So my question is: why can't we just optimize some high-parameter-count (or high-dimensional) function instead? I'm using a Taylor series just as an example; it could be any type of high-dimensional function, and they can all be tuned …
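The idea in the question can be sketched concretely: treat a truncated polynomial (a stand-in for a Taylor series; the degree, interval, and target function below are arbitrary choices for illustration) as a parametric model whose coefficients are the "high parameter count" being optimized, here by ordinary least squares.

```python
import numpy as np

# Sample the target function we want to approximate.
x = np.linspace(-np.pi, np.pi, 200)
y = np.sin(x)

# "Optimizing a high-parameter function": the parameters are the
# 10 coefficients of a degree-9 polynomial, fit by least squares.
coeffs = np.polyfit(x, y, deg=9)
y_hat = np.polyval(coeffs, x)

print(f"max abs error: {np.max(np.abs(y - y_hat)):.2e}")
```

This works well in one dimension, which is part of why the question is interesting: the catch people usually point to is that fixed polynomial bases scale poorly with input dimension, while neural networks adapt their basis functions during training.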

