May 22, 2023, 11:26 a.m. | /u/PepperGrind

Deep Learning www.reddit.com

Hi All,

I took this architecture diagram from a mock exam (masters level):

https://preview.redd.it/sly99gdyad1b1.png?width=623&format=png&auto=webp&v=enabled&s=1ce14ee783107e68b6d7c5eb77f0a57a1e691e53

I find the "L3" layer, the one labelled "MLP", very odd. I know MLP stands for "Multi-Layer Perceptron", which is essentially an old-fashioned term for a feed-forward network. What confuses me is that it squeezes the dimensions from 512 down to 1. Why? I thought perhaps it does this because it's a binary classification problem, but that doesn't make sense because of …
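For context, here's a minimal PyTorch sketch of what I assume such a 512 → 1 MLP head would look like. The hidden size of 128, the batch size, and the sigmoid at the end are my own guesses for illustration, not something shown in the diagram:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of an "MLP" head that squeezes 512 features down to
# a single scalar per example (hidden size 128 is a guess, not from the diagram).
mlp_head = nn.Sequential(
    nn.Linear(512, 128),
    nn.ReLU(),
    nn.Linear(128, 1),   # output dimension 1, as in the diagram
)

x = torch.randn(8, 512)        # batch of 8 feature vectors from the previous layer
logits = mlp_head(x)           # shape: (8, 1)
probs = torch.sigmoid(logits)  # only meaningful if this really is binary classification
print(probs.shape)
```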

architecture deeplearning exam mlp mock multi layer perceptron network perceptron rnn
