Web: https://www.reddit.com/r/LanguageTechnology/comments/ui9em7/why_do_context_vectors_tend_to_have_dimensions/

May 4, 2022, 3:38 p.m. | /u/creamyjoshy


Hello. Bit of a newbie here.

With an RNN, for example, the context vector that the encoder outputs tends to have a dimensionality like 256, 512, or 1024. Why is this? Why not round decimal numbers like 100, 500, or 1000? I'm guessing these powers of two are somehow the most computationally efficient sizes?
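For concreteness, here is a minimal sketch (using PyTorch; the vocabulary size, batch size, and sequence length are toy values made up for illustration) of the kind of encoder and context vector in question:

```python
import torch
import torch.nn as nn

vocab_size = 10_000    # toy vocabulary size (made up for illustration)
embed_dim = 256        # embedding width; a power of two
hidden_size = 512      # the context vector dimensionality in question

embedding = nn.Embedding(vocab_size, embed_dim)
encoder = nn.GRU(embed_dim, hidden_size, batch_first=True)

# a fake batch: 8 sequences of 20 token ids each
tokens = torch.randint(0, vocab_size, (8, 20))
embedded = embedding(tokens)       # shape: (8, 20, 256)
outputs, h_n = encoder(embedded)   # h_n shape: (1, 8, 512)

# h_n[-1] is the final hidden state for each sequence -- the "context
# vector" -- with 512 dimensions rather than, say, 500 or 1000.
print(h_n[-1].shape)   # torch.Size([8, 512])
```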

Thanks!
