Help with seq2seq models with attention
June 1, 2023, 8:36 a.m. | /u/Calcifer777
Deep Learning www.reddit.com
I have the encoder embeddings with dimension (N, L_e, E) and the decoder inputs with dimension (N, L_d, E), where N is the batch size, L_e and L_d are the encoder and decoder sequence lengths, and E is the embedding size.
I'm working in PyTorch; I would like to apply an nn.MultiheadAttention layer to the encoder embeddings and pass them to the …
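A minimal sketch of what the post seems to be asking about: using `nn.MultiheadAttention` as cross-attention, with the decoder inputs as queries and the encoder embeddings as keys and values. The sizes (N=4, L_e=10, L_d=7, E=32) and `num_heads=4` are illustrative assumptions, not from the post.

```python
import torch
import torch.nn as nn

# Hypothetical sizes, following the post's notation:
N, L_e, L_d, E = 4, 10, 7, 32   # batch, encoder len, decoder len, embed dim

enc = torch.randn(N, L_e, E)    # encoder embeddings, shape (N, L_e, E)
dec = torch.randn(N, L_d, E)    # decoder inputs,     shape (N, L_d, E)

# batch_first=True keeps tensors as (N, seq, E) rather than (seq, N, E)
cross_attn = nn.MultiheadAttention(embed_dim=E, num_heads=4, batch_first=True)

# Cross-attention: decoder states are the queries; encoder embeddings
# supply both keys and values.
out, weights = cross_attn(query=dec, key=enc, value=enc)

print(out.shape)      # (N, L_d, E): one attended vector per decoder position
print(weights.shape)  # (N, L_d, L_e): attention weights, averaged over heads
```

Note the output keeps the decoder's sequence length, so it can be fed straight into the next decoder sub-layer.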
Tags: attention, decoder, deeplearning, embeddings, encoder, implementation, seq2seq