Help with seq2seq models with attention
June 1, 2023, 8:36 a.m. | /u/Calcifer777
Deep Learning www.reddit.com
I have the encoder embeddings with dimension (N, L_e, E) and the decoder inputs with dimension (N, L_d, E). N is the batch size; L_e and L_d are the encoder and decoder sequence lengths; and E is the embedding size.
I'm working in PyTorch; I would like to apply an nn.MultiheadAttention layer to the encoder embeddings and pass them to the …
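A minimal sketch of what the question seems to be after: using `nn.MultiheadAttention` as the cross-attention step of a seq2seq decoder, where the query comes from the decoder inputs and the keys/values come from the encoder embeddings. The concrete sizes and `num_heads` below are placeholders, not values from the post.

```python
import torch
from torch import nn

# Placeholder sizes matching the shapes in the question:
# N = batch, L_e/L_d = encoder/decoder lengths, E = embedding dim.
N, L_e, L_d, E = 2, 7, 5, 16

enc = torch.randn(N, L_e, E)  # encoder embeddings, (N, L_e, E)
dec = torch.randn(N, L_d, E)  # decoder inputs,     (N, L_d, E)

# batch_first=True lets the layer accept (N, L, E) tensors directly;
# the default layout is (L, N, E).
cross_attn = nn.MultiheadAttention(embed_dim=E, num_heads=4, batch_first=True)

# Cross-attention: query from the decoder, key/value from the encoder.
out, weights = cross_attn(query=dec, key=enc, value=enc)

print(out.shape)      # (N, L_d, E): one context vector per decoder position
print(weights.shape)  # (N, L_d, L_e): attention over encoder positions
```

Note that `embed_dim` must be divisible by `num_heads`, and the attention weights returned are averaged over heads unless `average_attn_weights=False` is passed.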
attention decoder deeplearning embeddings encoder implementation seq2seq