Sentiment analysis with adaptive multi-head attention in Transformer
March 12, 2024, 4:52 a.m. | Fanfei Meng, Chen-Ao Wang
cs.CL updates on arXiv.org
Abstract: We propose a novel attention-based framework to identify the sentiment of a movie review document. Previous work on deep neural networks with attention mechanisms uses encoders and decoders with a fixed number of attention heads; what is missing is a mechanism that stops the attention process automatically once no more useful information can be read from memory. In this paper, we propose an adaptive multi-head attention architecture (AdaptAttn) which varies the number …
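The abstract is truncated before the mechanism is spelled out, but the idea of halting the attention process once additional heads stop contributing can be illustrated. Below is a minimal sketch of one way to vary the number of active heads at run time, using an ACT-style cumulative halting score; the function name, the halting weights `w_halt`, and the threshold are illustrative assumptions, not the paper's actual AdaptAttn design.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_attention(x, Wq, Wk, Wv, w_halt, threshold=0.99):
    """Run attention heads one at a time, stopping (ACT-style) once the
    cumulative halting score exceeds `threshold`.

    x:          (seq_len, d_model) input representations
    Wq/Wk/Wv:   (max_heads, d_model, d_head) per-head projections
    w_halt:     (max_heads, d_head) weights giving a scalar halt score per head
                (a hypothetical gating parameter, assumed for this sketch)
    Returns (output, n_heads_used); output is the running sum of head outputs.
    """
    max_heads, _, d_head = Wq.shape
    out = np.zeros((x.shape[0], d_head))
    cum_halt = 0.0
    for h in range(max_heads):
        q, k, v = x @ Wq[h], x @ Wk[h], x @ Wv[h]
        attn = softmax(q @ k.T / np.sqrt(d_head))   # scaled dot-product attention
        out += attn @ v
        # Sigmoid halting score computed from the mean-pooled running output.
        score = 1.0 / (1.0 + np.exp(-(out.mean(axis=0) @ w_halt[h])))
        cum_halt += score
        if cum_halt >= threshold:                   # enough information read
            return out, h + 1
    return out, max_heads

# Toy usage with random projections.
rng = np.random.default_rng(0)
d_model, d_head, max_heads, seq_len = 16, 8, 6, 5
x = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(max_heads, d_model, d_head)) * 0.1
Wk = rng.normal(size=(max_heads, d_model, d_head)) * 0.1
Wv = rng.normal(size=(max_heads, d_model, d_head)) * 0.1
w_halt = rng.normal(size=(max_heads, d_head))
out, n_heads = adaptive_attention(x, Wq, Wk, Wv, w_halt)
```

The point of the sketch is only the control flow: each additional head refines the representation, and a learned scalar decides when further heads add nothing, so documents of different difficulty can consume different amounts of attention computation.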