Local Attention Graph-based Transformer for Multi-target Genetic Alteration Prediction. (arXiv:2205.06672v2 [cs.CV] UPDATED)
cs.CV updates on arXiv.org
Classical multiple instance learning (MIL) methods are often based on the
assumption that instances are independent and identically distributed (i.i.d.),
hence neglecting the potentially rich contextual information beyond individual
entities. On the other hand, Transformers with global self-attention modules
have been proposed to model the interdependencies among all instances. However,
in this paper we question: Is global relation modeling using self-attention
necessary, or can we appropriately restrict self-attention calculations to
local regimes in large-scale whole slide images (WSIs)? We propose a
general-purpose …
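
As a rough illustration of the idea the abstract raises, restricting self-attention to local regimes, the sketch below masks attention so that each WSI patch instance only attends to its k nearest neighbours in slide coordinates. This is a hedged, hypothetical example in PyTorch; the function and class names (knn_mask, LocalSelfAttention) and all parameter choices are assumptions for illustration, not the paper's actual architecture.

```python
# Hypothetical sketch: local self-attention over WSI patch instances,
# where each patch may only attend to its k nearest neighbours.
# Not the paper's implementation; names and parameters are illustrative.
import torch
import torch.nn.functional as F


def knn_mask(coords: torch.Tensor, k: int) -> torch.Tensor:
    """Boolean (N, N) mask: True where attention is allowed (k-NN neighbourhood)."""
    dists = torch.cdist(coords, coords)              # pairwise patch distances
    knn = dists.topk(k + 1, largest=False).indices   # k neighbours plus self
    mask = torch.zeros_like(dists, dtype=torch.bool)
    mask[torch.arange(coords.shape[0]).unsqueeze(1), knn] = True
    return mask


class LocalSelfAttention(torch.nn.Module):
    """Single-head self-attention restricted to a local neighbourhood mask."""

    def __init__(self, dim: int):
        super().__init__()
        self.qkv = torch.nn.Linear(dim, 3 * dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.masked_fill(~mask, float("-inf"))  # block non-local pairs
        return F.softmax(attn, dim=-1) @ v


# Example usage: 500 patch embeddings (dim 128) with 2-D slide coordinates.
feats = torch.randn(500, 128)
coords = torch.rand(500, 2)
out = LocalSelfAttention(128)(feats, knn_mask(coords, k=8))
print(out.shape)  # torch.Size([500, 128])
```

With a dense mask this still computes the full N x N score matrix before masking, so it only illustrates the local-attention restriction itself; a scalable variant would exploit the sparsity of the neighbourhood graph.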