Self-supervised Implicit Glyph Attention for Text Recognition. (arXiv:2203.03382v2 [cs.CV] UPDATED)
Aug. 17, 2022, 1:12 a.m. | Tongkun Guan, Chaochen Gu, Jingzheng Tu, Xue Yang, Qi Feng, Yudi Zhao, Wei Shen
cs.CV updates on arXiv.org arxiv.org
The attention mechanism has become the de facto module in scene text
recognition (STR) methods, due to its capability of extracting character-level
representations. These methods can be categorized as implicit-attention-based
or supervised-attention-based, depending on how the attention is computed:
implicit attention and supervised attention are learned from sequence-level
text annotations and character-level bounding-box annotations, respectively.
Implicit attention, as it may extract coarse or even incorrect spatial regions
as character attention, is prone to suffering …
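The distinction the abstract draws can be illustrated with a minimal sketch of implicit attention in an STR decoder: a generic additive (Bahdanau-style) attention over the flattened feature map of a visual backbone, where the attention weights are never supervised directly but emerge from training on sequence-level text labels alone. This is a simplified illustration, not the paper's method; the shapes and parameter names (`W_f`, `W_s`, `v`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Feature map from a visual backbone, flattened to N positions x D channels.
# Dimensions are illustrative, not from the paper.
N, D, H = 64, 32, 16                      # positions, feature dim, decoder hidden dim
feats = rng.standard_normal((N, D))

# Additive attention parameters. In an implicit-attention STR model these are
# trained end-to-end from sequence-level text annotations only -- no
# character-level bounding boxes supervise the attention map itself.
W_f = rng.standard_normal((D, H)) * 0.1
W_s = rng.standard_normal((H, H)) * 0.1
v   = rng.standard_normal(H) * 0.1

def attend(state):
    """One decoding step: score every spatial position against the decoder state."""
    scores = np.tanh(feats @ W_f + state @ W_s) @ v   # (N,) unnormalized scores
    alpha = softmax(scores)                           # attention map over positions
    glimpse = alpha @ feats                           # (D,) character-level feature
    return alpha, glimpse

alpha, glimpse = attend(rng.standard_normal(H))       # one character's glimpse
```

Because nothing constrains `alpha` to cover exactly one character's pixels, the learned map can drift onto coarse or wrong regions, which is the failure mode the abstract describes for implicit attention.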