April 13, 2022, 1:12 a.m. | Lukas Galke, Ansgar Scherp

cs.LG updates on arXiv.org

Graph neural networks have triggered a resurgence of graph-based text
classification methods, defining today's state of the art. We show that a wide
multi-layer perceptron (MLP) using a Bag-of-Words (BoW) representation outperforms the recent
graph-based models TextGCN and HeteGCN in an inductive text classification
setting and is comparable with HyperGAT. Moreover, we fine-tune a
sequence-based BERT and a lightweight DistilBERT model, which both outperform
all state-of-the-art models. These results question the importance of synthetic
graphs used in modern text classifiers. In …
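
Below is a minimal sketch of the Bag-of-Words + MLP baseline the abstract describes, using scikit-learn on the 20 Newsgroups benchmark. The hidden width, vocabulary size, and training settings are illustrative assumptions, not the authors' exact WideMLP configuration or datasets.

```python
# Sketch: Bag-of-Words features feeding a wide multi-layer perceptron.
# Hyperparameters below are assumptions for illustration only.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

# Load a standard text-classification benchmark (20 Newsgroups).
train = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))
test = fetch_20newsgroups(subset="test", remove=("headers", "footers", "quotes"))

model = make_pipeline(
    CountVectorizer(max_features=30000),       # BoW vocabulary cap (assumed value)
    MLPClassifier(hidden_layer_sizes=(1024,),  # single "wide" hidden layer (assumed width)
                  max_iter=20),
)
model.fit(train.data, train.target)
print("test accuracy:", accuracy_score(test.target, model.predict(test.data)))
```

The point of the sketch is that the whole pipeline is a sparse word-count matrix followed by one wide hidden layer, with no document graph constructed at any stage.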

