Jan. 4, 2022, 2:10 a.m. | Radostin Cholakov, Todor Kolev

cs.LG updates on arXiv.org arxiv.org

There is increasing interest in applying deep learning
architectures to tabular data. One of the state-of-the-art solutions is
TabTransformer, which incorporates an attention mechanism to better track
relationships between categorical features and then uses a standard MLP
to produce its final logits. In this paper we propose multiple modifications to
the original TabTransformer that perform better on binary classification tasks,
yielding AUROC gains of more than 1% on three separate datasets. Inspired by gated
MLP, linear projections …
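To make the described architecture concrete, below is a minimal pure-Python sketch of a TabTransformer-style forward pass: each categorical feature is embedded, a self-attention layer mixes the feature embeddings, and a small MLP maps the flattened result to a binary-classification logit. All names (`TinyTabTransformer`, the weight shapes, the toy cardinalities) are illustrative assumptions, not the authors' implementation.

```python
# Toy TabTransformer-style forward pass (hypothetical sketch, not the
# paper's code): embed categoricals -> self-attention -> MLP -> logit.
import math
import random

random.seed(0)

def rand_matrix(rows, cols):
    """Small random weight matrix as a list of lists."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
            for _ in range(rows)]

def matvec(m, v):
    """Matrix-vector product."""
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def softmax(xs):
    mx = max(xs)
    exps = [math.exp(x - mx) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class TinyTabTransformer:
    def __init__(self, cardinalities, d=8, hidden=16):
        # One embedding table per categorical column.
        self.tables = [rand_matrix(card, d) for card in cardinalities]
        # Single-head attention projections.
        self.wq = rand_matrix(d, d)
        self.wk = rand_matrix(d, d)
        self.wv = rand_matrix(d, d)
        n = len(cardinalities)
        self.w1 = rand_matrix(hidden, n * d)  # MLP layer 1
        self.w2 = rand_matrix(1, hidden)      # MLP layer 2 -> logit
        self.d = d

    def forward(self, cats):
        # 1) Embed each categorical value (index into its table).
        embs = [table[c] for table, c in zip(self.tables, cats)]
        # 2) Scaled dot-product self-attention across the features.
        qs = [matvec(self.wq, e) for e in embs]
        ks = [matvec(self.wk, e) for e in embs]
        vs = [matvec(self.wv, e) for e in embs]
        scale = math.sqrt(self.d)
        attended = []
        for q in qs:
            scores = softmax([sum(a * b for a, b in zip(q, k)) / scale
                              for k in ks])
            attended.append([sum(w * v[i] for w, v in zip(scores, vs))
                             for i in range(self.d)])
        # 3) Flatten contextualized embeddings and run the MLP head.
        flat = [x for e in attended for x in e]
        h = [max(0.0, x) for x in matvec(self.w1, flat)]  # ReLU
        return matvec(self.w2, h)[0]                       # final logit

model = TinyTabTransformer(cardinalities=[4, 3, 5])
logit = model.forward([2, 0, 4])  # one row: an index into each column
```

The paper's gated-MLP-inspired modifications would replace or augment step 3, but since the abstract is truncated the sketch keeps the standard MLP head of the original TabTransformer.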

architecture, arxiv, deep learning, learning, modeling
