March 30, 2023, 8:43 a.m. | Sophie Henning

AIhub | aihub.org

Figure 1: Modern Transformer-based natural language processing (NLP) methods still struggle with class imbalance: per-class performance (second row; each dot represents one class) drops as class frequency in the training data (first row) falls, across a variety of NLP tasks. Datasets/scores: TACRED, Zhou and Chen [2021]; USPTO, Pujari et al. [2021]; PDTB, Shi and Demberg [2019]; UD-EWT, […]
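The kind of analysis shown in Figure 1 can be reproduced for any classifier: count how often each class appears in the training data, compute a per-class score (e.g. F1) on test predictions, and compare the two. A minimal sketch with hypothetical toy labels (the class names and counts below are illustrative, not from the datasets in the figure):

```python
from collections import Counter

def per_class_f1(gold, pred):
    """Per-class F1 from parallel lists of gold and predicted labels."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for g, p in zip(gold, pred):
        if g == p:
            tp[g] += 1
        else:
            fp[p] += 1
            fn[g] += 1
    scores = {}
    for c in set(gold) | set(pred):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        scores[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return scores

# Toy data illustrating the typical pattern: the frequent class is
# predicted well, while the rare class is missed entirely.
train = ["frequent"] * 90 + ["rare"] * 10
gold = ["frequent"] * 9 + ["rare"]
pred = ["frequent"] * 10  # classifier never predicts the rare class

train_freq = Counter(train)
f1 = per_class_f1(gold, pred)
for c in sorted(train_freq):
    print(f"{c}: train count = {train_freq[c]}, F1 = {f1[c]:.2f}")
```

Plotting training count against per-class F1 for every class (one dot per class, as in the figure) makes the imbalance effect visible at a glance.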

