Aug. 18, 2022, 1:11 a.m. | Haosen Ge, In Young Park, Xuancheng Qian, Grace Zeng

cs.CL updates on arXiv.org

High-quality text data has become an important data source for social scientists. We have witnessed the success of pretrained deep neural network models, such as BERT and RoBERTa, in recent social science research. In this paper, we propose a compact pretrained deep neural network, Transformer Encoder for Social Science (TESS), explicitly designed to tackle text processing tasks in social science research. Using two validation tests, we demonstrate that TESS outperforms BERT and RoBERTa by 16.7% on average when the number …
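The workflow the abstract describes, fine-tuning a pretrained transformer encoder for a social science text task, can be illustrated with a minimal sketch. The example below uses RoBERTa via the Hugging Face transformers API as a stand-in; the paper does not list a public model ID in this excerpt, so the idea that a TESS checkpoint could be dropped in the same way is an assumption.

```python
# Minimal sketch: fine-tuning a pretrained transformer encoder for
# binary text classification. "roberta-base" is used as a stand-in;
# substituting a TESS checkpoint here is a hypothetical, not a
# released model ID from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "roberta-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=2
)

# Toy social-science-flavored inputs with made-up labels.
texts = [
    "The senator voted against the infrastructure bill.",
    "Markets rallied after the central bank's announcement.",
]
labels = torch.tensor([0, 1])

# Tokenize a batch and run a forward pass that also computes the loss.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)

# In a real fine-tuning loop this loss would drive an optimizer step.
outputs.loss.backward()
print(float(outputs.loss))
```

A compact encoder like the one the paper proposes would slot into the same pipeline; the appeal of a smaller domain-specific model is cheaper fine-tuning and inference on the modest corpora typical of social science research.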

arxiv encoder science social social science transformer
