April 17, 2023, 8:22 p.m. | Ngoc Dang Nguyen, Wei Tan, Wray Buntine, Richard Beare, Changyou Chen, Lan Du

cs.CL updates on arXiv.org arxiv.org

Current work in named entity recognition (NER) uses either cross entropy (CE)
or conditional random fields (CRF) as the objective/loss function to optimize
the underlying NER model. Both of these traditional objectives generally
produce adequate performance when the data distribution is balanced and
sufficient annotated training examples are available. But because NER is an
inherently imbalanced tagging problem, model performance in low-resource
settings can suffer under these standard objective functions. Based on recent …
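
As a rough illustration of the imbalance the abstract describes, here is a minimal PyTorch sketch of the standard token-level CE objective for NER, where the dominant "O" tag swamps the loss, alongside a simple frequency-based re-weighting. This is not the paper's proposed method; the tag set, toy labels, and weighting scheme are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): token-level cross entropy for NER.
# The abundance of "O" tokens illustrates the class imbalance noted above.
import torch
import torch.nn as nn

tags = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]  # hypothetical tag set
num_tags = len(tags)

# Fake emissions from some encoder: (batch, seq_len, num_tags)
batch, seq_len = 2, 6
logits = torch.randn(batch, seq_len, num_tags)

# Gold labels: mostly "O" (index 0), as is typical in NER corpora
labels = torch.zeros(batch, seq_len, dtype=torch.long)
labels[0, 2] = 1  # one B-PER token
labels[1, 4] = 3  # one B-LOC token

# Standard (unweighted) CE: every token contributes equally, so the
# abundant "O" tokens dominate the gradient signal.
ce = nn.CrossEntropyLoss()
loss_ce = ce(logits.view(-1, num_tags), labels.view(-1))

# One common mitigation (illustrative only): re-weight classes inversely
# to their frequency in the batch.
counts = torch.bincount(labels.view(-1), minlength=num_tags).float()
weights = 1.0 / counts.clamp(min=1.0)
weighted_ce = nn.CrossEntropyLoss(weight=weights)
loss_weighted = weighted_ce(logits.view(-1, num_tags), labels.view(-1))

print(f"unweighted CE: {loss_ce.item():.4f}  weighted CE: {loss_weighted.item():.4f}")
```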

