Predicting Issue Types with seBERT. (arXiv:2205.01335v1 [cs.SE])
Web: http://arxiv.org/abs/2205.01335
May 4, 2022, 1:11 a.m. | Alexander Trautsch, Steffen Herbold
cs.LG updates on arXiv.org
Pre-trained transformer models are the current state of the art for natural
language processing. seBERT is one such model, based on the BERT architecture
but trained from scratch on software engineering data. We fine-tuned this model
for the task of issue type prediction in the NLBSE challenge. Our model
outperforms the fastText baseline for all three issue types in both recall and
precision, achieving an overall F1-score of 85.7%, an increase of 4.1% over the
baseline.
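The reported overall F1-score combines per-class precision and recall across the three issue types. A minimal sketch of that computation, using hypothetical per-class counts (illustrative numbers only, not the paper's actual results):

```python
# Hypothetical true-positive / false-positive / false-negative counts for
# three issue-type classes; the labels follow the NLBSE challenge, but the
# numbers are made up for illustration.
counts = {
    "bug":         {"tp": 90, "fp": 10, "fn": 12},
    "enhancement": {"tp": 80, "fp": 15, "fn": 10},
    "question":    {"tp": 70, "fp": 12, "fn": 14},
}

def f1(tp, fp, fn):
    """F1 is the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Per-class F1, then the macro average over all classes.
per_class = {label: f1(**c) for label, c in counts.items()}
macro_f1 = sum(per_class.values()) / len(per_class)
print({k: round(v, 3) for k, v in per_class.items()}, round(macro_f1, 3))
```

How per-class scores are aggregated into the headline number (macro vs. micro averaging) matters when classes are imbalanced, as issue trackers usually are; the sketch above shows the simpler macro average.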