Predicting Issue Types with seBERT. (arXiv:2205.01335v1 [cs.SE])
Pre-trained transformer models are the current state-of-the-art for natural
language processing. seBERT is such a model, developed on the BERT
architecture but trained from scratch with software engineering data.
We fine-tuned this model for the NLBSE challenge on the task of issue type
prediction. Our model outperforms the fastText baseline for all three issue
types in both recall and precision, achieving an overall F1-score of 85.7%,
an increase of 4.1% over the baseline.
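The fine-tuning setup described above can be illustrated with a minimal sketch using Hugging Face Transformers. The abstract does not give the exact seBERT checkpoint id, so "bert-base-uncased" stands in for it here; the label set, the "issues.json" data file, and its "text"/"label" fields are likewise assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of fine-tuning a BERT-style model for issue type
# classification. "bert-base-uncased" is a stand-in for seBERT, whose
# hub id is not given in the abstract; labels follow the three issue
# types commonly used in the NLBSE challenge (assumed here).
from transformers import (AutoTokenizer,
                          AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import load_dataset

LABELS = ["bug", "enhancement", "question"]  # assumed label set

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS))

def tokenize(batch):
    # Issue title and body are assumed to be concatenated into "text".
    return tokenizer(batch["text"], truncation=True, max_length=512)

# "issues.json" is a placeholder for the challenge data, with fields
# "text" (issue content) and "label" (0..2 index into LABELS).
dataset = load_dataset("json", data_files="issues.json")["train"]
dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sebert-issues",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=dataset,
    tokenizer=tokenizer,  # enables default padding collator
)
trainer.train()
```

Passing the tokenizer to the Trainer gives the default padding collator, so variable-length issue texts are batched without manual padding logic.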