April 1, 2024, 3:12 a.m. | /u/Seankala

Machine Learning | www.reddit.com

I recently built custom BERT and ELECTRA models for the fashion domain that handle both English and my native language (I'm not in the US). Performance wasn't as good as I anticipated, and I'm starting to feel the effort wasn't worth it.
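
For context, the from-scratch approach I tried is the heavier alternative to simply continuing pre-training an existing multilingual checkpoint on in-domain text. Here's a minimal sketch of that lighter option, assuming HuggingFace Transformers/Datasets; the checkpoint name and corpus path are placeholders, not my actual setup:

```python
# Minimal sketch of domain-adaptive (continued) pre-training with the
# masked-language-modeling objective. Checkpoint name and corpus path
# are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "bert-base-multilingual-cased"  # placeholder multilingual base
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# In-domain plain-text corpus, one document per line (placeholder path).
dataset = load_dataset("text", data_files={"train": "fashion_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking of 15% of tokens, the standard BERT MLM recipe.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="fashion-mlm", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

This is the kind of baseline I'd want to compare against before committing to training from scratch.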

Are there any papers or resources on when it's worth it to pre-train your own LM from scratch? I recall reading a paper for the biomedical domain a while ago titled [_Pretrained Language Models …
