April 17, 2024, 1:40 p.m. | /u/LelouchZer12

Machine Learning www.reddit.com

Hi

I am aware that a lot of transformer encoder variants exist (BERT, DistilBERT, DeBERTa, RoBERTa, ...).

However, I am not interested in the best ones overall (that would probably be DeBERTa V3), but rather the ones that quickly get decent results even with very few training examples (~50–100 sentences, each containing maybe 1, 2 or 3 entities).

* I have done a few experiments in **english**, and to my surprise it seems that the one that performs best …
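For context on what such a few-shot NER fine-tuning run involves regardless of which encoder is chosen: the sentence-level entity annotations have to be expanded from word-level BIO tags to subword tokens before training. Below is a minimal sketch of that alignment step; the `toy_subwords` splitter is a hypothetical stand-in for a real WordPiece/BPE tokenizer (in practice you would use the model's own tokenizer), so only the alignment logic is the point here.

```python
# Sketch: aligning word-level BIO labels to subword tokens, the standard
# preprocessing step when fine-tuning BERT-style encoders for NER.

def toy_subwords(word):
    """Crude 4-char splitter standing in for WordPiece (illustration only)."""
    pieces = [word[i:i + 4] for i in range(0, len(word), 4)]
    return [pieces[0]] + ["##" + p for p in pieces[1:]]

def align_labels(words, labels):
    """Expand word-level BIO labels to the subword level.

    The first subword keeps the word's label; continuation subwords of a
    B-ENT word get I-ENT so the entity span stays contiguous.
    """
    tokens, token_labels = [], []
    for word, label in zip(words, labels):
        pieces = toy_subwords(word)
        tokens.extend(pieces)
        token_labels.append(label)
        cont = "I" + label[1:] if label.startswith("B-") else label
        token_labels.extend([cont] * (len(pieces) - 1))
    return tokens, token_labels

words = ["Transformers", "by", "HuggingFace"]
labels = ["B-ORG", "O", "B-ORG"]
tokens, token_labels = align_labels(words, labels)
# e.g. "Transformers" -> ["Tran", "##sfor", "##mers"]
#      with labels      ["B-ORG", "I-ORG", "I-ORG"]
```

With only ~50–100 sentences, getting this alignment right matters more than usual, since every mislabeled subword is a noticeable fraction of the training signal.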

