Sept. 18, 2022, 4:36 a.m. | /u/kastilyo


Hello all!

I have been reading about methodologies to better understand fine-tuning with BERT and came across this paper:

"A Fine-Tuned BERT-Based Transfer Learning Approach for Text Classification" https://www.hindawi.com/journals/jhe/2022/3498123/

I was wondering if someone could help set my head straight on encoding methods, particularly Section 4.3 (Encoding) and the "Encoding" column of Figure 1.

I have not seen this approach in fine-tuning for text classification before. For example, when I see applications using Hugging Face's BertTokenizer, it tokenizes, converts to …
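For context, this is the usual Hugging Face flow I have in mind (a minimal sketch, assuming bert-base-uncased with the transformers and torch libraries; the texts and labels are made-up toy examples):

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Pretrained tokenizer plus a classification head on top of BERT
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["the movie was great", "the movie was terrible"]  # toy inputs
labels = torch.tensor([1, 0])  # hypothetical binary labels

# Tokenize: split into WordPiece tokens, map them to vocabulary IDs,
# add [CLS]/[SEP], pad/truncate, and build the attention mask
encodings = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")

# Fine-tuning forward pass: the encoded IDs go straight into the model
outputs = model(**encodings, labels=labels)
loss, logits = outputs.loss, outputs.logits
loss.backward()  # then step an optimizer such as AdamW
```

So the tokenizer output (input IDs plus attention mask) is what gets fed to the model directly, which is why the separate "Encoding" step described in the paper confused me.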

Tags: encoding, languagetechnology
