May 18, 2022, 1:40 a.m. | /u/scp-8989


I aim to:

1. Add an additional module to the BERT architecture (huggingface's `transformers`)
2. Load BERT's pretrained weights into the model with the new architecture
3. Then use the model directly, or continue training it

I'm very confused about how to do this, since we usually call `from_pretrained` directly to load both the weights and the architecture from huggingface.

In more detail, I'm working with both [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) and [bert-base-uncased](https://huggingface.co/bert-base-uncased). A minimal sketch of what I have in mind is below.
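To make the question concrete, here's a sketch of the pattern I think applies, following how the library's own heads (e.g. `BertForSequenceClassification`) subclass `BertPreTrainedModel`. The class name `BertWithExtraModule` and the `extra` head are placeholders for whatever new module is being added:

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertPreTrainedModel


class BertWithExtraModule(BertPreTrainedModel):
    """BERT backbone plus one extra, randomly initialized module."""

    def __init__(self, config):
        super().__init__(config)
        # Keeping the attribute named `bert` matters: from_pretrained
        # matches checkpoint keys by name (base_model_prefix is "bert").
        self.bert = BertModel(config)
        # Hypothetical extra module: a small feed-forward head on [CLS].
        self.extra = nn.Sequential(
            nn.Linear(config.hidden_size, config.hidden_size),
            nn.Tanh(),
        )
        # Random init for everything; pretrained weights overwrite the
        # backbone when the model is loaded via from_pretrained.
        self.post_init()

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        outputs = self.bert(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
        )
        cls = outputs.last_hidden_state[:, 0]  # [CLS] representation
        return self.extra(cls)


# Backbone weights load from the checkpoint; the `extra.*` keys are not
# in it, so that module stays randomly initialized (with a warning).
model = BertWithExtraModule.from_pretrained("prajjwal1/bert-tiny")

# Continue training as usual; all parameters require grad by default.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
```

If I understand the loading machinery correctly, `from_pretrained` matches checkpoint keys by name, so the backbone receives the pretrained weights while the new module's keys are reported as missing and keep their random initialization.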

architecture bert languagetechnology
