HuggingFace question: RoBERTa equivalent of BertForPreTraining?
Oct. 31, 2022, 3:10 a.m. | /u/akardashian
Natural Language Processing | www.reddit.com
In short, I am trying to initialize a model from a pre-trained encoder and its LM head. For BERT, I can use BertModel and grab the LM head from BertForPreTraining. I am trying to do the same for RoBERTa, but there is no equivalent RobertaForPreTraining in HF. I am thinking about grabbing the LM head from RobertaForMaskedLM instead, but I am not sure if it would be compatible with the RoBERTa …
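Since RoBERTa drops BERT's next-sentence-prediction objective, masked language modeling is its entire pre-training task, which is why there is no RobertaForPreTraining class; the head on RobertaForMaskedLM plays the same role as the MLM head inside BertForPreTraining. A minimal sketch of pulling the encoder and head apart (the attribute names roberta and lm_head follow the transformers implementation at the time of writing; verify against your installed version):

    import torch
    from transformers import RobertaForMaskedLM, RobertaTokenizer

    # RobertaForMaskedLM bundles the encoder (.roberta) and the MLM head (.lm_head).
    mlm = RobertaForMaskedLM.from_pretrained("roberta-base")
    encoder = mlm.roberta   # RobertaModel encoder
    lm_head = mlm.lm_head   # RobertaLMHead: dense -> GELU -> LayerNorm -> vocab decoder

    # Sanity check: run the encoder and the head separately.
    tok = RobertaTokenizer.from_pretrained("roberta-base")
    inputs = tok("The capital of France is <mask>.", return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state   # (batch, seq_len, hidden)
        logits = lm_head(hidden)                       # (batch, seq_len, vocab_size)

Note that the head's decoder weights are tied to the input embeddings in the HF implementation, so a head taken from the same checkpoint is compatible with that encoder's vocabulary by construction.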
More from www.reddit.com / Natural Language Processing
Multilabel text classification on unlabeled data
2 days, 13 hours ago | www.reddit.com
AI-proof language-related jobs in the United States?
4 days, 18 hours ago | www.reddit.com
Anyone working on mathematics of transformers?
1 week, 2 days ago | www.reddit.com