BART with fine-tuned GPT-2 model
May 30, 2022, 9:13 a.m. | /u/Boglbert
Natural Language Processing www.reddit.com
I was therefore wondering whether it is possible to train a German or multilingual GPT-2 model for language modeling and insert it into a BART model as the decoder.
The BART paper describes its decoder as "GPT-like", which made me wonder whether this decoder is exchangeable. That would allow me to use a GPT-2 …
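One way to prototype the idea of pairing an encoder with a GPT-2 decoder is Hugging Face's `EncoderDecoderModel`, which wires cross-attention from any decoder-capable model into an encoder. The sketch below is a minimal, hypothetical illustration using tiny randomly initialized configs (not BART itself, and not pretrained German weights); in practice one would warm-start from checkpoints, e.g. `EncoderDecoderModel.from_encoder_decoder_pretrained(...)` with a German GPT-2 checkpoint as the decoder.

```python
# Sketch: attach a GPT-2-style decoder to a BERT-style encoder via
# transformers' EncoderDecoderModel. Tiny random configs are used here
# purely for illustration; real use would load pretrained checkpoints.
import torch
from transformers import (BertConfig, GPT2Config,
                          EncoderDecoderConfig, EncoderDecoderModel)

enc_cfg = BertConfig(hidden_size=64, num_hidden_layers=2,
                     num_attention_heads=2, intermediate_size=128)
# The decoder must expose cross-attention so it can attend to the encoder.
dec_cfg = GPT2Config(n_embd=64, n_layer=2, n_head=2,
                     add_cross_attention=True, is_decoder=True)

config = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=config)

# Dummy forward pass with random token ids (encoder and decoder inputs).
input_ids = torch.randint(0, enc_cfg.vocab_size, (1, 8))
out = model(input_ids=input_ids, decoder_input_ids=input_ids[:, :4])
print(out.logits.shape)  # one logit vector per decoder position
```

This does not reproduce BART's denoising pretraining, but it shows that the decoder side is mechanically swappable: any causal LM with cross-attention support can sit in the decoder slot.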