May 30, 2022, 9:13 a.m. | /u/Boglbert

Natural Language Processing www.reddit.com

Hi, I am currently working on German abstractive summarisation. My goal is a custom abstractive summarisation model that has learned a particular summarisation style.

I was therefore wondering whether it is possible to train a German or multilingual GPT-2 model for language modelling and insert it into a BART model as the decoder.

The BART paper describes its decoder as "GPT-like", which made me wonder whether this decoder is exchangeable. That would allow me to use a GPT-2 …
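One common way to get this kind of encoder/decoder mix is Hugging Face's `EncoderDecoderModel`, which can wire a GPT-2-style decoder (with cross-attention enabled) onto a separate encoder. A minimal sketch, assuming the `transformers` library; the tiny randomly initialised configs below are only there so the wiring can be checked without downloading checkpoints, and the pretrained model names in the comment are examples, not a recommendation:

```python
# Sketch: attaching a GPT-2-style decoder to a BERT-style encoder via
# Hugging Face's EncoderDecoderModel. Sizes are tiny and random-weight
# on purpose, so the plumbing can be verified without any downloads.
import torch
from transformers import (
    BertConfig,
    GPT2Config,
    EncoderDecoderConfig,
    EncoderDecoderModel,
)

# Tiny encoder and decoder configs; hidden sizes must match (32 == 32),
# otherwise a projection layer between them would be needed.
enc_cfg = BertConfig(
    vocab_size=100, hidden_size=32, num_hidden_layers=2,
    num_attention_heads=2, intermediate_size=64,
)
dec_cfg = GPT2Config(vocab_size=100, n_embd=32, n_layer=2, n_head=2)
dec_cfg.is_decoder = True
dec_cfg.add_cross_attention = True  # decoder must attend to the encoder

cfg = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=cfg)

# For real summarisation you would warm-start from pretrained checkpoints
# instead, e.g. (model names are assumptions, pick your own):
# model = EncoderDecoderModel.from_encoder_decoder_pretrained(
#     "bert-base-german-cased", "dbmdz/german-gpt2")

input_ids = torch.randint(0, 100, (1, 8))
out = model(input_ids=input_ids, decoder_input_ids=input_ids)
print(out.logits.shape)  # (batch, target_seq_len, vocab_size)
```

Note this pairs GPT-2 with a BERT-style encoder rather than reusing BART's own encoder; grafting GPT-2 directly into a pretrained BART checkpoint would additionally require reconciling tokenizers and cross-attention weights, which are randomly initialised here and would need fine-tuning either way.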

bart gpt gpt-2 languagetechnology
