May 30, 2022, 9:13 a.m. | /u/Boglbert

Natural Language Processing www.reddit.com

Hi, I am currently working on German abstractive summarisation. My goal is a custom-style abstractive summarisation model that has learned a particular style of summarising.

I was therefore wondering whether it is possible to train a German or multilingual GPT-2 model for language modelling and insert it into a BART model as the decoder.

The BART paper describes its decoder as "GPT-like", which made me wonder whether this decoder is exchangeable. That would allow me to use a GPT-2 …
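One way to prototype this pairing without modifying BART internally is Hugging Face's `EncoderDecoderModel`, which wires any encoder to any causal-LM decoder (including GPT-2) via cross-attention. This is a minimal sketch with tiny, randomly initialized configs purely for illustration; the config sizes and the idea of substituting BERT-style encoder configs are my assumptions, and in practice you would load pretrained German checkpoints instead:

```python
import torch
from transformers import (BertConfig, GPT2Config,
                          EncoderDecoderConfig, EncoderDecoderModel)

# Tiny configs for a quick structural test -- sizes are arbitrary assumptions,
# not a recommendation. Real use would start from pretrained weights.
enc_cfg = BertConfig(vocab_size=1000, hidden_size=64, num_hidden_layers=2,
                     num_attention_heads=2, intermediate_size=128)
dec_cfg = GPT2Config(vocab_size=1000, n_embd=64, n_layer=2, n_head=2)

# from_encoder_decoder_configs flags the GPT-2 side as a decoder and enables
# cross-attention so it can attend to the encoder's hidden states.
config = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=config)

src = torch.randint(0, 1000, (1, 8))  # "document" token ids
tgt = torch.randint(0, 1000, (1, 4))  # "summary" token ids
out = model(input_ids=src, decoder_input_ids=tgt)
print(out.logits.shape)  # (batch, target_len, vocab_size)
```

For pretrained weights, `EncoderDecoderModel.from_encoder_decoder_pretrained(...)` accepts an encoder checkpoint and a GPT-2 checkpoint by name; if their hidden sizes differ, the library inserts a projection for the cross-attention.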

bart gpt gpt-2 languagetechnology
