March 29, 2024, 6 a.m. | /u/ApartmentEither4838

Machine Learning www.reddit.com

Seq2seq architectures were designed for sequence prediction and are naturally SOTA in text generation, but are there any other non-trivial tasks where we can use them? For example, MeshGPT uses a GPT-style model for mesh generation, and diffusion transformers are also being studied now; in fact, Sora uses one. Are there many other applications where these models might be efficient and scalable?
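To make the MeshGPT example concrete, the key idea behind applying sequence models to non-text domains is tokenization: mapping continuous data (here, 3D vertex coordinates) onto a discrete vocabulary so a GPT-style model can predict the next token. Below is a minimal sketch of that idea; the 128-bin uniform grid, the coordinate range, and the flattening order are illustrative assumptions, not MeshGPT's actual tokenizer (which learns its codebook).

```python
def quantize(coord, n_bins=128, lo=-1.0, hi=1.0):
    """Map a float coordinate in [lo, hi] to an integer bin (token id)."""
    t = (coord - lo) / (hi - lo)              # normalize to [0, 1]
    return min(n_bins - 1, max(0, int(t * n_bins)))

def mesh_to_tokens(vertices, n_bins=128):
    """Flatten a list of (x, y, z) vertices into one token sequence
    that an autoregressive model could be trained on."""
    return [quantize(c, n_bins) for v in vertices for c in v]

def tokens_to_mesh(tokens, n_bins=128, lo=-1.0, hi=1.0):
    """Invert the quantization (lossy): tokens back to (x, y, z) triples,
    using each bin's center as the reconstructed value."""
    step = (hi - lo) / n_bins
    coords = [lo + (t + 0.5) * step for t in tokens]
    return [tuple(coords[i:i + 3]) for i in range(0, len(coords), 3)]

verts = [(0.0, 0.5, -0.5), (1.0, -1.0, 0.25)]
tokens = mesh_to_tokens(verts)        # e.g. [64, 96, 32, 127, 0, 80]
recovered = tokens_to_mesh(tokens)    # each coordinate within half a bin
```

Once the mesh is a token sequence, training and sampling look exactly like language modeling, which is why the same architectures transfer so directly.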

