April 19, 2024, 5:06 p.m. | /u/Few-Pomegranate4369

Machine Learning www.reddit.com

Hey everyone! It looks like in a few years the basic large language models (LLMs) we use will get commoditised, and it won't really matter which one you pick. The next big thing could be LLMs that use Retrieval-Augmented Generation (RAG), which depend on access to a large, high-quality corpus of external data to work well (a minimal sketch of the pattern is below).
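For anyone unfamiliar with RAG, here's a minimal sketch of the pattern: retrieve the documents most relevant to the query, then pack them into the prompt the LLM sees. This is why the size and quality of the retrieval corpus matters so much. TF-IDF stands in for a real embedding index here, and the corpus strings are made-up examples, not anything from the post.

```python
# Minimal RAG sketch: retrieve relevant documents, then build a grounded prompt.
# TF-IDF retrieval is a stand-in for a production embedding/vector index.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy retrieval corpus (illustrative only).
corpus = [
    "Google indexes billions of web pages through its search engine.",
    "Retrieval-augmented generation grounds LLM answers in external documents.",
    "Commodity LLMs differ little once trained on the same public data.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (TF-IDF cosine similarity)."""
    vectorizer = TfidfVectorizer()
    doc_matrix = vectorizer.fit_transform(docs)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    top = scores.argsort()[::-1][:k]
    return [docs[i] for i in top]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend the retrieved context so the model answers from the data it was given."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Who benefits most from RAG?", corpus))
```

The point of the sketch: whoever controls the biggest, freshest retrieval corpus controls the quality of the context the model answers from, which is where the Google question comes in.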

Given that Google has access to loads of data through its search engine, do you think they're in a better position to lead in this new phase compared …

