April 20, 2023, 6:37 p.m. | /u/Lewenhart87

Deep Learning | www.reddit.com

OpenAI's Sam Altman [just said](https://www.wired.com/story/openai-ceo-sam-altman-the-age-of-giant-ai-models-is-already-over/) that the next AI models will not be larger (i.e., have more parameters) than the current best ones. GPT-4, for example, reportedly has around 1T parameters.
Why?
Main reasons:
* The larger the model, the more data you need to train it, and there is simply not enough data in the world to train these super-large models (see the sketch after this list).
* The larger the model, the harder it is to understand its output, hence a higher risk of wrong results.
* Top scientists believe General …
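To make the data argument concrete: under the compute-optimal scaling heuristic from Hoffmann et al. 2022 ("Chinchilla"), you want roughly 20 training tokens per model parameter. Here is a minimal back-of-envelope sketch; the model sizes below, including the 1T GPT-4 figure from the post, are illustrative assumptions, not confirmed numbers:

```python
# Back-of-envelope check of the "not enough data" argument, using the
# compute-optimal heuristic from Hoffmann et al. 2022 ("Chinchilla"):
# roughly 20 training tokens per model parameter.

CHINCHILLA_TOKENS_PER_PARAM = 20  # empirical ratio from the Chinchilla paper

def required_tokens(n_params: float) -> float:
    """Tokens needed to train a model of n_params compute-optimally."""
    return CHINCHILLA_TOKENS_PER_PARAM * n_params

# Illustrative model sizes; the 1T figure is the post's claim, not confirmed.
for n_params in (70e9, 500e9, 1e12):
    print(f"{n_params / 1e9:>6.0f}B params -> "
          f"{required_tokens(n_params) / 1e12:.1f}T tokens")
```

A 1T-parameter model would therefore want on the order of 20T training tokens, while published estimates of the total stock of high-quality public text (e.g., Villalobos et al. 2022) are on the order of ten trillion tokens, so a single compute-optimal run at that scale would roughly exhaust it.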

