Jan. 29, 2024, 8:11 p.m. | /u/we_are_mammals

Machine Learning www.reddit.com






Symbolic AI is often seen as a failure. Cyc cost around $200M, as I recall (more than GPT-4's training budget?).

On the other hand, the apparent inherent limitations of Transformer LLMs [1] have made some people look toward symbolic, neuro-symbolic, and hybrid approaches again. DeepMind's CEO stated that the company had half a dozen projects in this space.

If you are interested in these topics (theoretical limitations of NNs, symbolic and neuro-symbolic AI), I made a subreddit for them: …

