Dec. 2, 2023, 1:23 p.m. | /u/30299578815310

Machine Learning www.reddit.com

The Bitter Lesson argues that learning and search are the winning strategies because they scale with computing power, and that they will generally outperform techniques that rely on encoding human knowledge.

What do you think of Tree of Thoughts (ToT) and similar techniques in light of this? Are they good examples of expanding the power of models via search, or are they attempts to force what we believe to be humanlike behavior onto models?
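To make the "search" framing concrete, here is a minimal sketch of what ToT-style search over model outputs generally looks like. This is not the implementation from any particular paper; `propose` and `evaluate` are hypothetical stand-ins for LLM calls that generate candidate next thoughts and score partial solutions.

```python
from typing import Callable, List, Tuple

def tot_bfs(
    root: str,
    propose: Callable[[str], List[str]],   # hypothetical: LLM proposes candidate next thoughts
    evaluate: Callable[[str], float],      # hypothetical: LLM or heuristic scores a partial solution
    beam_width: int = 3,
    max_depth: int = 4,
) -> str:
    """Breadth-first search over chains of thoughts, keeping only the
    top-scoring partial solutions (a beam) at each depth."""
    frontier: List[Tuple[float, str]] = [(evaluate(root), root)]
    for _ in range(max_depth):
        candidates: List[Tuple[float, str]] = []
        for _, state in frontier:
            for thought in propose(state):
                new_state = state + "\n" + thought
                candidates.append((evaluate(new_state), new_state))
        if not candidates:
            break
        # prune to the highest-scoring candidates before expanding further
        frontier = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
    # return the best chain of thoughts found
    return max(frontier, key=lambda c: c[0])[1]
```

The point of the sketch is that more compute buys a wider beam and deeper tree rather than more hand-coded knowledge, which is why ToT can be read as being on the "search" side of the Bitter Lesson.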



For reference, I've seen two main tree-based approaches used in the research. One …
