March 29, 2024, 9:54 p.m. | /u/Error40404

Data Science www.reddit.com

If we draw a weak analogy between LLMs and humans or other organisms, we find that LLMs are trained on vast amounts of data, so much that a human being may never process that much information over their developmental years. Of course, the brain could be seen as a foundation model that has been trained on all of its ancestors' data, which could be argued to be much more than all of the internet. But I don't think the …

