Feb. 15, 2024, 4:13 p.m. | /u/gggerr

Machine Learning www.reddit.com

Thought I'd start a thread for the community to brainstorm how Gemini 1.5 might be pulling off its long context:
- do folks reckon it could just be RingAttention scaled sufficiently? cf. https://largeworldmodel.github.io (rough sketch of the idea after this list)
- was it trained with a 1M or a 10M token window? That seemed unclear to me. Are they somehow generalizing beyond the trained length?
- what datasets exist that would enable training with a 10M-token text window?
- how do you do RLHF on context this long? 1M tokens ~ 4M chars ~ 272k seconds of reading time (assuming 68 ms/char, per Google) ~ 75 hours per example; the arithmetic is spelled out after the list.
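On the RingAttention point, here's a minimal single-process sketch of the idea as I understand it: shard the sequence into blocks, let KV blocks hop around a ring of "devices", and have each query block accumulate exact attention with a streaming softmax so nobody ever materializes the full seq x seq score matrix. Function name, shapes, and the loop-as-ring simulation are mine for illustration; the real LWM code is JAX with actual device-to-device collectives.

```python
import numpy as np

def ring_attention(q, k, v, n_blocks):
    """Exact attention computed block-by-block, as if KV shards
    were passed around a ring of n_blocks devices."""
    seq, d = q.shape
    qs, ks, vs = (np.split(x, n_blocks) for x in (q, k, v))
    out = []
    for qi in qs:                           # each "device" owns one query block
        m = np.full(qi.shape[0], -np.inf)   # running row max (numerical stability)
        l = np.zeros(qi.shape[0])           # running softmax denominator
        acc = np.zeros_like(qi)             # running weighted sum of V
        for ki, vi in zip(ks, vs):          # one KV block arrives per ring hop
            s = qi @ ki.T / np.sqrt(d)
            m_new = np.maximum(m, s.max(axis=-1))
            corr = np.exp(m - m_new)        # rescale earlier partial sums
            p = np.exp(s - m_new[:, None])
            l = l * corr + p.sum(axis=-1)
            acc = acc * corr[:, None] + p @ vi
            m = m_new
        out.append(acc / l[:, None])
    return np.concatenate(out)

# Sanity check against vanilla full attention.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((128, 16)) for _ in range(3))
s = q @ k.T / np.sqrt(16)
p = np.exp(s - s.max(-1, keepdims=True))
ref = p / p.sum(-1, keepdims=True) @ v
assert np.allclose(ring_attention(q, k, v, 8), ref)
```

The memory per device is per-block rather than per-sequence, which is why scaling the ring is a plausible route to very long windows.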
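And the reading-time math from the last bullet, spelled out (the ~4 chars/token ratio and the 68 ms/char figure are the assumptions stated above, not measurements):

```python
tokens = 1_000_000
chars = tokens * 4               # ~4 chars per token
seconds = chars * 0.068          # 68 ms per char
print(f"{seconds:,.0f} s ~= {seconds / 3600:.1f} hours per rating")
# -> 272,000 s ~= 75.6 hours per rating
```

So a single human preference label on a full 1M-token example is roughly two work weeks of reading, which is why naive RLHF seems implausible at this scale.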
