all AI news
LongLoRA: New method extends LLaMA2 7B to 100k context length, 70B to 32k context length on a single 8× A100 machine
Sept. 22, 2023, 2:20 p.m. | /u/Successful-Western27
Artificial Intelligence www.reddit.com
A new paper proposes [LongLoRA](https://arxiv.org/pdf/2309.12307.pdf), **a fine-tuning approach that can extend LLaMA2 7B to 100k context length and 70B model to 32k context length on a single 8× A100 machine.**
Here are my highlights from the paper:
The big one, of course: LongLoRA efficiently fine-tunes large language models on longer contexts
Key points: …
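LongLoRA's efficiency gain comes from restricting attention to fixed-size token groups during fine-tuning, with half the attention heads operating on groups shifted by half a group (the paper's shifted sparse attention). A minimal NumPy sketch of that grouping-and-shift idea, illustrative only; the function names here are my own, not from the paper's released code:

```python
import numpy as np

def shift_groups(x, group_size):
    """Roll the sequence by half a group so attention groups straddle
    the original group boundaries (used for half the heads).
    x: (seq_len, dim) token embeddings; seq_len divisible by group_size."""
    return np.roll(x, shift=-group_size // 2, axis=0)

def grouped_attention_scores(x, group_size):
    """Score attention only within fixed-size groups, cutting cost
    from O(n^2) to O(n * group_size) per sequence."""
    n, d = x.shape
    groups = x.reshape(n // group_size, group_size, d)
    # (num_groups, group_size, group_size) blocks of dot-product scores
    return np.einsum('gid,gjd->gij', groups, groups) / np.sqrt(d)

# Example: 8 tokens of dim 2, groups of 4
x = np.arange(16, dtype=float).reshape(8, 2)
plain = grouped_attention_scores(x, 4)
shifted = grouped_attention_scores(shift_groups(x, 4), 4)
print(plain.shape, shifted.shape)  # (2, 4, 4) (2, 4, 4)
```

Combining both views lets information flow across group boundaries while each attention call stays cheap; LoRA-style low-rank adapter updates (plus trainable embeddings and norms, per the paper) keep the fine-tuning itself lightweight.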