[R] LongLoRA: New method extends LLaMA2 7B to 100k context length, 70B to 32k context length on a single 8× A100 machine
Sept. 22, 2023, 2:22 p.m. | /u/Successful-Western27
Machine Learning www.reddit.com
A new paper proposes [LongLoRA](https://arxiv.org/pdf/2309.12307.pdf), **a fine-tuning approach that can extend LLaMA2 7B to 100k context length and the 70B model to 32k context length on a single 8× A100 machine.**
Here are my highlights from the paper:
The big one, of course: LongLoRA efficiently fine-tunes large language models on much longer contexts.
Key points: …
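For readers unfamiliar with the LoRA family of methods that LongLoRA builds on, here is a minimal NumPy sketch of the core idea: the pretrained weight matrix stays frozen, and only a small low-rank update (the product of two narrow matrices) is trained. All dimensions and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical dimensions, chosen only for illustration.
d_in, d_out, rank = 64, 64, 8

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))        # frozen pretrained weight
A = rng.normal(size=(rank, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, rank))               # zero-initialized, so the
                                          # adapter starts as a no-op

x = rng.normal(size=(4, d_in))            # a batch of activations

# Forward pass with the adapter: y = x W^T + (x A^T) B^T.
# Only A and B (d_in*rank + d_out*rank parameters) would be trained,
# versus d_in*d_out for full fine-tuning.
y = x @ W.T + (x @ A.T) @ B.T

# With B = 0 the adapted model reproduces the frozen model exactly.
assert np.allclose(y, x @ W.T)
```

LongLoRA's contribution (per the abstract) is making this kind of parameter-efficient tuning practical at very long context lengths; the sketch above only shows the generic low-rank adapter mechanism it starts from.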