Dec. 18, 2023, 5 p.m. | Last Week in AI

Last Week in AI (lastweekin.ai)

Microsoft's small 2.7B-parameter Phi-2 model proves surprisingly strong, Mistral releases its Mixtral 8x7B mixture-of-experts (MoE) model, Mamba may replace transformers for better performance, and cheap deepfakes run amok in election campaigns

