April 27, 2023, 1 p.m. | Payal Dhar

IEEE Spectrum spectrum.ieee.org



In the realm of artificial intelligence, bigger is supposed to be better. Neural networks with billions of parameters power everyday AI-based tools like ChatGPT and DALL-E, and each new large language model (LLM) edges out its predecessors in size and complexity. Meanwhile, at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), a group of researchers has been working on going small.

In recent research, they demonstrated the efficiency of a new kind of very small—20,000-parameter—machine learning …
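For a sense of the scale involved, here is a minimal back-of-the-envelope sketch in Python of what a roughly 20,000-parameter network looks like. The layer widths below are illustrative assumptions, not taken from the CSAIL work; the point is only the arithmetic of weight and bias counts versus a GPT-3-scale LLM.

```python
def dense_params(n_in: int, n_out: int) -> int:
    """Parameter count of one fully connected layer: weights plus biases."""
    return n_in * n_out + n_out

# Hypothetical layer widths (assumptions, not from the MIT paper),
# chosen so the total lands near 20,000 parameters.
widths = [32, 100, 96, 64, 4]
small_model = sum(dense_params(a, b) for a, b in zip(widths, widths[1:]))

llm = 175_000_000_000  # a GPT-3-scale LLM, for comparison
print(f"small model: {small_model:,} parameters")       # 19,464
print(f"LLM is roughly {llm // small_model:,}x larger")
```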

