Feb. 29, 2024, 8:20 a.m. | Nate Cibik

Towards Data Science - Medium | towardsdatascience.com

The Evolution of Model Compression in the LLM Era

Image by author using DALL-E 3.

The advent of transformers in 2017 set off a landslide of AI milestones, starting with the spectacular achievements of large language models (LLMs) in natural language processing (NLP), and quickly catalyzing advancement in other domains such as computer vision and robotics. The unification of NLP and computer vision problems into a common architecture accelerated efforts in learning joint vision-language representation spaces, which enabled the …
