June 26, 2024, 7:26 a.m. | Thomas Claburn

The Register - Software: AI + ML (www.theregister.com)

Watts down, doc: Boffins find machine learning models can function with more modest power requirements

Large language models can be made 50 times more energy efficient with alternative math and custom hardware, claim researchers at the University of California, Santa Cruz…
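The "alternative math" appears to refer to the UC Santa Cruz group's matmul-free language modeling work, in which full-precision matrix multiplications are replaced by additions and subtractions over ternary weights (values restricted to -1, 0, or +1), with custom FPGA hardware exploiting the cheaper arithmetic. As a rough illustration of the idea only, here is a minimal NumPy sketch of absmean-style ternary quantization and a multiplication-free linear layer; the function names, shapes, and quantization details are illustrative assumptions, not code from the paper:

```python
import numpy as np

def ternary_quantize(w, eps=1e-8):
    """Round weights to {-1, 0, +1}, scaled by their mean absolute value.
    (Absmean scaling is an assumption here, in the style of ternary LLM work.)"""
    scale = np.mean(np.abs(w)) + eps
    return np.clip(np.round(w / scale), -1, 1), scale

def matmul_free_linear(x, w_ternary, scale):
    """Dense layer using only additions and subtractions: with ternary
    weights, each output is (sum of inputs where w == +1) minus
    (sum of inputs where w == -1), so no multiplies are needed per element."""
    out = np.empty(w_ternary.shape[1])
    for j in range(w_ternary.shape[1]):
        col = w_ternary[:, j]
        out[j] = x[col == 1].sum() - x[col == -1].sum()
    return out * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))   # full-precision weights
x = rng.normal(size=8)        # input activations
w_t, s = ternary_quantize(w)
print(matmul_free_linear(x, w_t, s))  # rough approximation of x @ w
```

The efficiency claim rests partly on hardware: additions are far cheaper in silicon than floating-point multiplies, which is reportedly what the researchers' custom FPGA design takes advantage of.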
