March 28, 2024, 9:05 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast

Microsoft's Copilot AI service will soon be able to run locally on PCs, thanks to built-in neural processing units (NPUs) capable of more than 40 trillion operations per second (TOPS). Running more of Copilot locally should reduce lag and may improve performance and privacy. Currently, Copilot runs primarily in the cloud, which introduces delays even for small tasks. Intel's Lunar Lake chips, shipping in 2025, will have triple the NPU speed of its current chips. Microsoft is …
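To put the 40 TOPS figure in context, here is a rough back-of-envelope sketch. It assumes roughly 2 operations per model parameter per generated token (a common rule of thumb for transformer inference) and uses a hypothetical 7-billion-parameter model; neither figure comes from Microsoft or Intel.

```python
# Back-of-envelope estimate with illustrative assumptions, not vendor figures:
# transformer inference needs roughly 2 operations per parameter per token.

def peak_tokens_per_second(tops: float, params_billions: float) -> float:
    """Theoretical upper bound on tokens/sec for an NPU at a given TOPS rating."""
    ops_per_token = 2 * params_billions * 1e9  # ~2 ops per weight per token
    return (tops * 1e12) / ops_per_token

# A hypothetical 7B-parameter model on a 40 TOPS NPU:
rate = peak_tokens_per_second(40, 7)
print(round(rate))  # compute-bound peak only; real throughput is usually memory-bound
```

In practice memory bandwidth, quantization, and scheduling overhead dominate, so real on-device throughput would be far lower than this compute-only ceiling.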

