Microsoft's Copilot AI Service Boosted by Local PC Integration and Enhanced Features
The Artificial Intelligence Podcast
Microsoft's Copilot AI service will soon be able to run locally on PCs equipped with built-in neural processing units (NPUs) capable of more than 40 trillion operations per second (40 TOPS). Running more of Copilot on-device should reduce lag and could improve both performance and privacy. Today, Copilot runs primarily in the cloud, which introduces round-trip delays even for small tasks. Intel's Lunar Lake chips, shipping in 2025, will have triple the NPU speed of its current chips. Microsoft is …