March 25, 2024, 4:23 a.m. | /u/NuseAI

Artificial Intelligence www.reddit.com

- Apple researchers are investigating the use of AI to identify when a user is speaking to a device without requiring a trigger phrase like 'Siri'.

- The study trained a large language model on both speech transcripts and acoustic signals to detect patterns indicating that the user is addressing the device and needs assistance.

- The model showed promising results, outperforming audio-only and text-only baselines, with performance improving as model size increased (a minimal sketch of this kind of multimodal fusion follows the list).

- Eliminating the 'Hey Siri' prompt could raise concerns about privacy and constant listening …
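
The paper itself describes a large multimodal language model; the sketch below is only a rough illustration, assuming a simple fusion approach, of how a device-directed-speech detector could combine acoustic frames with an ASR token hypothesis into a single binary score. The class name `DeviceDirectedSpeechDetector`, the GRU encoders, and all dimensions are hypothetical placeholders, not Apple's actual architecture.

```python
import torch
import torch.nn as nn

class DeviceDirectedSpeechDetector(nn.Module):
    """Hypothetical multimodal classifier: fuses acoustic features and
    ASR-token embeddings, then scores whether an utterance is directed
    at the device (no trigger phrase required)."""

    def __init__(self, acoustic_dim=80, vocab_size=32000, hidden_dim=256):
        super().__init__()
        # Acoustic branch: e.g. log-mel filterbank frames -> pooled embedding
        self.acoustic_encoder = nn.GRU(acoustic_dim, hidden_dim, batch_first=True)
        # Text branch: embeddings of the ASR hypothesis tokens
        self.token_embedding = nn.Embedding(vocab_size, hidden_dim)
        self.text_encoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Fusion + binary head: "directed at device" vs. "background speech"
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, acoustic_frames, asr_tokens):
        # acoustic_frames: (batch, time, acoustic_dim)
        # asr_tokens:      (batch, seq_len) integer token ids
        _, a_state = self.acoustic_encoder(acoustic_frames)   # (1, batch, hidden)
        _, t_state = self.text_encoder(self.token_embedding(asr_tokens))
        fused = torch.cat([a_state[-1], t_state[-1]], dim=-1)  # (batch, 2*hidden)
        return self.classifier(fused).squeeze(-1)              # logits

# Toy usage: one 3-second utterance (300 frames) with a 12-token ASR hypothesis
model = DeviceDirectedSpeechDetector()
frames = torch.randn(1, 300, 80)
tokens = torch.randint(0, 32000, (1, 12))
prob = torch.sigmoid(model(frames, tokens))
print(f"P(directed at device) = {prob.item():.3f}")
```

The point of the fusion step is the one the summary highlights: acoustic cues (prosody, proximity, background noise) and textual cues (what was actually said) each miss cases on their own, so combining both signals is what lets a model decide, without a wake word, whether the speech was meant for the device.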

