April 23, 2024, 10:59 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast linktr.ee

The Ray-Ban Meta Smart Glasses now have multimodal AI, which lets the glasses' built-in assistant process photos, audio, and text. The feature rolled out after an early access program. The primary command is "Hey Meta, look and..." followed by the desired task. The glasses can identify plants, read signs in other languages, write Instagram captions, and provide information about landmarks. However, the AI is not always accurate and has its limitations. The glasses function with minimal wait time as they …

