April 23, 2024, 10:59 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast

The Ray-Ban Meta Smart Glasses now include multimodal AI, which lets the glasses' AI assistant process photos, audio, and text. The feature rolled out after an early access program. The primary voice command is "Hey Meta, look and..." followed by the desired task. The glasses can identify plants, read signs in other languages, write Instagram captions, and provide information about landmarks. However, the AI is not always accurate and has limitations. The glasses respond with minimal wait time as they …

