“Hey Meta”: New AI features reach Meta’s Ray-Ban smart glasses


As Google starts to revive the Google Glass concept, Meta is already one step ahead: new artificial intelligence features are coming to its glasses this summer. Ray-Ban smart glasses, made in partnership with Meta, are getting several powerful AI updates for US and Canadian users.

Through the Meta View app on a connected smartphone, Ray-Ban smart glasses users can say “Hey Meta, start live AI” to give Meta AI a live view of whatever they are looking at through their glasses.

Similar to Google’s Gemini demo, users can ask Meta AI conversational questions about what they are looking at and how to solve a problem. As an example, Meta showed the assistant suggesting butter alternatives based on what it sees when the user looks into their pantry.

Even without Live AI, you can still ask specific questions about the objects you are looking at.

In addition to the new seasonal looks, Ray-Ban smart glasses can use the “Hey Meta, start live translation” command to automatically translate languages including English, French, Italian, and Spanish. The glasses’ speakers play the translation as the other person speaks, so you don’t have to lift your phone, and you can also view a translated transcript of what they said.

Beyond these AI upgrades, the smart glasses can automatically post to Instagram or send messages via Messenger with the appropriate voice commands. New compatibility with music streaming services also lets you play songs from Apple Music, Amazon Music, and Spotify through the glasses instead of earphones.

Meta says the rollout of these new features will come this spring and summer, along with an update bringing object recognition to EU users next week.

Meta and Ray-Ban did not immediately respond to requests for further comment.


