Meta Ray-Ban smart glasses may get new AI features next month: What it means for users

Facebook-parent Meta is expected to add new AI features to its Ray-Ban smart glasses starting next month. According to a report by The New York Times, the multimodal AI features will enable the smart glasses to perform tasks like translation and identification of objects, animals and monuments. The report also notes that these new AI features have been in early access since last December and will soon be available to the public.

How these new AI features will work

As per the report, users will be able to activate the smart assistant of the Meta Ray-Ban glasses by saying “Hey Meta”. After activating the assistant, users can say a prompt or ask a question. The smart glasses will then respond to the prompt through the speakers built into the wearable.

The NYT report also offers a glimpse at how well Meta’s AI works. The Meta Ray-Ban smart glasses were tested in various scenarios, including at a grocery store, while driving, at museums and even at the zoo, the report noted.



As per the report, Meta’s AI was able to correctly identify pets and artwork. However, the smart glasses didn’t identify things accurately every time.

The report also found that the glasses struggled to identify zoo animals that were far away and behind cages. The smart wearable was also unable to properly identify an exotic fruit, called a cherimoya, even after multiple tries.

In tests of the AI translation feature, the glasses supported English, Spanish, Italian, French and German, the report added. The report also expects Meta to continue refining these features over time.


Currently, the AI features in the Ray-Ban Meta smart glasses are only available through an early access waitlist, and can be accessed only by users who own the device in the US.
