Upgrade Your Style and Tech Game with the AI-Powered Ray-Ban Meta Smart Glasses – Early Access Is Limited, So Act Fast

Mark Zuckerberg’s company, Meta, has announced that its Ray-Ban Meta smart glasses will be able to analyze your surroundings, much like Google Lens, starting in 2024. However, U.S. smart glasses owners can get a head start if they’re quick. The new multimodal AI features fold Bing search results into the answers to your queries. They are currently available via an early-access beta in the U.S. and will roll out globally next year.

To use the new feature, smart glasses owners simply say “Hey Meta,” wait for an acknowledging ding, and then say “Look and tell me [blank]” to get information about whatever they’re looking at. The glasses take a photo of the object and analyze it, using a combination of Meta’s AI tech and the Bing search engine to provide an answer.
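Conceptually, the interaction amounts to a simple capture-and-query pipeline. The sketch below is purely illustrative and is not Meta’s actual implementation; every function name in it is a hypothetical stand-in for the glasses’ internal components.

```python
# Hypothetical sketch of the "Look and tell me" flow. None of these
# functions correspond to a real Meta API; they are illustrative stubs.

def capture_photo() -> bytes:
    """Stand-in for the glasses' camera capture."""
    return b"<jpeg bytes>"

def query_multimodal_ai(image: bytes, question: str) -> str:
    """Stand-in for Meta's vision model combined with Bing-backed search."""
    return f"Answer to {question!r}, based on a {len(image)}-byte photo."

def handle_voice_command(command: str) -> str | None:
    # The wake phrase triggers an acknowledging ding; the
    # "look and tell me ..." pattern then kicks off photo analysis.
    prefix = "hey meta, look and tell me "
    if command.lower().startswith(prefix):
        question = command[len(prefix):]
        photo = capture_photo()
        return query_multimodal_ai(photo, question)
    return None  # not a recognized query

print(handle_voice_command("Hey Meta, look and tell me what landmark this is"))
```

The key point the sketch captures is that the glasses don’t stream video continuously: a single photo is taken at the moment of the query and sent off for analysis along with your question.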

The company’s announcement blog post says the AI-plus-Bing combination will be able to answer questions about sports scores, local landmarks, restaurants, stocks, and more. Meta notes that the multimodal AI features may not always get things right, and it is relying on beta testers to iron out any bugs.

The feature is currently available via an Early Access mode in the Meta View app, but is limited to a small number of people who opt in. Meta plans to fully launch it next year. This is particularly promising because it addresses one of the glasses’ previous weak points: struggling with more complicated queries.

Overall, the Ray-Ban Meta smart glasses have received positive feedback for their high-res portrait camera and solid audio. With the addition of these new AI features, they are poised to surpass most other smart glasses on the market. The glasses come in multiple styles and colors, and let users take high-res photos and share them instantly to Instagram.