© 2024 TechCrunch. All rights reserved. For personal use only.
During Wednesday’s Meta Connect event, CEO Mark Zuckerberg announced a slew of new AI-fueled features for the company’s Ray-Ban collaboration. The most interesting of the bunch is the addition of real-time translation through the glasses’ speakers.
Meta explains:
Soon, your glasses will be able to translate speech in real time. When you’re talking to someone speaking Spanish, French or Italian, you’ll hear what they say in English through the glasses’ open-ear speakers. Not only is this great for traveling, it should help break down language barriers and bring people closer together. We plan to add support for more languages in the future to make this feature even more useful.
The companies have yet to announce a timeline for this specific feature, but depending on implementation, it could be a tremendously useful addition to the livestreaming glasses.
Live translation has been a kind of holy grail for established hardware firms and startups alike. Google, notably, introduced a pair of concept glasses with a heads-up display capable of translating in real time. That, however, never made it past the prototype stage.
Meta has yet to announce the languages that will be initially available, though judging from the above statement, it seems the feature will at first be limited to Romance languages like Spanish, French, and Italian, translated into English.