Will Google actually be in a position to announce a development for its AR translation glasses at its big developer conference next week? So far, there have been no rumors or leaks to suggest it will, but in recent months it's become clear that Google Translate, and languages in general, remain one of its big priorities.

In March, Google's AI research scientists excitedly revealed more information about its Universal Speech Model (USM), which is at the heart of its plan to build an AI language model that supports 1,000 different languages. This USM, which Google describes as a "family of state-of-the-art speech models", is already used in YouTube to generate auto-translated captions in videos in 16 languages (below), allowing YouTubers to grow their global audience.

Still, Google has made advances elsewhere, revealing in February that Translate will soon become much better at understanding context. It will, for example, be able to understand if you're talking about ordering a bass (the fish) for dinner or ordering a bass (the instrument) for your band. Google added that Translate will, for a handful of supported languages, start using "the right turns of phrase, local idioms, or appropriate words depending on your intent", allowing translated sentences to match how a native speaker talks.

All of this again sounds like an ideal foundation for AR translation glasses that work a little more like Star Trek's Universal Translator, and less like the clunky, staccato experiences we've had in the past. Google already has plenty to talk about at Google IO 2023, from the new Google Pixel Fold to Google Bard, and the small matter of Android 14. But it does also feel like a now-or-never moment for its AR translation glasses to take a sizable step towards becoming reality.
TCL RayNeo X2s are the most convincing AR translation glasses (Image credit: TCL)

While there was a slight delay of two seconds between a person speaking and their question being translated into text at the bottom of the glasses, Matt Bolton was able to have a whole conversation with someone who was speaking entirely in Chinese. Ideally, both people need to be wearing AR glasses for a full conversation to take place, but it's a good start.

TCL isn't the only company to produce a working prototype of concept glasses similar to what Google showed last year. Oppo's Air Glass 2 managed to trump rivals like the Nreal Air and Viture One by offering a wireless design, which means you don't need a cable to pair the specs with your Oppo smartwatch or phone.

Sadly, the Air Glass 2 are unlikely to launch in western markets, and it's actually more likely that one of the other tech giants could steal Google's live-translation thunder. Cue Meta's screeching left-turn from the metaverse towards its intriguing side-hustle, announced in February 2022, to make a 'Universal Speech Translator'.

A graphic from Meta's original blog post about its plan to create a Universal Speech Translator. (Image credit: Meta)

As the name suggests, this project aims to use machine learning and AI to give everyone "the ability to communicate with anyone in any language", as Mark Zuckerberg claimed. Crucially for Google's AR translation glasses, Meta also promised that "with improved efficiency and a simpler architecture, direct speech-to-speech could unlock near human-quality real-time translation for future devices, like AR glasses".

With Meta apparently planning to launch those AR glasses, currently dubbed Project Nazare, sometime in 2024, the heat is definitely on for Google to get moving with its AR translation specs – and ideally that means announcing some more concrete news at Google IO 2023.