
Google Assistant can be activated with your gaze: the goodbye to ‘Ok Google’ is approaching

  • August 4, 2022

In December 2020, Google announced ‘Look To Speak’, an app designed to help people with disabilities speak through their eyes. Two years later, building on what was learned, this technology is becoming an integrated feature of products with Google Assistant.

The goal? No longer having to say the awkward ‘Ok Google’ to activate Google Assistant: instead, it can activate automatically when it catches our gaze. Google has explained how this technology works and the challenges it faced.

Assistant listens to you, but only if it’s “looking at you”

While ‘Ok Google’ remains one of the cornerstones of activating Google Assistant, Google’s technical description of the feature starts off strong: “In natural conversations, we don’t say people’s names every time we speak to them”.

Google wants Assistant’s interactions to be as human-like and natural as possible, including the ability to start talking to you when you make eye contact with the device. To achieve this, Google announced ‘Look and Talk’ at Google I/O 2022, and it is now rolling out as the first on-device Assistant model to analyze audio, video, and text simultaneously.

On the Google Nest Hub, the device that debuts this technology, building the model was not as simple as just activating Google Assistant: the feature only triggers if the model detects that we actually want to interact with it. To decide this, it analyzes the subject’s distance from the device, head orientation, gaze direction, whether that gaze indicates intent to start a conversation, and so on.

For these analyses, the model processes frames of both video and audio input to predict whether the user is talking to the device or simply going about their home (for example, detection should not trigger if we are talking to someone else at home). Voice input relies on Google’s Voice Match, so Assistant won’t engage with anyone whose voice it doesn’t recognize.
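The gating logic described above can be sketched in a few lines. Everything here is an illustrative assumption (the field names, thresholds, and fusion rule are invented for the example), not Google’s actual model, which is a learned multimodal network rather than hand-written rules:

```python
from dataclasses import dataclass

@dataclass
class VisualFrame:
    distance_m: float           # subject's distance from the device
    head_toward_device: bool    # head orientation cue
    gaze_on_device: bool        # gaze direction cue

@dataclass
class AudioFrame:
    speech_detected: bool
    voice_match_ok: bool        # speaker enrolled via Voice Match

def intent_score(v: VisualFrame, a: AudioFrame) -> float:
    """Combine per-frame visual and audio cues into a single intent score."""
    score = 0.0
    if v.distance_m <= 1.5:     # close enough for a conversation (assumed cutoff)
        score += 0.25
    if v.head_toward_device:
        score += 0.25
    if v.gaze_on_device:
        score += 0.25
    if a.speech_detected and a.voice_match_ok:
        score += 0.25
    return score

def should_activate(v: VisualFrame, a: AudioFrame, threshold: float = 0.75) -> bool:
    # Voice Match acts as a hard gate: an unknown voice never activates Assistant.
    if a.speech_detected and not a.voice_match_ok:
        return False
    return intent_score(v, a) >= threshold
```

For instance, a nearby user looking at the device while speaking with a recognized voice scores 1.0 and activates Assistant, while the same cues with an unrecognized voice are rejected outright.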

What is particularly interesting on the audio side is that the model detects that we are trying to query the assistant by analyzing non-lexical information: tone of voice, speaking rate, and some contextual signals are examined to determine whether we intend to make a query.
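As a toy illustration of what “non-lexical” analysis means, the sketch below extracts two crude prosodic features from raw audio samples (energy and zero-crossing rate, a rough voicing proxy) instead of recognizing any words. The feature choice and thresholds are assumptions made up for this example; the real system uses learned models over richer signals:

```python
def prosody_features(samples, sr=16000):
    """Crude prosodic features from a mono audio buffer (list of floats)."""
    energy = sum(s * s for s in samples) / len(samples)
    # Zero-crossing rate: voiced speech crosses zero far less often than noise.
    signs = [1 if s > 0 else -1 for s in samples]
    crossings = sum(1 for prev, cur in zip(signs, signs[1:]) if prev != cur)
    zcr = crossings / (len(samples) - 1)
    return {"energy": energy, "zcr": zcr}

def looks_like_query(feats, energy_min=1e-4, zcr_max=0.3):
    # Loud enough to be directed speech, and voiced rather than broadband noise.
    return feats["energy"] >= energy_min and feats["zcr"] <= zcr_max
```

A steady voiced tone passes both checks, while white noise fails the zero-crossing test, which is the flavor of distinction a non-lexical front end makes before any speech recognition runs.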

For now, the ‘Look and Talk’ feature is exclusive to the Nest Hub, but Google hasn’t ruled out bringing it to more of its devices.

Source: Xataka
