Google Assistant is about to get seriously smart.
At the search giant’s annual I/O developer conference, CEO Sundar Pichai unveiled a new technology called ‘Duplex’ that enables its Google Assistant to make phone calls in real time with actual humans.
It can book a hair appointment and reserve a table for you at your favorite restaurant, among other things.
That’s on top of a slew of other tools that Google says can make it easier than ever for you to interact with your smart devices.
Duplex, which Pichai says the firm has been working on ‘for many years’, will be rolling out to a limited number of users for now.
Pichai showed off the new technology at the I/O conference, which kicked off on Tuesday in Mountain View, California and runs through Thursday.
In a demo, Google Assistant dials up a local hair salon to schedule an appointment.
First, a user asks Google Assistant to make them a hair appointment, which prompts Assistant to make the phone call.
It sounds like any other woman talking, but one half of the conversation is being held by Google’s AI-infused digital assistant.
Google Assistant is able to work out a time and date for the appointment, even when the salon employee says there are no appointments available at the time the Assistant originally suggested.
The Assistant even replied ‘Mhm’ in a natural, believable way when the employee asked her to wait a moment.
A second demo showed how Assistant can book a reservation at a restaurant.
In a live conversation, Assistant is able to field many questions and even knows to ask how long the wait is without being prompted.
Assistant then sends a notification to the user to let them know that an appointment has been scheduled.
The technology is poised to bring big changes to how we interact with our voice-activated devices, which is why Google executives told CNET that the firm will ‘proceed with caution’ in rolling out the technology to everyone.
Google said it used a combination of natural language processing and machine learning, among other technologies, to develop Duplex.
‘The Assistant can actually understand the nuances of conversation,’ Pichai explained.
‘We are still developing this technology and are working hard to get it right’.
Additionally, Google revealed that it’s launching six new voices for the Assistant to make it more natural and conversational.
The goal is to capture dialects, languages and accents more accurately and realistically on a global scale, the firm said.
What stole the show, however, was when Google announced that award-winning R&B artist John Legend would be voicing some commands for Google Assistant.
Legend’s voice will only appear in ‘certain contexts’, like singing Happy Birthday, as he was shown demonstrating in a video.
Google revealed a bevy of new features for the Assistant that make it capable of understanding the ‘social dynamics’ of a conversation.
‘It gets a little annoying to say ‘Hey Google’ to get Google Assistant’s attention,’ Scott Huffman, Google’s vice president of engineering, told the audience.
‘It shouldn’t be so hard. Now you won’t have to say ‘Hey Google’ every time’.
Huffman announced that Google Assistant is getting new features called ‘Continued Conversation’ and ‘Multiple Actions’.
Continued Conversation is rolling out in the coming weeks, while Multiple Actions is available now.
With Continued Conversation, users don’t have to say ‘Hey Google’ each time they make a request.
For example, if you ask Google Assistant what the weather is, you don’t have to repeat the wake word to ask a follow-up question, such as whether it will rain.
‘You can have a natural back and forth conversation without having to repeat Hey Google for every follow up request,’ Huffman said.
Multiple Actions enables Google Assistant to answer several commands at once.
In a demo, Huffman showed how he can ask Google Assistant to turn on the Golden State Warriors game and turn on the popcorn machine.
Essentially, the AI is capable of recognizing and answering multiple requests that are nested into one question.
Google also wants to change how children interact with its digital assistant.
The firm is rolling out a feature later this year that rewards kids for talking politely to Google Assistant.
With the feature, the Assistant will thank your child each time they say please, giving an encouraging reply like ‘Thanks for asking so nicely’ or ‘What a nice way to ask me’.
Google also announced that it’s incorporating new visual cues inside your phone to make communicating with your Assistant even more interactive.
When you ask Assistant to do something, it will bring up new visual elements on your screen.
BATTLE OF THE HOME AI
Google’s $130 (£105) Home speaker is triggered by the phrase ‘Hey Google’ while Amazon’s Echo uses ‘Alexa’.
Amazon’s smart speaker is available in two versions – the full sized $180 (£145) Echo shown here, and a smaller, $50 (£40) version called the Echo Dot.
Amazon Echo relies on Microsoft’s Bing search engine and Wikipedia, while Google Home uses the company’s own Google Search.
Both Home and Echo are continually listening for commands, though Google and Amazon say nothing gets passed back to them until the speakers hear a keyword — ‘OK, Google’ for Home and ‘Alexa’ for Echo.
Google’s Assistant software is also able to answer follow-up questions on the same topic, in a near-conversation style, but Echo as yet cannot.
However, Amazon’s Alexa software has a wider range of skills on offer that enable it to link up with and control more third-party devices around the home.
A light comes on to remind you that it’s listening.
You can turn off the microphone temporarily, too.
For example, if you ask Assistant to turn down the heat in your home, it will trigger a graphic on the screen that illustrates how the temperature is being lowered.
What’s more, Google Assistant can now remember your favorite Starbucks order.
If you ask Assistant to order your regular item, it automatically knows what that is and can suggest other menu items.
After you confirm, Assistant will ask if you’re ordering from your usual pickup spot and then place the order.
Google is working with Starbucks, DoorDash and Domino’s on the feature, with more partners coming soon.
Assistant is also becoming even more personalized, integrating all your calendar appointments, the weather and your commute onto a dashboard that can be viewed in the app.
These updates are expected to land on Android phones this summer and iOS devices later this year.
Additionally, Google Maps is getting a dose of Assistant: users can now ask to share their arrival time with a friend, so that person knows when to expect them.
To do this, users can ask Google Assistant to ‘share my ETA’ and it’ll share that information via a message to another user.
This feature is expected to hit Google Maps sometime this summer, the firm said.