This was possibly my favorite part of the keynote. Kind of blew my mind. The three things that stood out to me were:
- Very natural/human-like speech. (This really caught me off guard.)
- Ability to handle new questions/direction of conversation.
- Ability to understand English spoken with an accent.
The machine is saying "Umm". And "Mmm-Hmmm".
- Lots of questions (of both a practical and ethical nature) that need to be answered before this goes live.
This is just a proof of concept. They can iron out the kinks later. No need to overburden themselves with practical questions at this stage.
Google Assistant calls a business. Google Assistant answers. Nobody actually calls anybody.
Okay, first of all, this is not design related, but HOLY MOTHER what a time to be alive!
I'm going to add a "holy crap!" to the list. That's incredibly smooth.
OK, perhaps it's just me, but the AI made so many assumptions it was downright scary. And this was just for a haircut; wait until the conversation gets more complex. As a tech demo, this is neat. As a real-world example, I'm far from impressed, and I'm downright scared for the future given the pace at which companies like Google are developing AI without any constraints whatsoever.
Best (overheard) comment: "For decades companies had us robocalled — now we robocall them!" ;)