It’s a balmy Monday afternoon in New York City and I’m seated in a quaint Thai restaurant on Manhattan’s Upper East Side. I’m not here to slurp down some pad Thai, though. Instead, a handful of journalists and I are meeting with Google’s Duplex team to see the AI-powered system in action.
Duplex, you’ll recall, debuted at Google’s I/O conference in May, where CEO Sundar Pichai wowed the crowd by demonstrating how the technology can call places like restaurants, hair salons and stores to book reservations, request appointments and check holiday hours.
That demo was so impressive it left many in the audience questioning whether it was staged. Which brings us to THEP Thai, where Google (GOOG, GOOGL) is showing off Duplex in action. And after witnessing the technology firsthand, I can tell you it is very much real — and even more realistic than I initially realized.
Who ya gonna call?
Duplex lives inside Google’s Assistant. To use it, you simply activate Assistant using your Google Home or via the smartphone app. You can then ask Assistant to place a call to make a reservation at a restaurant for a specific time for a certain number of people, and it’ll take care of the rest. You’ll then receive a notification in Assistant and an email confirming your plans.
The AI isn’t going to be available to everyone right off the bat. Instead, Google is rolling it out to a small number of businesses and trusted users. It’ll start with checking businesses’ holiday hours, then move on to restaurants and salons. So don’t expect to be able to use Duplex any time soon.
Google recognizes the technology is still being worked out. At this point, it’s able to complete a call successfully four out of five times. When a call fails, however, it falls back to, believe it or not, a human operator. During our demo in NY, one journalist managed to trip up the AI enough to bring up the operator, who had to sort things out.
Google actually used human operators to train the algorithms that power Duplex. In developing the technology, Google had operators call restaurants to make reservations, make hair appointments and check store holiday hours in order for the company to better understand how such conversations usually flow.
From there, the company’s engineers built out the Duplex system so that it could make its own calls that operators would then guide. That brings us to Duplex’s current iteration, in which the system can largely handle conversations on its own, but can still signal to an operator when it gets in over its head.
The “uhs” and “ums” of a conversation
During my phone call with Assistant, it asked for a reservation on a Sunday, and I replied that the restaurant was closed that day just to see what would happen. To my surprise, the system reacted just as a normal person would: it said all right, thanked me and hung up. It was eerie how realistic it was.
When another journalist told the Assistant the restaurant didn’t take reservations for parties smaller than 10, it immediately asked what the wait would be like. It was truly impressive to see this system mimic thought in real time.
Adding to that is the fact that the Duplex-powered Assistant sounds so much like a real person. Throughout the demo, the system used the same kinds of inflections and tones you’d expect to hear while talking to a restaurant worker or salon booker.
Google says it worked with linguists to understand how and why people use “ums” and “uhs” in conversations. The result is an AI with an uncanny ability to mimic human speech patterns. If I hadn’t known I was listening to an AI, I wouldn’t have been able to tell.
In fact, Google had to ensure that when Assistant places a call, it properly identifies itself and indicates that it’s calling on behalf of a client. That’s important because Assistant records its calls as a means to improve its accuracy. If the person on the other end doesn’t want the call recorded, Assistant will automatically switch over to the human operator.
Who’s it for?
The real question here is just who is the Duplex-powered Assistant for? Google says the AI will make it easier for businesses that don’t have online booking services to bring in customers who might not otherwise have called on their own. For Google, the service will surely help the tech giant better build out its AI features, further growing its massive knowledge pool.
If and when the service becomes available for all Google users, it will likely take time for people to become acclimated to letting a program make calls for them. But once the initial awkwardness is over, I can see this technology being genuinely useful. Just imagine being able to do things like book a doctor’s appointment without having to call the office and wait on hold for 20 minutes. Or having Assistant call your mechanic to schedule a tune-up for your aging Cutlass Supreme.
Use cases like that are still quite a way off, to be sure. But if Duplex is a success when it’s released into the wild, it could be the beginning of the end of annoying phone calls — and the first step toward widespread acceptance of AI for us all.