I’m not sure if you managed to catch the headlines after Google’s annual developer conference at the beginning of May, but Sundar Pichai showed off something quite incredible in the field of AI, machine learning and deep learning: Google Duplex.
Duplex is an add-on to the brand’s Assistant technology, the AI assistant that appears on most Android devices and powers the Google Home ecosystem. The scenario Google posits is that you are too busy for certain tasks, like making a reservation at a restaurant or booking a hair appointment. Mundane, everyday problems that don’t cause a particular headache, but are often way down our priority list and tend to get pushed back or overlooked.
Google’s answer? We’ll do it for you! Now, your first response is probably going to be along the lines of “What if they don’t have an online booking system, or any online presence at all? How can Google book something for me then?”. This is where Duplex comes in. Instead of scouring the internet for a booking form, email address or website, the assistant actually initiates a phone call to the establishment in question. That in itself isn’t especially clever, admittedly; on its own, all it would have done is find a phone number and enter it into the dialler for you to then speak to them yourself. But it goes one step further than that. Instead of making you speak to the person on the other end, the assistant handles the ENTIRE phone call for you. It does all of this in the background, with no user interaction beyond the initial request.
Google’s tech is so advanced at this stage that it goes far beyond what people first expect when you give them that description. Most people would think of the kind of text-to-speech systems behind early phone sat-nav systems, or verbal accessibility assistants such as Apple’s VoiceOver.
This is where Duplex excels, and what made a lot of people in the crowd in Mountain View pick their jaws up off the floor. The assistant held a real conversation with two different humans, and not only that, it sounded like a real person! It could even understand and respond to someone with a thicker accent speaking in what most would refer to as ‘Pidgin English’. All of this is thanks to the research and development Google has put into its DeepMind project, taking real data and turning it into useful outcomes.
Even if this was a ‘staged’ or ‘optimised’ situation, it is still really impressive technology, and I believe it is just the tip of the iceberg for what this kind of technology can do. With machine learning and deep learning algorithms, the more data the system gets fed, the more intelligent it will become. We are still a long way off truly passing the Turing Test, but as a casual observer of the videos you couldn’t help but be slightly amazed by some of the human qualities Google has managed to program into the Duplex Assistant.
So, what does this all mean for people who suffer from social anxiety? Well, quite often, social anxiety can become so severe that even the idea of making a phone call can stir feelings of fear and stop them from doing everyday things that can only be done over the phone. Take that human interaction away, and the fear associated with it is reduced along with it.
If the tech can spread its wings beyond phone calls for appointments, it could be life-changing for a large proportion of the population. Tasks like dealing with customer service phone lines to query a bill could be handled by the Duplex Assistant, with the outcome of the conversation sent back to the user. It could then make any follow-up calls afterwards.
Deep learning and AI are going to be an area of discussion in the health arena for many years to come. With the amount of data we as a society are able to both generate and capture, the algorithms will only get cleverer and will be able to give better and better answers. The possibilities for the technology are practically limitless, and the applications could be life-changing.
Watch it do its thing here: