Security Risks of AI Bots Like Google Duplex

(SecurEnvoy, May 2018) It seems the lines between human and AI are becoming increasingly blurred. If you caught the recent demonstrations of Google Duplex, it is easy to see that we may soon have difficulty knowing whether we are talking to a human or a bot. But what downstream impacts will this have on business as we know it?

In the business-to-consumer environment, most enterprises are trying to drive efficiencies throughout the customer experience journey. Companies set out to achieve goals such as a better end-user experience, lower costs, quicker response times and anticipation of customer needs through automated menus and self-service solutions. It seems technology has given industry the means to automate this task-driven process so seamlessly that customers cannot perceive the difference. Or has it?

Trust and speed are critical elements of the customer experience. While Duplex and similar AI bots may seem like a way to create a more familiar customer experience in the customer’s own language, regional dialect and so on, they also create additional risks.

Bots and Risks Around Identity Verification

When customers interact with a bot, part of the process should be identity verification. Often this entails providing personal information such as an address or account number, followed by something unique like a mother’s maiden name or PIN code, to an automated process. But just as with email, any security hole could redirect customers to an identical phishing bot. Such a bot could ask all the same questions, acquire security codes and go much deeper than a traditional phishing attack delivered by email.

As these kinds of breaches occur more often, it will become difficult for customers to develop enough trust to hand over sensitive information. Rather than forcing customers to circumvent the AI bot and insist on speaking to a real person, future AI solutions should embrace a more clearly defined security approach to managing customer identity. Any company operating in a B2C marketplace should look to understand who its users are and provide the means to authenticate them easily within its automated user experiences.

More robust security solutions not only authenticate users seamlessly, but also build deeper trust with the customer, and in many instances are easier and faster to use. A user responding with, for example, a unique passcode delivered via SMS, voice, facial ID or a mobile application can feel secure knowing the transaction is trusted and that the company cares about protecting their information.
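To make the passcode idea concrete, the sketch below shows a minimal time-based one-time passcode (TOTP, RFC 6238) generator and check in Python, using only the standard library. It illustrates the general technique rather than SecurAccess or any specific product, and the function names and parameters are assumptions chosen for the example.

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """Compute an RFC 6238 time-based one-time passcode from a base32 secret."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval            # current 30-second time step
        msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    def verify_passcode(submitted: str, secret_b32: str) -> bool:
        """Accept the passcode only if it matches the value for the current time step."""
        return hmac.compare_digest(submitted, totp(secret_b32))

In practice the shared secret would be provisioned to the customer’s authenticator app, or the code delivered out of band via SMS or voice, and the automated service would simply compare the submitted code against the expected value for the current time step instead of asking for reusable secrets like a mother’s maiden name.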

As a new generation of AI bots rises across the business landscape, so do the risks to the customer relationship. Fortunately, there are tools such as SecurAccess and Authy to help mitigate those risks as we look to new efficiencies to build deeper relationships with customers at scale.
