How Secure Is Your ChatBot Conversation Against Eavesdropping?
ChatBots have been around for a number of years and are growing in popularity, especially on websites, to provide customer service and support. Many industries are turning to ChatBots to offer a better customer experience, which often means sharing personal, and even sensitive, information as part of the conversation. An example is a bank using a ChatBot to help a client access details about their accounts, including balances, contact details, and identity data. Another example is companies providing employees access to human resources information such as payroll, tax deductions, vacation days, benefit claims, performance reviews, and much more. ChatBots offer a powerful user experience, but their ability to offer easy access to data should be carefully scrutinized.
What Is a Conversational Platform?
At the heart of any ChatBot is a conversational platform. Conversational platforms can be used to build conversational user interfaces, chatbots, and virtual assistants to address numerous use cases. They offer integration into chat interfaces such as messaging platforms, social media, SMS, and other forms of chat. Leading conversational platforms offer application programming interfaces or APIs to facilitate the integration and extension of other systems and the customization of the platform itself.
Messaging Apps and ChatBots
Popular messaging apps such as Facebook Messenger, Skype, WhatsApp, Line, and WeChat have become part of daily life for most consumers and even professionals. They offer an efficient and instantaneous way to communicate and share information. ChatBots send and receive messages through existing messaging apps. When a user sends a message to a ChatBot, or vice versa, a request is sent to the parent platform, where the identity of the sender is verified before the data request is fulfilled. This allows ChatBot users to get information through the messaging platforms they already use.
We have become comfortable using messenger apps to share personal and sensitive information when communicating with friends, family, and colleagues. There is an implicit expectation that messenger app providers are safeguarding their users’ privacy and conversation data but this is not always the case. The security of messaging apps has evolved quickly over the past decade to address their growing popularity and also threats from nefarious attackers. The conversation between a user and a ChatBot is also thought to be private unless a user explicitly consents to the information being shared publicly or with a third party. The maturity of data security and privacy for both ChatBots and their underlying conversational platforms is still in question and in need of serious improvements.
Artificial Intelligence- or AI-Powered ChatBots
A major value of ChatBots is their ability to engage in complex and human-like interactions with users – through text or speech. The ability to understand intent and appropriately converse with a user is based on artificial intelligence or AI. Data is at the heart of AI-powered ChatBots and is used to enrich the conversation, improve the system, provide a personalized experience, and deliver actionable insights. It provides the information necessary to increase customer engagement, make more informed business decisions, and provide vital competitive differentiation. Even when the data has been anonymized, it still holds a wealth of information that enterprises can learn from and act on. Conversational platforms facilitate the integration of multiple applications and data sources to ensure an AI-powered ChatBot has access to the information it needs to effectively fulfill its purpose.
Privacy and Security with Conversational Platforms
Privacy and security concerns exist on two fundamental elements of a ChatBot’s infrastructure: the interaction with the messaging app and the integration with the data sources or other applications such as a natural language processing (NLP) library. This is where vulnerabilities and the potential for exploits exist. It is critical to understand every aspect of the ChatBot’s architecture and underlying infrastructure to ensure vulnerabilities are not exploited by bad actors (hackers) to gain access to user data or the ChatBot itself.
How to Secure a ChatBot
Securing a ChatBot goes back to the conversational platform and understanding how it is deployed and secured. It starts with where it is hosted and the level of security and controls used in the infrastructure. It is not enough to know that it is deployed in AWS or Azure; the devil is in the details to make sure that the right safeguards are in place. In addition, there are five other key considerations that should be an integral part of any conversational platform.
Use an n-tier architecture
N-tier architecture is also called multi-tier architecture because the software is engineered to have the processing, data management, and presentation functions physically and logically separated. These different functions are hosted on several machines or clusters, ensuring that services are provided without resources being shared. Such an architecture helps ensure reliable performance and provides higher security since each of the tiers can be secured separately using different methods.
Secure all communications
All communication between messaging channels and the conversational platform needs to take place over a secure data transfer protocol (HTTPS), which encrypts all data in transit. The conversational platform needs to perform an integrity check on all incoming requests, comparing the header token (an encrypted unique identifier generated by the messaging channel) and metadata (encryption information as generated by the third party) to ensure that there is no malicious activity. All communication tokens need to be time-bound and expire after 60 minutes.
Protect HTTP headers
The conversational platform should also be protected against cross-site scripting (XSS) and HTTP header vulnerabilities, for example through the use of Helmet. Helmet is a collection of smaller Express middleware functions that set security-related HTTP headers.
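In an Express app, enabling Helmet is typically a one-liner (`app.use(helmet())`). The sketch below sets a few representative headers by hand with plain functions to show the kind of protection that middleware provides; the values are common defaults, not an exhaustive or authoritative list, so consult Helmet’s documentation for the current set.

```javascript
// Illustrative sketch of the kind of security-related HTTP headers that
// middleware like Helmet sets. Values are typical defaults, shown here
// for explanation; prefer `app.use(helmet())` in a real Express app.
function securityHeaders() {
  return {
    "Content-Security-Policy": "default-src 'self'",  // mitigate XSS
    "X-Content-Type-Options": "nosniff",              // block MIME sniffing
    "X-Frame-Options": "DENY",                        // block clickjacking
    "Strict-Transport-Security": "max-age=15552000; includeSubDomains", // force HTTPS
    "Referrer-Policy": "no-referrer",
  };
}

// Apply the headers to any response object exposing setHeader(name, value).
function applySecurityHeaders(res) {
  for (const [name, value] of Object.entries(securityHeaders())) {
    res.setHeader(name, value);
  }
}
```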
Avoid persisting (storing) data beyond the conversation
The conversational platform should never store any user or session data. Even in-memory transient data has to be deleted as soon as a conversation is over. If at any point conversational data is stored on a temporary basis to ensure conversational integrity, the data has to be encrypted and available only to the user or the ChatBot.
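One way to enforce the rule above is a session store that deletes every trace of a conversation the moment it ends, rather than merely marking it inactive. The following is a minimal in-memory sketch; the class and method names are illustrative assumptions, not a particular platform’s API.

```javascript
// Minimal in-memory session store that discards all conversation state
// as soon as a conversation ends (a sketch; names are illustrative).
class TransientSessionStore {
  constructor() {
    this.sessions = new Map();
  }

  // Record a piece of transient state for an active conversation.
  put(sessionId, key, value) {
    if (!this.sessions.has(sessionId)) this.sessions.set(sessionId, new Map());
    this.sessions.get(sessionId).set(key, value);
  }

  get(sessionId, key) {
    const session = this.sessions.get(sessionId);
    return session ? session.get(key) : undefined;
  }

  // Called when the ChatBot detects the conversation is over:
  // delete the session entirely, not just mark it inactive.
  endConversation(sessionId) {
    this.sessions.delete(sessionId);
  }
}
```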
Use token-based authentication
Requests for data through ChatBots are verified using authenticated tokens, which allow users to verify their identity without repeatedly entering their login credentials. While logged into a conversational platform, the user sends a service request (message) through the messaging app. After verifying the user’s identity, the messaging app generates a secure authentication token, which is relayed to the conversational platform along with the service request. The receipt of the authentication token allows the conversational platform to verify the identity of the user and proceed with the request. It is recommended to implement additional authentication processes if the ChatBot will have access to personal, sensitive, or regulated data. The conversational platform needs to provide a framework to easily integrate with enterprise identity management solutions through APIs exposed by the service providers.
The first time a user attempts to execute a secure transaction, the user is routed to a login page, which is displayed in the “Web View” section of the messaging app or in a separate browser session altogether. Upon successful login, a user access token is generated and stored on the conversational platform’s server. The token is passed along with all requested transactions so its validity can be checked at the service provider’s end. All tokens need to be set to expire within a configured time frame.
ChatBots offer a unique opportunity to engage with many audiences to address a rich set of use cases. The underlying conversational platform needs to offer a sophisticated set of capabilities, and especially security and privacy safeguards, for anyone looking to leverage a ChatBot.