Chatbots move public sector toward artificial intelligence

Chatbots are becoming an integral part of our society, as tech companies invest significant resources in developing and implementing the technology to improve user experience. In simple terms, chatbots are computer programs that leverage machine learning and artificial intelligence (AI) to complete tasks while mimicking human conversation.

Public agencies are developing their own chatbots to transform their service delivery. First, agencies are reducing employees’ workload and response times by delegating mundane and routine tasks to chatbots, saving human labor for more technical and nuanced tasks. For instance, North Carolina’s Innovation Center (iCenter) is piloting chatbots to free internal IT help desk personnel from answering mundane queries (e.g., password resets). The iCenter found that 80 to 90 percent of queries submitted to the IT help desk involve recovering account information such as passwords and user IDs. By delegating these routine requests to AI, IT personnel can focus on complex issues that require human judgment.
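
To make the triage idea concrete, the sketch below shows one simple way such a help desk bot could work: a keyword-based matcher answers routine account questions automatically and hands everything else to a person. The intents, keywords, and replies are hypothetical illustrations, not a description of the iCenter's actual system.

```python
# A minimal sketch of help desk triage, assuming a simple keyword-based intent
# matcher. All intents, keywords, and replies are hypothetical examples; real
# deployments typically use more robust natural language understanding.

ROUTINE_INTENTS = {
    "password_reset": (
        {"password", "reset", "locked", "forgot"},
        "You can reset your password at the self-service portal.",
    ),
    "user_id_recovery": (
        {"username", "user", "id", "recover"},
        "Your user ID can be recovered from the account recovery page.",
    ),
}

def triage(message: str) -> str:
    """Answer routine account queries automatically; escalate everything else."""
    words = set(message.lower().split())
    for intent, (keywords, reply) in ROUTINE_INTENTS.items():
        if len(words & keywords) >= 2:  # crude keyword-overlap threshold
            return f"[bot:{intent}] {reply}"
    return "[escalated] A help desk technician will follow up."

if __name__ == "__main__":
    print(triage("I forgot my password and my account is locked"))
    print(triage("The VPN client crashes when I connect from home"))
```

In practice, the key design question is the hand-off: the bot should resolve only the requests it can answer confidently and route everything else, including its own failed attempts, to staff.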

Second, public agencies are using chatbots to connect with citizens and engage diverse stakeholders in addressing social challenges. Cities across the U.S. are using text-based services to assist citizens and government employees: the city of Mesa, Arizona, is testing a text message chatbot that can answer frequently asked questions about available services. Residents can text questions about their billing information or update the credit card on file. Elsewhere, public agencies are using chatbots to help clients complete transactions. For instance, the Australian Tax Office deployed a chatbot called Alex in March 2016 to help citizens with tax-related questions; Alex has already conducted more than a million conversations with citizens. These examples show how chatbots improve service delivery and help government respond more effectively to citizens’ needs.
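
The sketch below illustrates the basic pattern behind an FAQ-style text service of this kind: an incoming message is matched against a small bank of known questions, and unmatched messages are handed to staff. The questions, answers, and matching logic are hypothetical stand-ins, not a description of Mesa’s or the Australian Tax Office’s actual systems.

```python
# A rough illustration of an FAQ-style SMS chatbot. The FAQ entries and the
# fuzzy-matching cutoff are hypothetical; production systems usually sit behind
# an SMS gateway and use stronger natural language understanding.
import difflib

FAQ = {
    "when is my utility bill due": "Utility bills are due on the date printed on your statement.",
    "how do i update my credit card on file": "You can update payment details through the online billing portal.",
    "what are the library hours": "Branch hours are listed on the city library webpage.",
}

def answer(incoming_text: str) -> str:
    """Match an incoming text to the closest FAQ entry, or hand off to a person."""
    match = difflib.get_close_matches(incoming_text.lower(), FAQ.keys(), n=1, cutoff=0.5)
    if match:
        return FAQ[match[0]]
    return "Sorry, I don't know that one yet. A staff member will reply during business hours."

if __name__ == "__main__":
    print(answer("How do I update my credit card?"))
    print(answer("Is the pool open on holidays?"))
```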

Third, public agencies are using chatbots to gather instant feedback and understand citizens’ perspectives on public issues. Gwinnett County in the Atlanta metro area used Textizen, an interactive text messaging platform, to engage residents about the future of local transportation. The effort focused on collecting residents’ comments and opinions about improving county transportation services. The county received more than 1,400 survey responses and 2,700 text messages within a week, and the results were presented visually to track opinion over time. By using chatbots to conduct surveys and gather information in real time, public agencies are opening new avenues to hear citizens’ voices on issues facing their communities.
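
The sketch below shows, in simplified form, how text survey replies might be tallied into themes for real-time reporting. The survey question, keyword themes, and sample replies are invented for illustration and do not reflect Textizen’s actual implementation.

```python
# A simplified sketch of aggregating text-message survey replies into themes so
# results can be charted as they arrive. The question, themes, and sample
# replies below are hypothetical.
from collections import Counter

QUESTION = "How should the county improve transit? Reply BUS, RAIL, BIKE, or share other ideas."

THEMES = {"bus": "bus service", "rail": "rail expansion", "bike": "bike infrastructure"}

def tally(replies):
    """Count themed replies; anything that matches no keyword is filed as 'other'."""
    counts = Counter()
    for reply in replies:
        theme = next((label for key, label in THEMES.items() if key in reply.lower()), "other")
        counts[theme] += 1
    return counts

if __name__ == "__main__":
    sample = ["BUS please", "More rail to downtown", "Bike lanes on Main St", "Fix the potholes"]
    print(tally(sample))  # Counter({'bus service': 1, 'rail expansion': 1, 'bike infrastructure': 1, 'other': 1})
```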

New solutions, new challenges

While these new tools offer several benefits for revamping public service delivery and citizen engagement, they also give rise to new challenges and concerns. As with any technology, one must consider how representative the user base is. From a policy perspective, this is a critical concern because of the digital divide: how many people have access to the internet or smartphones to use these services? Policymakers have long grappled with designing electronic services that cater to the needs of all citizens. If the goal is to improve customer service and include marginalized populations in decisionmaking, public agencies must examine the demographics of the people using chatbots.

The use of chatbots will also give rise to new ethical and liability concerns. What happens if artificially intelligent chatbots learn harmful behaviors from their interactions with people? Developers cannot predict every way these tools will learn and evolve as they interact with users. Further, AI-based systems are outpacing the laws and regulations governing their development and use. As public agencies increasingly adopt chatbots, they need to think critically about liability when these systems act unexpectedly. Building ethical chatbots will become an important issue, and public agencies should think creatively about the rules and regulations governing their use.

As chatbots become ubiquitous, regulators will need to develop rules to manage the security and privacy concerns associated with these new tools. Hackers and scammers could use chatbots to gather valuable personal information by contacting organizations while posing as clients. Similarly, they could design bots that target unsuspecting users. These security and privacy issues will only grow more complicated as chatbots carry out more tasks and transactions.

An evolving technology

In sum, chatbots resemble previous waves of technological change. Consider the growth of computing in the public sector and society at large over the last four decades. Computers took over the work of gathering, storing, and processing data once kept in paper files. Interacting with governments and businesses online gradually shifted from bulky desktop computers to smartphones. When Apple introduced the iPhone and the App Store, businesses and government agencies struggled to build apps that fit on a tiny mobile phone screen, but over time more mobile apps for delivering services were launched. Chatbots are likely to follow a similar trajectory, and we will continue to learn how to adopt and use these new tools.

As chatbots evolve and grow more sophisticated, public agencies can leverage these powerful tools to provide citizens with personalized services and gauge public sentiment in real time. At the same time, it is important to remember that these tools are new to the public sphere. For instance, the use of chatbots may automate work and displace labor. So far, chatbots have taken over mundane, routine tasks, but they are likely to become more sophisticated and may eventually require little or no human intervention. Chatbots may also widen the digital divide, creating winners and losers depending on who can access them.

We have yet to fully understand the challenges associated with these emerging tools. As we move forward, public agencies may need to develop rules and regulations to govern them effectively. Thus, it is important to pair caution with hope as we navigate the world of machine learning and AI-based technologies.
