Last year, 80% of people interacted with a chatbot at least once. These ubiquitous computer programs are designed to simulate conversations and help site visitors find information, access support, and resolve basic issues without contacting a real employee.
Traditionally, it’s been relatively easy to tell whether you’re talking to a chatbot. They are only equipped to answer certain simple questions and cannot mimic human language in a believable way.
However, more robust chatbots are being introduced. These rapidly evolving technologies can simulate real conversations much more seamlessly. While they could have many benefits for businesses and consumers alike, some experts wonder whether the smarter chatbots can be weaponized for a variety of malevolent purposes, with the upcoming election cycle being of particular concern.
The question is, will intelligent, weaponized chatbots sway voters, or will the majority of constituents see through this ruse? Here is what you need to know about chatbots as you and your team gear up for the 2024 election cycle.
The Benefits of Chatbots
For businesses, chatbots offer a variety of benefits. That is why so many organizations have scrambled to adopt this new technology in recent years. As is often the case, early adopters of new technology enjoy a distinct advantage over businesses that are not as forward-thinking.
For starters, chatbots allow companies to communicate with prospective customers at all times of the day. Individuals interested in buying a product or signing up for a service don’t have to wait until normal business hours to start collecting information. Instead, they can browse a company’s website and submit contact information to a chatbot to initiate the purchase.
Additionally, chatbots alleviate the burden on a company’s internal service team. Chatbots act as a filter that can handle many rudimentary requests, giving employees more time to address complex support needs.
Chatbots are also widely used in the political arena. Campaign websites make use of chatbots to gather information about volunteers and potential supporters. These data collection tools promote campaign scalability and allow political strategists to reach a wider audience.
The Path to Weaponization
Chatbots are generally quite easy to implement. Unfortunately, this appealing attribute has also made them susceptible to weaponization. Many chatbots are installed using a “bolt-on” approach, meaning they are added outside a company’s core computing environment.
Chatbots are typically added to a website through a third-party software-as-a-service platform or deployed in specialized containers. This means they are connected to a company’s computing environment but not fully integrated into it. As such, they broaden the attack surface available to hackers while operating on the fringe of many security programs. Adding chatbots to already complex computing environments makes governance even more difficult.
3 Ways Chatbots Can Be Weaponized
Although chatbots have numerous benefits, these convenient pieces of technology can be weaponized in several ways. The three top concerns are data skimming, chat-based remote access trojans (chatRATs), and human impersonation.
One of the most basic functions of chatbots is data collection. For instance, if an individual communicates with a chatbot to learn more about a service, they will likely be prompted to provide their name, ZIP code, and address. They may also have to provide their credit card details.
Since chatbots operate on the fringes of an organization’s cybersecurity infrastructure, they are susceptible to skimming. Once skimmed, data can be used for a variety of nefarious purposes, including making fraudulent purchases or requesting party or address changes with voter registration organizations.
A remote access trojan (RAT) allows hackers to remotely control a device. To function, RATs rely on a set of built-in commands, which are triggered by certain code words or actions. The RAT sets up a command-and-control channel between the user’s device and the attacker’s server. That server sends trigger commands, prompting the RAT to collect data and send it back to the hacker.
ChatRATs, or chat-based RATs, are precisely what they sound like: remote access trojans hidden within chatbot software. Unfortunately, chatRATs are difficult to detect, meaning they could gather data for weeks before being discovered.
Chatbots have moved beyond website landing pages. Today, chatbots can be found in messenger applications, comment sections, and group chats. Like the bots found on websites, these chatbots primarily function on a read-and-react basis: they detect certain trigger words and publish a boilerplate response.
However, more sophisticated chatbots may be able to present themselves as human users and participate in online conversations. These dynamic solutions can spew misinformation or publish inflammatory content for the purpose of misleading voters.
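To make the "read-and-react" mechanism concrete, here is a minimal, illustrative sketch of how such a bot works: it scans an incoming message for trigger words and returns a canned reply. The trigger words and responses below are hypothetical placeholders, not taken from any real chatbot product.

```python
# Illustrative sketch of a read-and-react chatbot.
# It checks a message for known trigger words and returns
# the matching boilerplate reply, or a fallback if none match.

TRIGGERS = {
    "pricing": "Our plans start at $10/month. Want a full breakdown?",
    "hours": "We're available 9am-5pm ET, Monday through Friday.",
    "human": "Connecting you with a support agent now.",
}

DEFAULT_REPLY = "Sorry, I didn't catch that. Could you rephrase?"

def respond(message: str) -> str:
    """Return the canned reply for the first trigger word found."""
    text = message.lower()
    for trigger, reply in TRIGGERS.items():
        if trigger in text:
            return reply
    return DEFAULT_REPLY

print(respond("What are your hours?"))
```

The simplicity of this pattern is exactly why such bots are easy to deploy at scale, and why the more dynamic, human-sounding bots described above represent a significant escalation.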
So Are Voters Susceptible to Chatbot-based Attacks?
The short answer is yes: voters are susceptible to, and may be swayed by, weaponized chatbots. While chatbots vary significantly in sophistication, the more dynamic iterations can be quite convincing. The same can be said of misinformation in general.
According to Statista, 26% of Americans are confident that they can recognize misinformation. However, 67% believe misinformation causes a significant amount of confusion. Most concerning is that 38.2% of Americans have accidentally shared misinformation.
Regardless of the source or delivery method (chatbots, social media, spoofed websites), misinformation can have an impact on voters and sway the direction of an election.
Insulate Your Campaign From Chatbot Weaponization With Political Data
Targeting core voter demographics and presenting them with timely, relevant, and factual information is a great way to mitigate the impacts of chatbot weaponization. However, before you can devise an effective targeting campaign, you need abundant, high-quality, reliable voter data.
As a leader in political data, Aristotle can connect you with the raw data and data analysis tools necessary to power your campaign. Schedule a demo to learn more.