Managing Mis-, Dis-, and Malinformation#
Mis-, dis-, and malinformation have become a pervasive threat to the election environment. Each has a specific definition; together, they are called MDM.
Misinformation is false but not created or shared with the intention of causing harm.
Disinformation is deliberately created to mislead, harm, or manipulate a person, social group, organization, or country.
Malinformation is based on fact, but used out of context to mislead, harm, or manipulate.
MDM campaigns are engineered for influence, typically seeking to exploit the psychology of victims’ emotions, identities, political affinities, and existing societal rifts.
Both independent threat actors and large nation-state operations are capable of manufacturing MDM. An operation may employ hundreds of human operators or rely on automated bots. When users encounter inaccurate information or intentional disinformation, they may be unable to differentiate it from genuine information, sharing it and unwittingly influencing an even wider audience.
Influencing the political environment through social discourse is a tactic observed in well-funded and complex information attacks, but actors may have competitive, financial, or other motivations as well. Disinformation attacks can function by creating continued influence in a system or sector. Attackers may try to popularize perspectives and viewpoints in target demographics that lead to certain policy or political outcomes. Appearing as authentic citizens or a real customer base on social media, individual disinformation accounts can appeal to users and align with their existing beliefs. Organizations and individuals alike then experience the pressure to act on what is perceived as recurring legitimate messaging but, in reality, is deception.
Often, inaccurate statements about elections are unintentional, simply the result of misinformed individuals. For election officials, understanding the source or intent of inaccurate information is less important than addressing it with accurate messaging and other remedial action. That is the focus of this best practice.
Recognize MDM and its potential impact on election administration (Level 1 maturity)
Take action when you encounter misinformation (Level 1 maturity)
For Managing Mis-, Dis-, and Malinformation, the necessary actions vary by maturity as detailed below.
Level 1 Maturity#
At Level 1 maturity, simple steps can help you manage misinformation and address it when it occurs.
Preparing for Mis-, Dis-, and Malinformation#
Set up multi-factor authentication to protect social media accounts from compromise.
Use public forums to actively counter misinformation.
Regularly publish official messaging about the state of your election infrastructure.
Work with local media to promote official sources of information.
Track relevant information by, for example, monitoring mentions of your county name and the names of your election official and other public figures.
Respond to inaccurate information with accurate information as quickly as possible. This rapid response is even more important as an election nears.
Election officials can report identified misinformation to email@example.com.
Report anything on social media that’s about your jurisdiction, pertains to the administration or security of an election in the United States, and is inaccurate or misleading.
Opinions are not misinformation. Only report inaccurate information about election administration itself.
Examples include, but aren’t limited to, dates of the election, mail ballot rules, ballot information, polling place hours and status, election night reporting procedures, post-election procedures, and voting technology.
Include the following information:
A screenshot of the social media post and, if possible, the URL.
Your name, role, jurisdiction, and work email address.
A description of why the content is false: not just "this is wrong," but why you think it's wrong. This doesn't have to be more than a couple of sentences, though more detail is better, and citing a relevant law is better still.
If appropriate, the EI-ISAC will work to have the inaccurate information removed or labeled.
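The reporting requirements above can be sketched as a simple completeness check run before a report is emailed. This is an illustrative sketch only; the field names are hypothetical, not part of any official EI-ISAC submission format.

```python
# Hypothetical field names mirroring the reporting guidance above.
REQUIRED_FIELDS = {
    "screenshot",      # screenshot of the social media post (URL if possible)
    "reporter_name",
    "role",
    "jurisdiction",
    "work_email",
    "why_false",       # brief explanation of why the content is inaccurate
}

def missing_fields(report: dict) -> set:
    """Return the required fields that are absent or empty in a draft report."""
    return {f for f in REQUIRED_FIELDS if not report.get(f)}

draft = {
    "screenshot": "post.png",
    "reporter_name": "A. Clerk",
    "role": "County Election Director",
    "jurisdiction": "Example County",
    "work_email": "clerk@example.gov",
    "why_false": "Polls close at 8 p.m. under state law, not 5 p.m. as claimed.",
}
assert missing_fields(draft) == set()  # complete report, nothing missing
```

A check like this can be embedded in whatever intake form or script your office uses to collect staff-reported posts.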
Level 2 Maturity#
Organizations operating at a Level 2 maturity should take additional actions, including:
The Cybersecurity and Infrastructure Security Agency (CISA) offers resources to help you spot MDM and manage your response:
Recognize the risk of foreign actor operations.
Question the source of content and its intent.
Investigate the issue by checking other reliable sources before sharing.
Think before you share; disinformation is designed to evoke an emotional response.
Talk with your circle about the risks of spreading disinformation.
Review the Harvard Kennedy School’s Belfer Center publication, “The Election Influence Operations Playbook” for a deeper understanding of these issues and response guidance.
Establish a mechanism for the public to report disinformation and misinformation to your office, such as an email or phone number.
Level 3 Maturity#
Organizations operating at a Level 3 maturity should take additional actions, including:
Consider having a focused workstream to identify and remediate MDM. This can include things like:
Tracking hashtags, keywords, and other trends on various social media platforms.
Following activity related to your election across a number of platforms, including smaller, niche apps.
Contracting with a third party to provide these services for you.
If you are a state, providing these services for your local jurisdictions.
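The tracking activities above can be as simple as matching posts against a keyword list. The sketch below is illustrative only: the post data and keyword list are hypothetical, and a real workstream would pull posts from platform APIs or a monitoring vendor rather than a hard-coded list.

```python
import re

# Hypothetical terms a jurisdiction might track across platforms.
KEYWORDS = ["example county", "ballot", "polling place", "#electionday"]

def flag_posts(posts, keywords=KEYWORDS):
    """Return posts containing any tracked keyword (case-insensitive)."""
    pattern = re.compile(
        "|".join(re.escape(k) for k in keywords), re.IGNORECASE
    )
    return [p for p in posts if pattern.search(p)]

posts = [
    "Polling place hours changed in Example County!",
    "Great weather today.",
    "Don't forget your ballot drop-off deadline.",
]
flagged = flag_posts(posts)  # first and third posts match
```

Flagged posts would then go to a human reviewer to decide whether the content is inaccurate and warrants a response or a report.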
Election Tools Checklist for combating election misinformation: A framework to help election departments respond to influence operations.
CISA Rumor Control Page: This page offers the public accurate and authoritative sources of information that will help address common MDM narratives. It is provided by a trusted voice to either preempt or respond to developing narratives.
Mapping to CIS Controls and Safeguards#
There are no relevant CIS Controls.
Mapping to CIS Handbook Best Practices#
There are no relevant Handbook best practices.