Earlier this week, the National Republican Congressional Committee was caught red-handed pushing online content aimed at confusing Democrats, in hopes of dividing potential 2020 voters against each other. This attack, however, was clumsy compared to the threats that security experts describe in conversations with party leadership, state organizations, and outside activists.
Major companies are worried about technological attacks on the 2020 election, and companies like Microsoft are sending out alerts about state-sponsored hacking attempts. Events like these will make the news, and organizations will have to prepare by securing their election operations.
When you talk to security experts, former FBI agents, and campaigns, these kinds of attacks aren’t the ones that are most unsettling. There is another form of attack that’s insidious yet direct. I’ve referred to it in other diaries as “poisoning the water,” while a security adviser in a recent meeting I attended referred to it as “28 Days Later.” No matter what you call it, the desired end result is straightforward: convince Democratic voters that the system isn’t worth participating in, or stoke rage and disappointment no matter who the Democratic nominee is in their district, or for U.S. president.
The effort to undermine our democracy is already well underway.
There are thousands of ways bad actors can try to influence our elections. To prepare, I think it is best we work on understanding at least a few of the approaches being used to poison discussion and distract Democratic voters.
The Friend Connection
Are you a part of “secret” or “closed” Facebook groups? Have you joined any DM groups on Twitter? More and more politically active social media users find themselves in hidden, secret, or closed groups, which far too many view as “safe spaces.” According to cybersecurity experts, however, these groups can be used to manipulate participants, sow discord, and fracture communities. In some cases, individuals have found themselves targets of outside pressure, as members use supposedly “private” material in public spaces to try to harm other members.
The most common method of abuse in groups such as this uses five steps: entry, harvesting, targeting, manipulation, and splintering.
Step 1: Entry. Using Facebook profiles or Twitter handles, individuals work to get themselves invited into private or secret Facebook groups or Twitter DM groups. If all members can invite new members, then the larger the group, the easier it is to gain entry; the smaller the group, the harder. Larger groups also hide members: few people check exactly who belongs once a group grows beyond 25 members or so, let alone several hundred.
Step 2: Harvesting. Once a user has access to a private group, the next step is to harvest: keep a constant tally of the membership and the means to contact them. They then begin sending friend requests or start direct communication on Twitter or any other platform. It is a slow way to build relationships on behalf of someone who doesn’t really exist, and the malicious account gains credibility by accumulating connections with people you know: “they must be okay.”
Step 3: Targeting. No large-scale conversation between human beings is completely unified. It is why several experts say we should focus on being harmonious, rather than unified. It’s okay to disagree now and again and it often doesn’t lead to larger problems, but individuals looking to cause harm to Democratic Party efforts keep track of disputes and conflict in hopes of capitalizing on them later. By having a list of the individuals in a group or effort, they can aim their messaging directly at specific users, behind the scenes.
Step 4: Manipulation. Once they understand the issues that cause friction, they look for ways to inject information, often in direct conversation rather than in the group at large, that portrays a candidate or a cause as “wrong.” This manipulation is often slow and steady, never so rushed that the listener can sense someone is trying to make them dissatisfied. Often, these manipulations come across as a mix of hope and remorse when talking about a certain candidate: “How can we get them in the right space for X,” “I can’t believe they won’t listen to you,” or a mix of laying out grievances, pumping up the listener, and framing the outcome as a letdown. Enough division then leads to …
Step 5: Splintering. At a certain point, the result is: If they won’t listen to you, let’s just do our own thing. What does it matter, they won’t get involved/won’t do it/we know better/and so on.
And then, the cycle begins all over again. This simple method can be injected into campaigns, organizations, or other efforts, over and over. And while we talk about digital efforts, security specialists warn that conservative activists are increasingly interested in taking this approach into real-world campaigns, starting as volunteers, and sometimes “very hard workers,” so that they can quickly become trusted. Once they’re integrated, the pattern works exactly the same.
The credibility multiplier
You may not have heard of user P123ABC, and neither have your friends. They link to an article on Twitter from a source you have also never heard of, but it is about something you care about. Well, okay, maybe you’ll read it (or not). But who is this person, anyway? Thanks to the harvesting work described above, that user can get their information out to someone they know will be interested in the storyline. That individual now has a much wider audience to hit with, “Hey, have you seen this?” One of the great psychological drives of individuals is to feel as though they have some “inside scoop” or knowledge that others don’t. They found the news first, or maybe they just understand the news story in a way that others don’t.
Suddenly, a story packaged by an unknown user gets repackaged by a user with a bit more trust, and it gets retweeted or passed on. Now, it is seen by a broader audience and its credibility seems bolstered: “Hey, I know that person. If they are putting it out there, then they must like something about it or find it interesting.”
Finally, others repackage it again, sometimes without original attribution and reframed into their own words in their own Facebook posts and blogs, because they want to take credit for the article and conclusions being drawn. Sure, it may be just a wee bit of plagiarism, but who’s going to know? Lots of people have that thought. Now, you’ve put your credibility on the line with friends for an article or idea you saw circulating without first validating it.
It is this third step that is the danger. By the time an article or idea reaches its third reproduction, no matter how it was originally created, the reposting source, not the original one, supplies its credibility, and suddenly even entirely false information can become enough of a talking point that it gets injected back into the original conversation and used to turn off more potential voters.
It isn’t just fake news: it is fake relationships and faked credibility.
How do you combat poisoning the water?
I refer to this as poisoning the water: It is a slow and steady system designed to enlist all of us, hoping that we make immediacy our first goal, accepting data and conclusions without verification from multiple credible sources, internalizing those ideas, and spreading them on.
So, how do you combat it? In 2016, Hillary Clinton campaign operatives honestly believed that continual online pushback on these narratives could be harmful, because the pushback itself could be used to spread more disinformation and confuse voters. What we have since learned is that this was likely the wrong strategy.
Now, more security experts are advising us that the smarter approach might be to quickly combat these narratives with strong sourcing and credible narrators and by recruiting more activists for the pushback against these manipulative strategies.
It means that everyone, whether they are a reader of Daily Kos, a Twitter follower, or just someone who hangs out in Facebook groups, has to become a bit of a journalist. Be willing to ask the big questions: Who, What, Where, When, Why, and How?
Ask how you received the information. Seek out known credible sources. If interpretations seem overly dark or cynical, question what motivates that conclusion, and recognize there is zero shame in discovering you might have been manipulated.
We are all in this together—and we are running out of time. If we want to protect democracy, it is important that we all recognize the tools being used to tear it apart.