In honor of MisinfoCon this weekend, it’s time for a brain dump on propaganda — that is, getting large numbers of people to believe something for political gain. Many of my journalist and technologist colleagues have started to think about propaganda in the wake of the US election, and related issues like “fake news” and organized trolling. My goal here is to connect this new wave of enthusiasm to history and research.
This post is about persuasion. I’m not going to spend much time on the ethics of these techniques, and even less on the question of who is actually right on any particular point. That’s for another conversation. Instead, I want to talk about what works. All of these methods are just tools, and some are more just than others. Think of this as Defense Against the Dark Arts.
Let’s start with the nation-states. Modern intelligence services have been involved in propaganda for a very long time, and they have many names for it: information warfare, political influence operations, disinformation, psyops. Whatever you want to call it, it pays to study the masters.
Russia: You don’t need to be true or consistent
Russia has a long history of organized disinformation, and their methods have evolved for the Internet era. The modern strategy has been dubbed “the firehose of falsehood” by RAND scholar Christopher Paul.
His recent report discusses this technique of pushing out diverse messages on a huge number of different channels, everything from obvious state sources like Russia Today to carefully obscured leaks of hacked material — leaks which are tailored to appeal to sympathetic journalists.
The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context. First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages. Second, high volume can overwhelm competing messages in a flood of disagreement. Third, multiple channels increase the chances that target audiences are exposed to the message. Fourth, receiving a message via multiple modes and from multiple sources increases the message’s perceived credibility, especially if a disseminating source is one with which an audience member identifies.
And as you might expect, there is a certain amount of outright fabrication — often mixed with the truth:
Contemporary Russian propaganda makes little or no commitment to the truth. This is not to say that all of it is false. Quite the contrary: It often contains a significant fraction of the truth. Sometimes, however, events reported in Russian propaganda are wholly manufactured, like the 2014 social media campaign to create panic about an explosion and chemical plume in St. Mary’s Parish, Louisiana, that never happened. Russian propaganda has relied on manufactured evidence—often photographic. … In addition to manufacturing information, Russian propagandists often manufacture sources.
But for me, the most surprising conclusion of this work is that a source can still be credible even if it repeatedly and blatantly contradicts itself:
Potential losses in credibility due to inconsistency are potentially offset by synergies with other characteristics of contemporary propaganda. As noted earlier in the discussion of multiple channels, the presentation of multiple arguments by multiple sources is more persuasive than either the presentation of multiple arguments by one source or the presentation of one argument by multiple sources. These losses can also be offset by peripheral cues that enforce perceptions of credibility, trustworthiness, or legitimacy. Even if a channel or individual propagandist changes accounts of events from one day to the next, viewers are likely to evaluate the credibility of the new account without giving too much weight to the prior, “mistaken” account, provided that there are peripheral cues suggesting the source is credible.
Orwell was right: “We have always been at war with Eastasia” really does work, if there are enough people repeating it.
Paul suggests that the counter-strategy is not to try to refute the message, but to reach the target audience first with an alternative. Fact checking, which is really after-the-fact-checking, may not be the most effective plan. He suggests instead that we “forewarn audiences of misinformation, or merely reach them first with the truth, rather than retracting or refuting false ‘facts.’” In this light, Facebook’s plan to show the fact check along with the article seems like a much better strategy than sending someone a fact checking link when they repeat a falsehood.
He also suggests that we “focus on guiding the propaganda’s target audience in more productive directions.” Which is exactly what China does.
China: Don’t argue; distract and disrupt
China is famous for its highly developed network censorship, from the Great Firewall to its carefully policed social media. The role of the government “public opinion guides,” China’s millions of paid commenters, has been murkier — until now.
The Atlantic has a readable summary of recent research by Gary King, Jennifer Pan, and Margaret E. Roberts. They started with thousands of leaked Chinese government emails in which commenters report on their work, which became the raw data for an accurate predictive model of which posts are government PR. A surprising twist: nearly 60% of paid commenters will just tell you they’re posting for the government when you ask them, which allowed these scholars to verify their country-wide model. But the core of the analysis is what these posters were doing.
From the paper:
We estimate that the government fabricates and posts about 448 million social media comments a year. In contrast to prior claims, we show that the Chinese regime’s strategy is to avoid arguing with skeptics of the party and the government, and to not even discuss controversial issues. We infer that the goal of this massive secretive operation is instead to regularly distract the public and change the subject, as most of these posts involve cheerleading for China, the revolutionary history of the Communist Party, or other symbols of the regime.
And here’s the breakdown of what these posters were doing. “Cheerleading” dominates for every sample of government accounts. Arguments are rare.
Note that this is only one half of the Chinese media control strategy. There is still massive censorship of political expression, especially of any post relating to organized protest, which is empirically good at toppling governments.
All of this without ever getting into an argument. This suggests that there is actually no need to engage the critics/trolls to get your message out (though it might still be worthwhile to distract and monitor them). Just communicate positive messages to the masses while you quietly disable your detractors. A counter-strategy, if you are facing this type of opponent, is organized, visible resistance. Get into the streets and make it impossible to talk about something else — though note that recent experiments suggest that violent or extreme protest tactics will backfire.
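The predictive-model step in the King, Pan, and Roberts study — label a known training set of posts, then classify the rest — can be caricatured in a few lines. Here is a toy Naive Bayes text classifier as an illustration only; the training phrases, labels, and category names below are invented and have nothing to do with the study’s actual features or corpus.

```python
# Toy sketch: classify posts as "cheerleading" vs "argument" using
# Naive Bayes over bag-of-words. All training data here is invented.
from collections import Counter, defaultdict
import math

def train(examples):
    """examples: list of (text, label) pairs.
    Returns per-label word counts and label frequencies."""
    counts = defaultdict(Counter)
    labels = Counter()
    for text, label in examples:
        labels[label] += 1
        counts[label].update(text.lower().split())
    return counts, labels

def classify(text, counts, labels):
    """Pick the label maximizing log prior + log likelihood of the words."""
    vocab = {w for c in counts.values() for w in c}
    total = sum(labels.values())
    best, best_score = None, float("-inf")
    for label in labels:
        score = math.log(labels[label] / total)
        n = sum(counts[label].values())
        for w in text.lower().split():
            # Laplace smoothing so unseen words don't zero out a label
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

examples = [
    ("glory to the motherland and the party", "cheerleading"),
    ("proud of our revolutionary history", "cheerleading"),
    ("the policy is wrong and the critics are lying", "argument"),
    ("you are wrong, the government critics lie", "argument"),
]
counts, labels = train(examples)
print(classify("the party and the motherland make us proud", counts, labels))
```

The real study, of course, used far richer features and a verified training set from the leaked emails; this sketch only shows the shape of the classification step.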
But China has a tightly controlled media and the greatest censorship regime the world has ever seen. If you’re operating in a relatively free media environment, you have to manipulate the press instead.
Milo: Attention by any means necessary
The most insightful thing I have ever read about the wonder that was Milo Yiannopoulos comes from the man who wrote a book on manipulating the media, documenting the strategies he devised to market people like Tucker Max. Ryan Holiday writes,
We encouraged protests at colleges by sending outraged emails to various activist groups and clubs on campuses where the movie was being screened. We sent fake tips to Gawker, which dutifully ate them up. We created a boycott group on Facebook that acquired thousands of members. We made deliberately offensive ads and ran them on websites where they would be written about by controversy-loving reporters. After I began vandalizing some of our own billboards in Los Angeles, the trend spread across the country, with parties of feminists roving the streets of New York to deface them (with the Village Voice in tow).
But my favorite was the campaign in Chicago—the only major city where we could afford transit advertising. After placing a series of offensive ads on buses and the metro, from my office I alternated between calling in angry complaints to the Chicago CTA and sending angry emails to city officials with reporters cc’d, until ‘under pressure,’ they announced that they would be banning our advertisements and returning our money. Then we put out a press release denouncing this cowardly decision.
I’ve never seen so much publicity. It was madness.
. . .
The key tactic of alternative or provocative figures is to leverage the size and platform of their “not-audience” (i.e. their haters in the mainstream) to attract attention and build an actual audience. Let’s say 9 out of 10 people who hear something Milo says will find it repulsive and juvenile. Because of that response rate, it’s going to be hard for someone like Milo to market himself through traditional channels. His potential audience is too spread out, and doesn’t have that much in common. He can’t advertise, he can’t find them one by one. It’s just not going to scale.
But let’s say he can acquire massive amounts of negative publicity by pissing off people in the media? Well now all of a sudden someone is absorbing the cost of this inefficient form of marketing for him.
(Emphasis mine.) That one’s adversaries should be denied attention is not a new idea. Indeed, this is central to the “no-platforming” tactic. But no-platforming plays right into an outrage-based strategy if it results in additional attention (see also the Streisand effect). Worse, all the incentives for media makers are wrong. It’s going to be very hard for journalists and other media figures to wean themselves off of outrage, because strong emotional reactions get people to share information (1, 2, 3, etc.) and information sharing has become the basis of distribution, which is the basis of revenue. We are in dire need of new business models for news.
But this breakdown of the mechanics of outrage marketing does suggest a counter-strategy: before you get mad, or report on someone getting mad, do your homework. Holiday called to complain about his own content, put out false press releases, etc. A smart journalist might be able to uncover this deception. In a propaganda war, all journalists should be investigative journalists.
Attention is the currency of networked propaganda. Be very careful who you give it to, and understand how your own emotions and incentives can be exploited.
6/ In order to manipulate the press, you have to be able to predict how it will behave. How many people have such insight? Uh... EVERYONE. — Jay Rosen (@jayrosen_nyu) January 13, 2017
But even if you’ve uncovered a deception, it’s not enough to say that someone else is lying. You have to tell a different story.
Debunking doesn’t work: provide an alternative narrative
Telling people that something they’ve heard is wrong may be one of the most pointless things you can do. A long series of experiments shows that it rarely changes belief. Brendan Nyhan is one of the main scholars here, with a series of papers on political misinformation. This is about human psychology; we simply don’t process information rationally, but instead employ a variety of heuristics and cognitive shortcuts (not necessarily maladaptive in general) that can be exploited. The classic experiment goes like this:
Participants in a study within this paradigm are told that there was a fire in a warehouse and that there were flammable chemicals in the warehouse that were improperly stored. When hearing these pieces of information in succession, people typically make a causal link between the two facts and infer that the fire was caused in some way by the flammable chemicals. Some subjects are then told that there were no flammable chemicals in the warehouse. Subjects who have received this corrective information may correctly answer that there were no flammable chemicals in the warehouse and separately incorrectly answer that flammable chemicals caused the fire. This seeming contradiction can be explained by the fact that people update the factual information about the presence of flammable chemicals without also updating the causal inferences that followed from the incorrect information they initially received.
Worse, repeating a lie in the process of refuting it may actually reinforce it! The counter-strategy is to replace one narrative with another. Affirm, don’t deny:
Which of these headlines strikes you as the most persuasive:
“I am not a Muslim, Obama says.”
“I am a Christian, Obama says.”
The first headline is a direct and unequivocal denial of a piece of misinformation that’s had a frustratingly long life. It’s Obama directly addressing the falsehood.
The second option takes a different approach by affirming Obama’s true religion, rather than denying the incorrect one. He’s asserting, not correcting.
Which one is better at convincing people of Obama’s religion? According to recent research into political misinformation, it’s likely the latter.
The role of intelligence: Action not reaction
Let’s return to China for a moment. Here’s a chart, from the paper above, on the number of government social media postings over time:
Posts spiked around political events (CCP Congress) and emergencies that the government would rather citizens not talk about, such as riots and a rail explosion. This “cheerleading” propaganda wasn’t simply a regular diet of good news, but a precisely controlled strategy designed to drown out undesirable narratives.
One of the problems of a free press is that “the media” is a herd of cats. There really is no central authority — independence and diversity, huzzah! Similarly, distributed protest movements like Anonymous can be very effective for certain types of activities. But even Anonymous had central figures planning operations.
The most successful propagandists, like the most successful protest movements, are very organized. (Lost in the current “diversity of tactics” rhetoric is the historical fact that key battles in the civil rights movement were carefully planned.) Organization and planning require intelligence. You have to know who your adversaries are and what they are doing. Intelligence involves basic steps like:
- Pay attention to the details of every encounter. Who wrote that story or posted that comment?
- Research the actors and their networks. Who are they connected to? What communication channels do they use to coordinate? Who directs operations?
- Real-time monitoring. When a misinformation campaign begins, you need to get to your audience before they do (with something more than just a debunk, as above).
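The monitoring step above can be sketched as a sliding-window spike detector: count recent sightings of each narrative, keep a smoothed long-run baseline, and flag when current volume jumps well above it. Everything here (the window size, thresholds, and narrative labels) is an invented illustration, not a description of any real monitoring tool.

```python
# Minimal sketch: flag a narrative when its volume in the last hour
# jumps well above its smoothed long-run rate. Parameters are arbitrary.
from collections import deque, defaultdict

class SpikeMonitor:
    def __init__(self, window_seconds=3600, spike_factor=3.0, min_count=5):
        self.window = window_seconds
        self.factor = spike_factor      # how far above baseline counts as a spike
        self.min_count = min_count      # ignore tiny absolute volumes
        self.events = defaultdict(deque)    # narrative -> recent timestamps
        self.baseline = defaultdict(float)  # narrative -> smoothed rate

    def record(self, narrative, timestamp):
        """Record one sighting; return True if this narrative is spiking."""
        q = self.events[narrative]
        q.append(timestamp)
        # drop sightings that fell out of the sliding window
        while q and q[0] < timestamp - self.window:
            q.popleft()
        rate = len(q)
        base = self.baseline[narrative]
        # exponential smoothing, so a sustained burst slowly becomes the new normal
        self.baseline[narrative] = 0.95 * base + 0.05 * rate if base else rate
        return rate >= self.min_count and base > 0 and rate > self.factor * base
```

In practice the same idea works with nothing fancier than a spreadsheet and a timestamp column; the point is having a baseline to compare against, so a coordinated burst stands out from routine chatter.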
Although there may be useful technological approaches to tracing networks, there is no magic here: anyone can keep a spreadsheet of actors, real-time monitoring takes little more than Tweetdeck, and investigative journalists already know how to investigate. But centralization may be important. The Russian approach of “many messages, many channels” suggests that an open, diverse network can succeed at individual propaganda actions, and I bet it would succeed at counter-propaganda actions too. But intelligence is different, and it’s an unanswered question whether the messy collection of journalists, NGOs, universities, and activists in a free society can do effective counter-propaganda intelligence, or even agree sufficiently on what that would be. I don’t think a distributed approach will work here; someone needs to own the database and run the show.
Update: The East StratCom Task Force seems to be exactly this sort of centralized actor for the EU.
But one way or another, you have to know what your propagandist adversary is doing, in detail and in real time. If you don’t have that critical function taken care of, you’re going to be forever reactive, which means you’re probably going to lose.
PS: Up your security game
Hacking and leaking — which is one of the more effective ways to dox someone — has become a propaganda tactic. If you don’t want to be on the wrong end of this, I recommend immediately doing the following easy things:
- Enable 2-step logins on your email and other important accounts.
- Learn to recognize phishing.
Stay safe out there, and good luck.