Back in the last presidential campaign season, reporters on the tech and politics beats began noticing a rise in far-right memes that supported Trump. Memes being memes, these seemed at first like weird, off-color jokes. The reporters wondered: What the hell was going on? Was this shitposting ironic or serious? Or both? Either way, it seemed newsworthy. The memes were climbing the trending lists on every social network and landing on the front page of Reddit. So journalists began filing what became a flood of stories about alt-right political memes and loopy conspiracy theories, like Taylor Swift as a white nationalist icon and Pepe the Frog in a Nazi uniform.

“Surely if we expose this,” one tech journalist told herself, “it’ll put people off it.”

It didn’t. When, a year later, Trump had won and Nazis were openly marching, those reporters began to realize that their coverage had had precisely the opposite effect. It had helped bring white supremacy into the mainstream by giving it crucial exposure. They had wildly amplified the importance and reach of what was, in all likelihood, a fairly small number of miscreants. “Every once in a while I’ll look back and see something that I wrote a year and a half ago and the pit of my stomach falls,” one reporter says.

Hate groups had gamed the media. They did it energetically and successfully. Now it may be time to invoke the wisdom of WarGames—where the only way to win is not to play.

That’s the conclusion of some fascinating research by Whitney Phillips, assistant professor of communications at Syracuse University and an expert on online trolling. She interviewed dozens of journalists who covered the meme wars—including the ones quoted above and WIRED’s Emma Grey Ellis—and mapped out how and why the nativist right got so good at hacking the media’s attention.

One early hoodwinking of the mainstream media occurred in 2008. The denizens of Anonymous on the 4chan forum were amusing themselves by making ironic jokes about pedophilia—they popularized an image of a cartoon bear named Pedobear and quipped about an army of “9,000 penises” (the number 9,000 being a shout-out to an anime series favored by many users of 4chan). Then a 4chan member experimentally tried to bring the joke mainstream—by posing as a member of a pedophile group on the discussion boards for The Oprah Winfrey Show.

Oprah fell for it. She warned about this “known pedophile network,” talking up Pedobear. The denizens of 4chan were thrilled, and as Phillips notes, they learned a lesson: “It’s culturally fun to screw with reporters.”

As the presidential election loomed, this yank-the-media’s-chain strategy was neatly integrated into the machinations of various hate groups online. Today’s white nationalists know their radioactive misogyny, racism, and anti-Semitism are likely to turn off “normies.” If you want to bring that stuff mainstream, they intuited, you need to be highly ironic. Leave it unclear whether you’re earnest.

“Generally, when using racial slurs, it should come across as half-joking,” wrote the editors of the Daily Stormer, a white supremacist site, in their style guide. “It should not come across as genuine raging vitriol. That is a turnoff to the overwhelming majority of people.”

Better yet, this was also the great age of sock-puppet and bot-farm technology. White supremacists could give their memes and conspiracy theories artificially generated upvotes and retweets, helping them climb the “recommended” leaderboards of social media. That was part of the ploy: Make something seem so big that reporters would feel remiss in not pouncing on it.

One problem reporters faced was that, from the far right’s point of view, any coverage was good—even coverage that rebutted or fact-checked their memes. They were following the script often attributed to P. T. Barnum and, not incidentally, Trump: All publicity is good publicity.

“That’s how Pizzagate got so big,” says danah boyd, a friend of mine who runs the Data & Society Research Institute, which commissioned Phillips’ study. Stories about Pizzagate—a conspiracy theory linking Hillary Clinton to a (nonexistent) pedophile ring supposedly run out of a pizzeria in Washington, DC—would prompt people to do online searches. “People who don’t trust the media see a story and think, well, I guess I’ll self-investigate.” And that leads them to the conspiracy theory sites, which they probably would not have found otherwise. The Nazis needed media outrage to amplify their message and take it retail.

So, what’s to be done? Covering online hate memes plays into their makers’ hands. But not covering them seems neglectful. Online far-right hate-mongers do exist.

One idea, boyd suggests, is what’s known as “strategic silence”: Be less wanton in the choice of details used in reporting. The concept comes from the world of suicide research, where scientists have reliably found that when journalists report on a prominent suicide in lavish detail—describing how someone took their own life, the contents of a note—it produces a spike in copycat suicides. After Robin Williams killed himself and wall-to-wall media coverage ensued, the next month’s suicide rate jumped 10 percent. In contrast, studies show that when media coverage is less detailed, you don’t see these dramatic spikes.

That’s not a bad road map for how we talk about online memes. We—including myself—could use greater precision and less hyperbole. Phillips suggests we stop using the catchall term “trolls” when talking about hate groups: It both minimizes their genuinely vile goals and makes them seem like a huge horde. (Many meme campaigns are probably “three dudes and their bots,” as Phillips cracks.) Many of these campaigns were never organically big enough to warrant coverage. They were like the PR stunts that corporations stage to provoke coverage, which journalists rightly ignore.

Done with nuance, coverage of high-profile suicides can be a teachable moment. When Kate Spade and Anthony Bourdain took their own lives this June, many stories included suicide hotlines or links to get help. Maybe there’s a parallel in hate-meme coverage: links to groups that fight extremism or support victims.

Media restraint might slow the spread of extremist ideas, but it can’t entirely stop it. Social media means everyone’s a publisher now, which suggests we could all engage in strategic silence too, and in less “OMG” meme-spreading.

Still, there’s a reason white nationalists tried to game the media: The “lamestream” still matters. For the traditional press, that’s an oddly validating message. The media just has to be sure not to validate back.


This article appears in the August issue.