Post-Truth Internet

Yet gatekeepers would be in much less trouble without the second big factor in post-truth politics: the internet and the services it has spawned. Nearly two-thirds of adults in America now get news on social media and a fifth do so often, according to a recent survey by the Pew Research Centre, a polling outfit; the numbers continue to grow fast.

On Facebook, Reddit, Twitter or WhatsApp, anybody can be a publisher. Content no longer comes in fixed formats and in bundles, such as articles in a newspaper, that help establish provenance and set expectations; it can take any shape—a video, a chart, an animation. A single idea, or “meme”, can replicate shorn of all context, like DNA in a test tube. Data about the spread of a meme have become more important than whether it is based on facts.

Following Mr Oliver’s ideas about the increasing role of “magical thinking” on the American populist right, The Economist asked YouGov to look at different elements of magical thinking, including belief in conspiracies and a fear of terrible things, like a Zika outbreak or a terrorist attack, happening soon. Even after controlling for party identification, religion and age, there was a marked correlation with support for Mr Trump (see chart 2): 55% of voters who scored positively on our conspiracism index favoured him, compared with 45% of their less superstitious peers. These measures were not statistically significant predictors of support for Mitt Romney, the far more conventional Republican presidential candidate in 2012.
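The YouGov microdata behind these figures are not public, but the “controlling for” step is standard regression practice. As a minimal sketch, assuming a respondent-level table with a binary indicator of support, the conspiracism score and the controls named above (every column name here is hypothetical), it might look like this:

```python
# Toy illustration of "controlling for" covariates with logistic
# regression. The YouGov microdata are not public; the file and the
# column names (trump_support, conspiracism, party_id, religion, age)
# are hypothetical stand-ins, not the survey's real schema.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical respondent-level file

# Model support (0/1) as a function of the conspiracism index while
# holding party identification, religion and age fixed.
model = smf.logit(
    "trump_support ~ conspiracism + C(party_id) + C(religion) + age",
    data=df,
).fit()

# A positive, significant conspiracism coefficient net of the controls
# is the analogue of the 55%-versus-45% gap reported in the text.
print(model.summary())
```
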

From fringe to forefront

Self-reinforcing online communities are not just a fringe phenomenon. Even opponents of TTIP, a transatlantic free-trade agreement, admit that the debate over it in Austria and Germany has verged on the hysterical, giving rise to outlandish scare stories—for instance that Europe would be flooded with American chickens treated with chlorine. “Battling TTIP myths sometimes feels like taking on Russian propaganda,” says an EU trade official.

The tendency of netizens to form self-contained groups is strengthened by what Eli Pariser, an internet activist, identified five years ago as the “filter bubble”. Back in 2011 he worried that Google’s search algorithms, which offer users personalised results according to what the system knows of their preferences and surfing behaviour, would keep people from coming across countervailing views. Facebook subsequently became a much better—or worse—example. Although Mark Zuckerberg, the firm’s founder, insists that his social network does not trap its users in their own world, its algorithms are designed to populate their news feeds with content similar to material they previously “liked”. So, for example, during the referendum campaign Leavers mostly saw pro-Brexit items; Remainers were served mainly pro-EU fare.
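Facebook’s actual ranking system is proprietary, but the mechanism described here, surfacing items similar to what a user has already “liked”, can be illustrated with a deliberately naive sketch. In practice the representations come from learned models rather than raw word counts; everything below is toy data:

```python
# Naive sketch of similarity-based feed filtering: represent stories as
# TF-IDF vectors, average a user's "liked" stories into a taste profile,
# then rank unseen stories by cosine similarity to that profile.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

liked = [  # stories the user has "liked" (invented examples)
    "leave campaign rally brexit take back control",
    "brexit means restoring british sovereignty",
]
candidates = [  # stories competing for a slot in the feed (invented)
    "economists warn brexit would hurt growth",
    "leave supporters celebrate surge in the polls",
    "eu officials defend freedom of movement",
]

vec = TfidfVectorizer()
matrix = vec.fit_transform(liked + candidates)
profile = np.asarray(matrix[: len(liked)].mean(axis=0))  # user's "taste"
scores = cosine_similarity(profile, matrix[len(liked) :])[0]

# Stories most like past likes float to the top: the filter bubble.
for score, text in sorted(zip(scores, candidates), reverse=True):
    print(f"{score:.2f}  {text}")
```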

But though Facebook and other social media can filter news according to whether it conforms with users’ expectations, they are a poor filter of what is true. Filippo Menczer and his team at Indiana University used data from Emergent, a now defunct website, to see whether there are differences in popularity between articles containing “misinformation” and those containing “reliable information”. They found that the distributions of Facebook shares for the two types of article were very similar (see chart 3). “In other words, there is no advantage in being correct,” says Mr Menczer.
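The Emergent data are no longer available, but one standard way to make such a comparison is a two-sample test: gather the share counts for each class of article and ask whether the two empirical distributions differ. A minimal sketch, with synthetic placeholder counts rather than the study’s data:

```python
# Sketch of the comparison behind chart 3: do share counts for
# misinformation and for reliable articles follow similar distributions?
# The Emergent dataset is gone, so these counts are synthetic.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# Share counts are heavy-tailed; log-normal draws mimic that shape.
misinfo_shares = rng.lognormal(mean=6.0, sigma=2.0, size=500)
reliable_shares = rng.lognormal(mean=6.0, sigma=2.0, size=500)

stat, p = ks_2samp(misinfo_shares, reliable_shares)
print(f"KS statistic {stat:.3f}, p-value {p:.3f}")
# A large p-value means the two distributions are statistically
# indistinguishable: "no advantage in being correct".
```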

If Facebook does little to sort the wheat from the chaff, neither does the market. Online publications such as National Report, Huzlers and the World News Daily Report have found a profitable niche pumping out hoaxes, often based on long-circulating rumours or prejudices, in the hope that they will go viral and earn clicks. Newly discovered eyewitness accounts of Jesus’s miracles, a well-known iced-tea brand testing positive for urine, a “transgender woman” caught taking pictures of an underage girl in the bathroom of a department store—anything goes in this parallel news world. Many share such content without thinking twice, let alone checking whether it is true.

Weakened by shrinking audiences and advertising revenues, and trying to keep up online, mainstream media have become part of the problem. “Too often news organisations play a major role in propagating hoaxes, false claims, questionable rumours and dubious viral content, thereby polluting the digital information stream,” writes Craig Silverman, now the editor of BuzzFeed Canada, in a study for the Tow Centre for Digital Journalism at the Columbia Journalism School. It does not help that the tools for tracking, and even predicting, which links will attract the most clicks are getting ever better. In fact, this helps explain why Mr Trump has been getting so much coverage, says Matt Hindman of George Washington University.

Equally important, ecosystems of political online publications have emerged on Facebook—both on the left and the right. Pages such as Occupy Democrats and Make America Great can have millions of fans. They pander mostly to the converted, but in these echo chambers narratives can form before they make it into the wider political world. They have helped build support for both Bernie Sanders and Mr Trump, but it is the latter’s campaign, friendly media outlets and political surrogates that are masters at exploiting social media and its mechanisms.

A case in point is the recent speculation about the health of Mrs Clinton. It started with videos purporting to show Mrs Clinton suffering from seizures, which garnered millions of views online. Breitbart News, an “alt-right” web publisher that gleefully supports Mr Trump—Stephen Bannon, the site’s boss, took over as the Trump campaign’s “chief executive officer” last month—picked up the story. “I’m not saying that, you know, she had a stroke or anything like that, but this is not the woman we’re used to seeing,” Mr Bannon said. Mr Trump mentioned Mrs Clinton’s health in a campaign speech. Rudy Giuliani, a former mayor of New York, urged people to look for videos on the internet that support the speculation. The Clinton campaign has slammed what it calls “deranged conspiracy theories”, but doubts are spreading and the backfire effect (of which more below) is in full swing.

Such tactics would make Dmitry Kiselyov proud. “The age of neutral journalism has passed,” the Kremlin’s propagandist-in-chief recently said in an interview. “It is impossible because what you select from the huge sea of information is already subjective.” The Russian government and its media, such as Rossiya Segodnya, an international news agency run by Mr Kiselyov, produce a steady stream of falsehoods, much like fake-news sites in the West. The Kremlin deploys armies of “trolls” to fight on its behalf in Western comment sections and Twitter feeds (see article). Its minions have set up thousands of social-media “bots” and other spamming weapons to drown out other content.

“Information glut is the new censorship,” says Zeynep Tufekci of the University of North Carolina, adding that other governments are now employing similar tactics. China’s authorities, for instance, do not try to censor everything they do not like on social media, but often flood the networks with distracting information. Similarly, in post-coup Turkey the number of dubious posts and tweets has increased sharply. “Even I can no longer really tell what is happening in parts of Turkey,” says Ms Tufekci, who was born in the country.

This plurality of voices is not in itself a bad thing. Vibrant social media are often a force for good, allowing information to spread that would otherwise be bottled up. In Brazil and Malaysia social media have been the conduit for truth about, respectively, a corruption scandal involving Petrobras, the state oil company, and the looting of 1MDB, a state-owned investment fund. And there are ways to tell good information from bad. Fact-checking sites are multiplying, and not just in America: there are now nearly 100, according to the Reporters’ Lab at Duke University. Social media have started to police their platforms more heavily: Facebook recently changed the algorithm that decides what users see in their newsfeeds to filter out more clickbait. Technology will improve: Mr Menczer and his team at Indiana University are building tools that can, among other things, detect whether a bot is behind a Twitter account.
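Real bot detectors are trained classifiers over hundreds of account features, and Mr Menczer’s group has published none of its code here; the rule-of-thumb scorer below is only an illustration of the idea, and every feature and threshold in it is invented:

```python
# Toy bot-likelihood scorer. Actual systems learn from labelled
# accounts; the signals and cut-offs below are illustrative guesses,
# not a published method.
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float
    followers: int
    following: int
    has_default_avatar: bool
    account_age_days: int

def bot_score(a: Account) -> float:
    """Return a crude 0-1 score; higher means more bot-like."""
    signals = [
        a.tweets_per_day > 100,                   # inhuman posting rate
        a.following > 10 * max(a.followers, 1),   # follow-spam pattern
        a.has_default_avatar,                     # no profile customisation
        a.account_age_days < 30,                  # freshly created
    ]
    return sum(signals) / len(signals)

suspect = Account(tweets_per_day=400, followers=12, following=3100,
                  has_default_avatar=True, account_age_days=9)
print(f"bot score: {bot_score(suspect):.2f}")  # 1.00 for this account
```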

The truth is out there

The effectiveness of such tools, the use of such filters and the impact of such sites depend on people making the effort to seek them out and use them. And the nature of the problem—that the post-truth strategy works because it allows people to forgo critical thinking in favour of having their feelings reinforced by soundbite truthiness—suggests that such effort may not be forthcoming. The alternative is to take the power out of users’ hands and recreate the gatekeepers of old. “We need to increase the reputational consequences and change the incentives for making false statements,” says Brendan Nyhan of Dartmouth College. “Right now, it pays to be outrageous, but not to be truthful.”

But trying to do this would be a tall order for the cash-strapped remnants of old media. It is not always possible or appropriate for reporters to pronounce on what is true, as opposed to reporting what is said by others. The courage to name and shame chronic liars, and to stop giving them a stage, is hard to come by in a competitive marketplace whose economic basis is crumbling. And gatekeeping power will always bring with it the temptation to abuse it; even if that temptation is resisted, it will take a long time for people to believe that it can be.

But if the old media are hard put to get a new grip on the gates, the new media that have emerged so far do not inspire much confidence as an alternative. Facebook (which now has more than 1.7 billion monthly users worldwide) and other social networks do not see themselves as media companies, which would imply a degree of journalistic responsibility, but as tech firms powered by algorithms. And putting artificial intelligence in charge may be a recipe for disaster: when Facebook recently moved to automate its “trending” news section, it promoted a fake news story which claimed that Fox News had fired an anchor, Megyn Kelly, for being a “traitor”.

And then there is Mr Trump, whose Twitter following of over 11m makes him a gatekeeper of a sort in his own right. His moment of truth may well come on election day; the odds are that he will lose. If he does, however, he will probably claim that the election was rigged—thus undermining democracy yet further. And although his campaign denies it, reports have multiplied recently that he is thinking about creating a “mini-media conglomerate”, a cross between Fox and Breitbart News, to make money from the political base he has created. Whatever Mr Trump comes up with next, with or without him in the White House, post-truth politics will be with us for some time to come.

“You can just say anything. Create realities.”

In such creation it helps to keep in mind—as Mr Putin surely does—that humans do not naturally seek truth. In fact, as plenty of research shows, they tend to avoid it. People instinctively accept information to which they are exposed and must work actively to resist believing falsehoods; they tend to think that familiar information is true; and they cherry-pick data to support their existing views. At the root of all these biases seems to be what Daniel Kahneman, a Nobel-prizewinning psychologist and author of the bestselling “Thinking, Fast and Slow”, calls “cognitive ease”: humans have a tendency to steer clear of facts that would force their brains to work harder.

In some cases confronting people with correcting facts even strengthens their beliefs, a phenomenon that Mr Nyhan and Jason Reifler, of Dartmouth College and the University of Exeter respectively, call the “backfire effect”. In a study in 2010 they randomly presented participants with either newspaper articles supporting widespread misconceptions about certain issues, such as the “fact” that America had found weapons of mass destruction in Iraq, or articles including a correction. Subjects in both groups were then asked how strongly they agreed with the misperception that Saddam Hussein had such weapons immediately before the war, but was able to hide or destroy them before American forces arrived.

As might be expected, liberals who had seen the correction were more likely to disagree than liberals who had not seen the correction. But conservatives who had seen the correction were even more convinced that Iraq had weapons of mass destruction. Further studies are needed, Mr Nyhan and Mr Reifler say, to see whether conservatives are indeed more prone to the backfire effect.
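In design terms this is a simple randomised experiment: assign a correction at random, measure agreement with the misperception afterwards, and compare group means by ideology. A sketch, with synthetic numbers standing in for the real responses:

```python
# Sketch of the backfire-effect analysis. The responses below are
# synthetic placeholders, not the 2010 study's data.
import pandas as pd

df = pd.DataFrame({
    "ideology":  ["liberal"] * 4 + ["conservative"] * 4,
    "corrected": [0, 0, 1, 1, 0, 0, 1, 1],
    # agreement with "Iraq had WMD" on a 1-5 scale (synthetic)
    "agreement": [2.8, 3.0, 2.1, 1.9, 3.4, 3.6, 4.1, 4.3],
})

means = df.groupby(["ideology", "corrected"])["agreement"].mean()
print(means)
# Backfire shows up as an interaction: the correction lowers agreement
# among liberals but raises it among conservatives.
```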

Given such biases, it is somewhat surprising that people can ever agree on facts, particularly in politics. But many societies have developed institutions which allow some level of consensus over what is true: schools, science, the legal system, the media. This truth-producing infrastructure, though, is never close to perfect: it can establish as truth things for which there is little or no evidence; it is constantly prey to abuse by those to whom it grants privileges; and, crucially, it is slow to build but may be quick to break.
