How social media supercharged the propaganda system


In their book “Manufacturing Consent,” the late Edward Herman and Noam Chomsky described how a privately owned free press could function as a propaganda system that deceives its readers quite as efficiently as a heavy-handed government censor.

In their propaganda model, information about the world had to pass through a series of filters before reaching the media’s audiences. These filters prevented dangerous ideas — like democracy, equality, and peace — from reaching the readers of mass media. They identified five of those filters: Concentrated media ownership helped ensure that media reflected the will of its wealthy, corporate owners; reliance on official sources forced journalists and editors to make compromises with the powerful to ensure continued access; shared ideological premises, including the hatred of official enemies, biased coverage toward the support of war; the advertising business model filtered out information that advertisers didn’t like; and an organized “flak” machine punished journalists who stepped out of line, threatening their careers.

When Herman and Chomsky created the propaganda model in the 1980s, they wrote about newspapers — what we now patronizingly call “legacy media.” The “legacy media” still wield influence, but things have evolved far beyond the five “filters” they identified: ownership, official sources, ideology, advertising revenue, and flak. In our media environment, these five filters have become supercharged. And new filters have refined propaganda into something more like mind control.

The supercharging of existing media filters

Ownership: Media ownership is now supercharged and superconcentrated. It’s not four or five media companies but Big Tech that determines what you see. And Big Tech is even more concentrated: it’s Google (which owns YouTube) and Facebook (which owns WhatsApp and Instagram). The generous can give an honorable mention to Twitter, with its few hundred million users, a reach that dwarfs anything the “legacy media” ever had. In recent years tech billionaires have bought media companies too: the Washington Post (Jeff Bezos of Amazon), the Intercept (Pierre Omidyar of eBay), Time magazine (Marc Benioff of Salesforce), and the Atlantic (Laurene Powell Jobs, widow of Apple’s Steve Jobs).

Official sources: Relying on official sources, and making the compromises needed to maintain access to them, has long been a force behind media self-censorship. Media companies like Fox News have staked their fortunes on Trump’s ability to draw audiences to their networks. They have made Trump the ultimate official source and the ultimate news story. This has reduced the range of issues to those that fit Trump’s limited attention span, and narrowed the spectrum of debate to positions for and against Trump’s often absurd stances on the topics of the moment.

Ideology: Herman and Chomsky wrote about Cold War and War on Terror ideologies, but today’s ideological filter is stronger than ever. Anticommunism may not have the force it had in the 1980s, but in the New Cold War, associations with Russia can be deployed to the same political effect. We are also still told constantly about the importance of endless war, the endless generosity of police, the undeserving poor, and most of the other key premises that undergirded the media in the 1980s.

Advertising revenue: The tech giants are advertising companies at their heart, and so all of the problems that came with the legacy media being driven by advertisers remain in the new environment. Two years ago a report out of Columbia University described the new business model of media, “the platform press,” in which technology platforms are the publishers of note, and these platforms “incentivize the spread of low-quality content over high-quality material.”

Beyond the boost to the propaganda system provided by the transition to a “platform press,” the new advertising ecosystem has led to an explosion of what could be called the fake internet: advertising companies can pay other companies for clicks; the production of content can be automated. Much of the internet, as writer Max Read puts it, is now “fake people with fake cookies and fake social-media accounts, fake-moving their fake cursors, fake-clicking on fake websites.” This provides the powerful with two distinct opportunities to mislead audiences: first, they can take advantage of the fake internet directly. Second, by posing as uniquely credible on an internet full of fakery, they can sell more sophisticated or subtle falsehoods.

Flak has become supercharged to the point where organized hate machines can be created and deployed against anyone at the drop of a hat, creating immense psychological pressure to silence independent voices. In November 2018, Indian student activist Shehla Rashid wrote devastatingly about both the organization of hate on Twitter and the effect it has on her:

“The hate that I get from pro-BJP accounts is organised. No sooner have I tweeted than hundreds of abusive, acerbic, mocking replies start appearing beneath — within 12 seconds, 17 seconds. It would be flattering if it weren’t scary. Also, there seems to be no way to avoid this. There is no method to the madness. Regardless of what I tweet, there is ‘instant abuse.’ It is not based on the content of what I write.”

This affects not just Rashid, but her followers on the social media platform: “If you want to genuinely engage with my post, you’ll think twice before replying to me, as it means that your day will be ruined by abusive trolls who will keep tagging you for hours or even days. You will find no support for me in the direct replies (except in the forms of retweets or favourites) and you’ll take whatever I say with a pinch of salt.”

Rashid feels stuck, as in an abusive relationship: “In times when electronic media has turned into a show of competitive bigotry, Twitter does provide activists like me with a platform to air our views. I have 427,400 followers on Twitter. This means that the trade-off between leaving Twitter and having a voice is too high. This points to a deeply abusive relationship that we have with Twitter. We have virtually been held hostage to its benefits.”

The new media filters

But the new environment has some powerful filters the old one didn’t. Here are three:

It’s brought to you by a cult: Earlier this year employees at Facebook described the ways in which the company’s performance review system, in which numerical ratings from colleagues are gathered by managers, leads to “cult-like” practices within the company. To get ahead in the company, employees must “act as though everything is fine and that we love working here,” even when they don’t. In authoritarian political systems, people must do what they’re told; in totalitarian systems, people must pretend to love the authority. Most corporations could be described as internally totalitarian, and so this may not be a “new” filter. But by recent reports, the most powerful social media corporation in the world is, internally, more totalitarian than most.

An opaque algorithm controls what you see: Many researchers have pointed out how social media algorithms work to boost conspiracy theories, move users to more extreme content and positions, confirm the biases of the searcher, and incentivize the outrageous and offensive. These proprietary algorithms, which determine what you see, cannot be viewed, reverse-engineered, or understood. The media platforms that use them do so without any accountability. On the other hand, savvy political operators with resources can game the algorithm by creating ecosystems of links and platforming one another. This has been done so effectively on YouTube that, as the report “Alternative Influence” notes, the top 10 results for the phrase “social justice” are “criticisms of social justice from reactionary channels.”

They have hacked your social brain: When you receive news on Facebook, even though it originates from a small number of corporate sources or advertisers, it arrives via your friends, and so it carries a level of trust that the “legacy media” never had.

One of Facebook’s founders, Sean Parker, said that Facebook’s goal was to “consume as much of your time and conscious attention as possible,” and that it did so by giving users “a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you . . . more likes and comments.” The point was to create “a social-validation feedback loop . . . exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”

If that were not enough, social media platforms can hack your mood directly. In 2014, it was revealed that Facebook researchers had run a study manipulating users’ moods, simply to see whether they could. That episode was terrifying, but it has largely been forgotten. Repeated academic studies have found that social media use harms mood and body image, and that reducing it can improve mental health. That may be why upper-level social media executives neither use their own platforms nor allow their children to use them.

In the face of the propaganda system, Chomsky once famously advocated for a course of “intellectual self-defense,” which of necessity would involve working with others to develop an independent mind. Because the new propaganda system uses your social instincts and your social ties against you, “intellectual self-defense” today will require some measures of “social self-defense” as well. If Big Tech executives can unplug themselves and develop their “real-world” selves, those of us who hope to resist should probably do the same.

This article was produced by the Independent Media Institute.

Justin Podur is a Toronto-based writer. You can find him on his website at podur.org and on Twitter @justinpodur. He teaches at York University in the Faculty of Environmental Studies.
