Politics

From autofilling search terms to banning political ads, Big Tech companies have the ability to exert tremendous influence on politics. And they have: ahead of the 2020 election, Big Tech exerted that power repeatedly to help former Vice President Joe Biden defeat President Donald Trump.

The most obvious example was Facebook and Twitter’s suppression of a New York Post article about Hunter Biden, Joe Biden and Ukrainian businessmen. Twitter used its “hacked materials” policy as an excuse to shut down a story damaging to the Biden campaign. It locked the Post’s account for more than two weeks and suspended Trump campaign officials and others for trying to share the story. 

Facebook manually suppressed the Hunter Biden story, preemptively limiting distribution while waiting for fact-checkers to weigh in. CEO Mark Zuckerberg told Congress, “Based on that risk, and in line with our existing policies and procedures, we made the decision to temporarily limit the content’s distribution while our fact-checkers had a chance to review it. When that didn’t happen, we lifted the demotion.”

It was far from the only example of political interference by Big Tech. The companies altered election-related policies 26 times in 2020. Those changes included banning political ads, banning ads questioning the election outcome, censoring criticism of mail-in ballots and cracking down on misinformation. Twitter alone censored or otherwise limited Trump and his campaign’s Twitter accounts at least 583 times, but never once did the same to Biden or his campaign.

  • Twitter censored, suppressed or limited Trump or his campaign’s account 583 times, but it never did any of those things to Biden or his campaign. By labeling many of those Trump tweets “disputed,” Twitter also de-amplified them on the platform, preventing many people from seeing them.
  • Research psychologist Dr. Robert Epstein told the Media Research Center (MRC) in October 2020 that Google is “now focusing most of their vote shifting power on the Senate races, where big-margin outcomes will be hard to contest.” His theory was that Google’s control of autocomplete searches could “mobilize the base supporters of Democratic candidates to register to vote and then to vote; they can discourage some Republican voters from registering to vote or voting.” He wrote that the company had “at least 9 million undecided voters they can still play with.”
  • Facebook announced multiple policy changes roughly one month before the election. It said, “We also won’t allow ads with content that seeks to delegitimize the outcome of an election.” It also announced a ban on all political and issue-based advertising for an undetermined period once the polls closed on Nov. 3.
  • YouTube changed or adopted 10 election-related policies in 2020. “[W]e’re continuing to raise up authoritative voices and reduce harmful misinformation,” the company wrote in a Sept. 24 blog post. It said it would continue to remove misleading information and demonetize content containing “claims that could significantly undermine participation or trust in an electoral or democratic process.”
  • Voters who use social media are skeptical of what they read there. An October Rasmussen survey found 79 percent do not “believe most things” they read on social media. It also found only 7 percent of likely voters felt Facebook, Twitter and similar platforms had a “good” impact on politics. Despite that skepticism, Pew Research Center (Pew) found nearly a quarter of users changed their opinions because of something they saw on social media.
  • Twitter’s overzealous automated efforts to prevent election misinformation also censored accurate and unrelated information.
  • A Pew study released in August found 73 percent of U.S. adults think social media companies censor political views. Broken down by political party, 90 percent of Republicans thought intentional political censorship was “somewhat” or “very” likely, compared to 59 percent of Democrats.
  • Twitter Moments coverage was five times more favorable towards the Democratic National Convention than the Republican National Convention.
  • “Facebook conducted what they called ‘massive scale contagion experiments.’ How do we use subliminal cues on the Facebook pages to get more people to vote in the midterm elections? And they discovered that they were able to do it,” Dr. Shoshana Zuboff, Professor Emerita at Harvard Business School, said in Netflix’s “The Social Dilemma.”