(by Michael Nunez, Gizmodo.com) – [In April], Facebook CEO Mark Zuckerberg appeared to publicly denounce the political positions of Donald Trump’s presidential campaign during the keynote speech of the company’s annual F8 developer conference.
“I hear fearful voices calling for building walls and distancing people they label as ‘others,’” Zuckerberg said, never referring to Trump by name. “I hear them calling for…slowing immigration [and] for reducing trade….”
For a developer’s conference, the comments were unprecedented – a signal that the 31-year-old billionaire is quite willing to publicly mix politics and business. Zuckerberg has donated to campaigns in the past, but has been vague about which candidates he and his company’s political action committee* support. [*A political action committee is an organization that raises money privately to influence elections or legislation, especially at the federal level. Facebook has a political action committee.]
Inside Facebook, the political discussion has been more explicit. Last month, some Facebook employees used a company poll to ask Zuckerberg whether the company should try “to help prevent President Trump in 2017.”
Every week, Facebook employees vote in an internal poll on what they want to ask Zuckerberg in an upcoming Q&A session. A question from the March 4 employee poll was: “What responsibility does Facebook have to help prevent President Trump in 2017?”
A screenshot of the poll, given to Gizmodo, shows the question as the fifth most popular.
It’s not particularly surprising the question was asked, or that some Facebook employees are anti-Trump. The question and Zuckerberg’s statements on Tuesday align with the consensus politics of Silicon Valley: pro-illegal immigration, pro-trade, pro-expansion of the internet.
But what’s exceedingly important about this question being raised – and Zuckerberg’s answer, if there is one – is how Facebook now treats the powerful place it holds in the world. It’s unprecedented. More than 1.04 billion people use Facebook. It’s where [adults get their] news, share [their] political views, and interact with politicians. It’s also where those politicians are spending a greater share of their budgets.
And Facebook has no legal responsibility to give an unfiltered, [unbiased] view of what’s happening on its network.
“Facebook can promote or block any material that it wants,” UCLA law professor Eugene Volokh told Gizmodo. “Facebook has the same First Amendment right as the New York Times. They can completely block Trump if they want. They can block him or promote him.” But the New York Times isn’t hosting pages like Donald Trump for President or Donald Trump for President 2016, the way Facebook is.
Most people don’t see Facebook as a media company – an outlet designed to inform us. It doesn’t look like a newspaper, magazine, or news website. But if Facebook decides to tamper with its algorithm – altering what we see – it’s akin to an editor deciding what to run big with on the front page, or what to take a stand on. The difference is that readers of traditional media (including the web) can educate themselves about a media company’s political leanings. Media outlets often publish op-eds and editorials, and have a history of how they treat particular stories. Not to mention that Facebook has the potential to reach vastly, vastly more readers than any given publication.
With Facebook, we don’t know what we’re not seeing. We don’t know what the bias is or how that might be affecting how we see the world.
Facebook has toyed with skewing news in the past. During the 2012 presidential election, Facebook secretly tampered with 1.9 million users’ news feeds. The company also tampered with news feeds in 2010 during a 61-million-person experiment to see how Facebook could impact the real-world voting behavior of millions of people. An academic paper was published about the secret experiment, claiming that Facebook increased voter turnout by more than 340,000 people. In 2012, Facebook also deliberately experimented on its users’ emotions. The company, again, secretly tampered with the news feeds of 700,000 people and concluded that Facebook can basically make you feel whatever it wants you to.
If Facebook decided to, it could gradually remove any pro-Trump stories or media off its site – devastating for a campaign that runs on memes and publicity. Facebook wouldn’t have to disclose it was doing this, and would be protected by the First Amendment.
But would it be ethical?
…The only way that Facebook could legally overstep, experts say, is by colluding [conspiring] with a given candidate. “If Facebook was actively coordinating with the Sanders or Clinton campaign, and suppressing Donald Trump news, it would turn an independent expenditure (protected by the First Amendment) into a campaign contribution because it would be coordinated—and that could be restricted,” Volokh said.
“But if they’re just saying, ‘We don’t want Trump material on our site,’ they have every right to do that. It’s protected by the First Amendment.”
We’ve reached out to Facebook for comment and will update if we receive one.
Posted at Gizmodo.com on April 15, 2016. Reprinted here May 5, 2016 for educational purposes only.
UPDATE: On April 15, soon after the article above was published, Mr. Nunez added:
We reached out to Facebook before publishing our story and the company said they’d get back to us with an official comment—but never did. Instead, they gave their response to other publications including The Hill and Business Insider.
Here’s what Facebook told them:
“Voting is a core value of democracy and we believe that supporting civic participation is an important contribution we can make to the community. We encourage any and all candidates, groups, and voters to use our platform to share their views on the election and debate the issues. We as a company are neutral – we have not and will not use our products in a way that attempts to influence how people vote.”
Facebook’s comments to these publications don’t address our primary question – which is whether Mark Zuckerberg responded to the employee question, and what that response was. If Facebook gets back to us (or anyone else), we’ll update.
1. Define the following words as used in the commentary:
- unprecedented (paragraph 3)
- explicit (para. 4)
- unfiltered (para. 9)
- algorithm (para. 11)
- colluding (para. 16)
2. What is the main idea of this commentary?
3. For each of the following assertions made by Mr. Nunez, write agree or disagree and explain your answers:
- “Most people don’t see Facebook as a media company – an outlet designed to inform us. It doesn’t look like a newspaper, magazine, or news website. But if Facebook decides to tamper with its algorithm – altering what we see – it’s akin to an editor deciding what to run big with on the front page, or what to take a stand on. The difference is that readers of traditional media (including the web) can educate themselves about a media company’s political leanings. Media outlets often publish op-eds and editorials, and have a history of how they treat particular stories. Not to mention that Facebook has the potential to reach vastly, vastly more readers than any given publication.” (from para. 11)
- “With Facebook, we don’t know what we’re not seeing. We don’t know what the bias is or how that might be affecting how we see the world.” (from para. 12)
4. Re-read paragraphs 13-14. Does this information cause you to change your view of Facebook? Why or why not? Explain your answer.
5. Conservative-libertarian commentator A.P. wrote:
Facebook is a private company and can suppress whatever speech they like…. Nothing’s stopping them from saying “no Trump material on our site.” There would…be a market backlash among Trump fans if Facebook did something like that – unless they secretly censored all (or most) of the Trump material without telling anyone. Imagine if Facebook quietly instituted a policy in which users could post pro-Trump material but only, say, 20 percent or so of their actual postings would be visible to “friends.” You’d never know that your posts weren’t being viewed, and the fact that you yourself would be able to view “some” pro-Trump posts by other people would convince you that nothing untoward is afoot. As Gizmodo notes, they’ve gamed people’s feeds before to try to tweak election results. Their reach is so fantastically huge that they could, in theory, tilt an election in one or more swing states simply by encouraging particular demographics to vote.
What do you think of this idea? (It’s possible, no way Mark Zuckerberg would do such a thing, even if FB did this, it would not sway people’s opinions, etc.) Explain your answer.
6. Ask a parent to answer the questions. Discuss your answers.