Huxley and Orwell
What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.
“The Molokaʻi creeper is among the eight Hawaiian birds that were officially declared extinct on Sept. 29. (Jeremy Snell/Bernice Pauahi Bishop Museum)” — Washington Post
There is perhaps no better place to witness what the culture of disinformation has already wrought in America than a Trump campaign rally.
Tony Willnow, a 34-year-old maintenance worker who had an American flag wrapped around his head, observed that Trump had won because he said things no other politician would say. When I asked him if it mattered whether those things were true, he thought for a moment before answering. “He tells you what you want to hear,” Willnow said. “And I don’t know if it’s true or not — but it sounds good, so fuck it.”
The political theorist Hannah Arendt once wrote that the most successful totalitarian leaders of the 20th century instilled in their followers “a mixture of gullibility and cynicism.” When they were lied to, they chose to believe it. When a lie was debunked, they claimed they’d known all along — and would then “admire the leaders for their superior tactical cleverness.” Over time, Arendt wrote, the onslaught of propaganda conditioned people to “believe everything and nothing, think that everything was possible and that nothing was true.”
Leaving the rally, I thought about Arendt, and the swaths of the country that are already gripped by the ethos she described. Should it prevail in 2020, the election’s legacy will be clear — not a choice between parties or candidates or policy platforms, but a referendum on reality itself.
“In an ever-changing, incomprehensible world the masses had reached the point where they would, at the same time, believe everything and nothing, think that everything was possible and that nothing was true.” Hannah Arendt, The Origins of Totalitarianism, 1951
In a Facebook experiment published in Nature and conducted on a whopping 61 million people, one randomly selected portion of this group received a neutral message to “go vote,” while others, also randomly selected, saw a slightly more social version of the encouragement: small thumbnail pictures of a few of their friends who reported having voted were shown within the “go vote” pop-up.
The researchers found that this slight tweak — completely within Facebook's control and conducted without the consent or notification of any of the millions of Facebook users — caused about 340,000 additional people to turn out to vote in the 2010 U.S. congressional elections.
(The true number may even be higher since the method of matching voter files to Facebook names only works for exact matches.)
That significant effect—from a single, one-time tweak—is more than four times the number of votes that decided the 2016 U.S. presidential election in Donald Trump’s favor.
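A rough sanity check on that arithmetic, sketched in Python. The state-margin figures below are the commonly cited combined 2016 margins in Michigan, Pennsylvania, and Wisconsin, not numbers from the study itself; treat them as approximations.

```python
# Assumed figures (approximate, from public reporting on the 2016 election):
# Trump's winning margins in the three decisive states.
margins = {
    "Michigan": 10_704,
    "Pennsylvania": 44_292,
    "Wisconsin": 22_748,
}

decisive_votes = sum(margins.values())  # combined decisive margin, about 77,744 votes
additional_voters = 340_000             # turnout effect measured in the Facebook study

ratio = additional_voters / decisive_votes
print(f"Decisive margin: {decisive_votes:,} votes")
print(f"Study effect is {ratio:.1f}x the decisive margin")
```

The ratio comes out to roughly 4.4, consistent with the “more than four times” claim above.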
“There aren’t many comparisons in American history for Thursday’s press conference in which Donald Trump suggested that the coronavirus might be defeated by shining lights inside human beings or injecting people with disinfectant. But there is the song ‘Miracles’ by Insane Clown Posse.”
Selected passages and quotes from Ryan Mac and Craig Silverman’s outstanding piece in BuzzFeed News, Hurting People At Scale: Facebook’s Employees Reckon With The Social Network They’ve Built
On July 1, Max Wang, a Boston-based software engineer who was leaving Facebook after more than seven years, shared a video on the company’s internal discussion board that was meant to serve as a warning.
“I think Facebook is hurting people at scale,” he wrote in a note accompanying the video. “If you think so too, maybe give this a watch.”
Most employees on their way out of the “Mark Zuckerberg production” typically post photos of their company badges along with farewell notes thanking their colleagues. Wang opted for a clip of himself speaking directly to the camera. What followed was a 24-minute clear-eyed hammering of Facebook’s leadership and decision-making over the previous year.
Yaël Eisenstat, Facebook's former election ads integrity lead, said the employees’ concerns reflect her experience at the company, which she believes is on a dangerous path heading into the election.
“All of these steps are leading up to a situation where, come November, a portion of Facebook users will not trust the outcome of the election because they have been bombarded with messages on Facebook preparing them to not trust it,” she told BuzzFeed News.
She said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned.
“These were very clear examples that didn't just upset me, they upset Facebook’s employees, they upset the entire civil rights community, they upset Facebook’s advertisers. If you still refuse to listen to all those voices, then you're proving that your decision-making is being guided by some other voice,” she said.
Replying to Wang’s video and comments, Facebook’s head of artificial intelligence Yann LeCun wrote,
Other employees, like [engineer Dan Abramov], have seized the moment to argue that Facebook has never been neutral, despite leadership’s repeated attempts to convince employees otherwise, and as such needed to make decisions to limit harm. Facebook has proactively taken down nudity, hate speech, and extremist content, while also encouraging people to participate in elections — an act that favors democracy, he wrote.
“As employees, we can’t entertain this illusion,” he said in his June 26 memo titled “Facebook Is Not Neutral.” “There is nothing neutral about connecting people together. It’s literally the opposite of the status quo.”
Zuckerberg seems to disagree. On June 5, he wrote that Facebook errs on the “side of free expression” and made a series of promises that his company would push for racial justice and fight for voter engagement.
The sentiment, while encouraging, arrived unaccompanied by any concrete plans. On Facebook’s internal discussion board, the replies rolled in.
Stelter was reacting to dismissive statements on Fox & Friends by William Bennett, former Secretary of Education in the Reagan administration, about the severity of the coronavirus.
Bennett smugly stated,
At the time William Bennett made those statements — April 13, 2020 — 22,000 Americans had already died of COVID-19.
“Social media is a nuance destruction machine…”
The full quote, in response to a question about so-called “cancel culture”, was, “What I find a little discouraging is that it appears to me that social media is a nuance destruction machine, and I don’t think that’s helpful for a democracy.”
“I’m old enough to remember when the Internet wasn’t a group of five websites, each consisting of screenshots of text from the other four.”
“When we balance out what’s more important, speed or accuracy, it’s not even a close call. We should be expecting accuracy and adjusting our expectations in regards to speed.”
“Nothing educates like the virus.”