The Facebook Papers ‘Greatest Hits’
In light of revelations in the media over the past few days, it’s not crazy-talk to say Facebook is more evil than the Sackler family, whose minions pushed OxyContin as a less addictive pain med; Ford when it made the Pinto; or R.J. Reynolds when it made cigarettes.
Those companies pushed their wares on subsets of consumers; Facebook is poisoning the world. It just is. And getting 3 billion people off that teat --despite brave talk from the peanut gallery-- isn’t going to be easy.
Seventeen news organizations, dozens of journalists examining thousands of documents, and a whistleblower’s damning testimony have added up to a grim picture of this social media platform.
A series of whistleblower complaints filed with the U.S. Securities and Exchange Commission by former Facebook product manager Frances Haugen (and now others) are loaded with bombshells.
Forget what you thought you knew about Facebook; there is a beast hiding behind the nudges to “friend” people.
***
Collectively this effort has been dubbed “The Facebook Papers,” a nod to the Pentagon Papers, the leaked history of the Vietnam War.
They show a company in the clutches of right-wing extremists, with a distorted view of the world that lets insurrectionists, racists and disinformation artists in the door under the guise of free speech. Except when profits are threatened.
Company CEO Mark Zuckerberg told Congress that the company removes 94% of hate speech it finds before a human reports it. The documents reveal company researchers estimated the number was more like 5%. He also lied when telling Congress that it was “not at all clear” that social networks polarize people, when Facebook’s own researchers had repeatedly found that they do.
The aura of being a benign tech company has allowed Facebook to ignore its staff’s ‘house on fire’ warnings and mislead regulators and its own oversight board. Over and over again, evidence of the harm and devastating impacts of the social media giant is presented internally, and nothing happens.
In India, Myanmar, Sri Lanka, and Ethiopia, Facebook enabled the spread of disinformation, suggested visits to violent pages, and cultivated conspiracists. Human beings died by the thousands in uprisings, terror attacks, and government repression. Employees knew what was happening, tried to alert higher-ups, and were ignored.
As recently as last week, CNN was able to find posts on Facebook-owned Instagram that included photos of guns, along with photo and video posts in which people appear to have been shot or beheaded by a violent Mexican drug cartel known as Cártel Jalisco Nueva Generación.
Mark Zuckerberg controls a majority of Facebook’s stock, and what he says goes, even when it violates the company’s publicly stated principles.
According to documents seen by the Financial Times, Zuckerberg personally intervened to reinstate a false anti-abortion video removed by the company’s moderators after Republican politicians, including Texas Sen. Ted Cruz and Missouri Sen. Josh Hawley, accused the company of censorship. The video falsely insisted that abortion is “never medically necessary,” a claim that could prove fatal to women with placenta previa or HELLP syndrome.
When it comes to overseas authoritarian governments, the Facebook CEO is even more pliable.
From the Washington Post:
Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets.
In America, the tech CEO is a champion of free speech, reluctant to remove even malicious and misleading content from the platform. But in Vietnam, upholding the free speech rights of people who question government leaders could have come with a significant cost in a country where the social network earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International.
So Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on the condition of anonymity to describe internal company discussions. Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of “anti-state” posts, giving the government near-total control over the platform, according to local activists and free speech advocates.
The company has done extensive research aimed at producing strategies for reducing polarization, conspiracy theories, and incitements to violence. The tools to set things right were there, but executives often declined to deploy them.
A smattering of protective measures put in place in the run-up to the 2020 election were rolled back, allowing insurrectionists to spread the Big Lie to millions of people.
From a different article in the Washington Post:
Relief flowed through Facebook in the days after the 2020 presidential election. The company had cracked down on misinformation, foreign interference and hate speech — and employees believed they had largely succeeded in limiting problems that, four years earlier, had brought on perhaps the most serious crisis in Facebook’s scandal-plagued history.
“It was like we could take a victory lap,” said a former employee, one of many who spoke for this story on the condition of anonymity to describe sensitive matters. “There was a lot of the feeling of high-fiving in the office.”
Many who had worked on the election, exhausted from months of unrelenting toil, took leaves of absence or moved on to other jobs. Facebook rolled back many of the dozens of election-season measures that it had used to suppress hateful, deceptive content. A ban the company had imposed on the original Stop the Steal group stopped short of addressing dozens of look-alikes that popped up in what an internal Facebook after-action report called “coordinated” and “meteoric” growth. Meanwhile, the company’s Civic Integrity team was largely disbanded by a management that had grown weary of the team’s criticisms of the company, according to former employees.
But the high-fives, it soon became clear, were premature.
Motivations for January 6 were spread on Facebook. From CNN:
When Facebook executives posted messages publicly and internally condemning the riot, some employees pushed back, even suggesting Facebook might have had some culpability.
"There were dozens of Stop the Steal groups active up until yesterday, and I doubt they minced words about their intentions," one employee wrote in response to a post from Mike Schroepfer, Facebook's chief technology officer.
Another wrote, "All due respect, but haven't we had enough time to figure out how to manage discourse without enabling violence? We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."
Other Facebook employees went further, claiming decisions by company leadership over the years had helped create the conditions that paved the way for an attack on the US Capitol.
During and after the election, Facebook’s standards allowed political leaders to lie without facing the possibility of fact checks, despite research showing that misinformation shared by politicians was more damaging than misinformation coming from ordinary users.
Reporting in the Wall Street Journal reveals that efforts by employees to restrain the excesses of right-wing publishers were quashed by more senior executives. As a result, political considerations overrode other factors in the company’s decision-making. And somehow, these always favored the right’s information ecosystem.
To this day, Facebook’s algorithms continue to boost extremist content; a daily listing of top posts is dominated by “a small gaggle of arch-conservative frothers known more for provocation than accuracy.”
(Note: Posts by UNICEF are boosted by a Facebook Covid-19 info panel. And does right wing radio host Dan Bongino really have that big of a national audience? Out of all Facebook’s users?)
Even more disgusting --yes, there is no bottom here-- is Facebook’s continued failure to stop human trafficking on its Instagram and WhatsApp platforms. According to an internal paper published last year, the company knows its platforms enable “all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks."
Apple, responding to a BBC report showing how Facebook’s platforms foster the exploitation of domestic workers, threatened to remove Facebook and Instagram from its App Store in 2019.
***
The whistleblower documents reveal the company’s greatest challenge: its users are aging, and younger people are going elsewhere for their social media experiences.
From the Associated Press:
Young adults engage with Facebook far less than their older cohorts, seeing it as an “outdated network” with “irrelevant content” that provides limited value for them, according to a November 2020 internal document. It is “boring, misleading and negative,” they say.
In other words, the young see Facebook as a place for old people.
Facebook’s user base has been aging faster, on average, than the general population, the company’s researchers found. Unless Facebook can find a way to turn this around, its population will continue to get older and young people will find even fewer reasons to sign on, threatening the monthly user figures that are essential to selling ads. Facebook says its products are still widely used by teens, although it acknowledges there’s “tough competition” from TikTok, Snapchat and the like.
***
Solving the problem.
Senator Elizabeth Warren’s proposal to break up the tech giants (not just Facebook) will probably get some commentary this week. And it’s true that antitrust enforcement in recent decades has been abysmal.
From the New York Times:
Matt Stoller, a fellow at the Open Markets Institute in Washington and a former senior adviser to the Senate Budget Committee, said Ms. Warren’s plan was “practical” and “necessary.” He compared big tech companies to the tobacco monopolies of America’s past, which were eventually subjected to antitrust lawsuits.
“There’s been a traditional sense around the politics of D.C. that these companies are progressive,” Mr. Stoller said. “Their employees give to Democrats, they’re friendly to social liberalism, there’s an idealism to how they talk about the world. That’s been the traditional sense.”
“But these companies have the moral frame of Big Tobacco,” he added. “They don’t care.”
Given that Congress can’t agree to spend a fraction of what we spend on the military to give us a fighting chance to cope with climate change, or ensure that the most basic needs of the people are met, I don’t see a new trust bustin’ bill going anywhere any time soon.
An antitrust suit against Facebook came back from the dead last summer, as a newly revived Federal Trade Commission refiled a case that had been dismissed earlier.
The fundamental basis for antitrust changed during the Reagan years from “is it anticompetitive?” to “will it raise consumer prices?” This allows corporate lawyers to promise the moon when challenged on mergers, while the newly formed entity is held to little more than a few vague assurances.
Trust busting is complicated, and it can take many years to adjudicate. Most cases are settled, with the company involved paying a modest fine. The real power in antitrust lies in writing clauses into settlements that change corporate conduct. Nonetheless, we are way overdue for breaking up our excessive number of monopolies and duopolies. And tech is a great place to start.
So here’s the deal. If a settlement with Facebook allowed you to take your data to another company (think switching cell phone plans), chances are Zuckerberg would make his company more palatable.
(I know this is a short version of what needs to happen, but this piece is long enough already. And, yes, I am aware of the irony involved in posting this story on my Facebook feed.)
Hey folks! Be sure to like/follow Words & Deeds on Facebook. If you’d like to have each post mailed to you, check out the simple subscription form on the right side of the front page.
Email me at WritetoDougPorter@Gmail.com