
These Are the Most Damning Bits From ‘The Facebook Papers’



Facebook is no stranger to public relations crises, whistleblowers, or even challenges to the democratic process. The company appears to have an agenda beyond its stated mission to “give people the power to build community and bring the world closer together.”

It now appears that, along with dropping the “It’s free and always will be” slogan from its homepage, it has also become clear that “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.” – Facebook whistleblower Frances Haugen

Leaked documents began appearing in the Wall Street Journal, and sensational revelations started drawing the attention of lawmakers worldwide.

The “Facebook Papers,” however, and the many stories certainly still to come from their release into the public sphere, touch on much deeper concerns about Facebook as a whole. Facebook’s approach to fighting hate speech and misinformation, managing its global growth, protecting younger users on its platform, and even its ability to accurately measure the size of its massive audience are all now under intense scrutiny.

As we watch this huge company bob and weave away from such allegations, one thing remains very apparent: Facebook has become too big to fail. The question we have to ask is whether it is actually capable of handling the real-world harms that flow from its enormous platforms.

From Facebook – It must be rough when your own platform has so many negative things to embed.

These Are the Most Damning Bits From ‘The Facebook Papers’ – Name Change Won’t Help


Thousands of internal documents are driving a virtual avalanche of damning news reports on what critics describe as the cruel, profit-focused machine that is social-media giant Facebook.

Now known as “The Facebook Papers,” the redacted documents, memos, presentations, internal discussion threads and charts were obtained by 17 news organizations, and include a slew of new revelations about the company’s internal decisions. They also paint a harsh portrait of reluctance to make changes that would address known issues, including the proliferation of harmful content and hate speech on the platform.

The documents, a combination of Securities and Exchange Commission (SEC) disclosures and leaked documents by way of whistleblower Frances Haugen, appear to have rattled the company. Among other responses, the brand is reportedly expected to soon announce a name change that critics say reflects efforts to circumvent responsibility for harm.

Spokesperson Andy Stone said that the stories painted a false picture of a company harming its users to make a profit.

“At the heart of these stories is a premise which is false,” he said in a statement. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.”

Here are some of the most damning allegations to emerge from the papers so far.


Hate Speech at the Core?

According to The New York Times, in an August 2019 internal memo, a group of Facebook researchers identified the company’s “core product mechanics” including features used to optimize engagement, as a “significant part” of why misinformation and hate speech flourished on the platform.

The “Like” and “Share” buttons could “serve to amplify bad content and sources,” another internal study in September 2020 showed, according to the documents. But despite those findings, Facebook CEO Mark Zuckerberg and other executives have largely shied away from changing the platform’s core features to prevent the proliferation of hate speech, although they did test hiding the Like button to “build a positive press narrative” around Instagram, amid separate findings about anxiety in teens, documents show.

That mentality—which Haugen, who testified before Congress, has described as putting “profit over safety”— also fueled internal discussions over the company’s lopsided approach to content moderation.

Facebook has pushed back on the criticism, arguing that the company invested $13 billion in safety and hired more than 40,000 workers focused on it, according to Stone.

Not All Countries Created Equal

According to The Verge, the documents reveal an internal system that split up the world’s countries into tiers that ultimately prioritized protecting users in some countries over those in others.

The countries that were deemed the highest priority and received the top network-monitoring resources included Brazil, India, and the United States, which belonged in what the company called “tier zero.” For these countries, Facebook built “war rooms” to monitor the network and alert local election officials about potential problems with false claims online.

Germany, Indonesia, Iran, Israel, and Italy were slotted into tier one, and were given similar resources, but with less enforcement.

Another 22 countries were grouped into tier two, with fewer resources. The remainder of the world’s countries were relegated to Facebook’s third tier, where the company would reportedly only intervene if election-related content was brought to its attention by moderators.
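To make the reported structure concrete, here is a minimal sketch, in Python, of how such a tiered prioritization scheme could be represented. The tier assignments follow the reporting above; the data structure, field names, and resource labels are hypothetical illustrations, not Facebook’s actual system.

```python
# Hypothetical sketch of the tiered election-monitoring scheme described above.
# Tier membership follows the reporting; the structure and resource labels are invented.
ELECTION_TIERS = {
    0: {"countries": {"Brazil", "India", "United States"},
        "resources": ["war rooms", "network monitoring", "alerts to local election officials"]},
    1: {"countries": {"Germany", "Indonesia", "Iran", "Israel", "Italy"},
        "resources": ["network monitoring", "alerts"]},  # similar tooling, less enforcement
    2: {"countries": set(),   # 22 further countries, not named in the reporting
        "resources": ["network monitoring"]},
    3: {"countries": set(),   # the rest of the world
        "resources": []},     # intervene only when moderators escalate election content
}

def tier_for(country: str) -> int:
    """Return the tier a country would fall into under this sketch (default: lowest)."""
    for tier, info in ELECTION_TIERS.items():
        if country in info["countries"]:
            return tier
    return 3

for name in ("Brazil", "Italy", "Norway"):
    t = tier_for(name)
    print(f"{name}: tier {t}, resources = {ELECTION_TIERS[t]['resources']}")
```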

An absence of machine-learning classifiers built to recognize hate speech and misinformation in different languages allowed posts that incited violence to spread in countries like Myanmar, Pakistan, and Ethiopia, the outlet found.

Zuckerberg Accused of Caving to Vietnam Government

Zuckerberg opted to allow Vietnam’s ruling Communist Party to censor “anti-state” posts, effectively handing over control of the platform to the government, sources told The Washington Post. That decision was reportedly made after the Vietnamese government threatened to kick Facebook off its internet.

Facebook said it made the move “to ensure our services remain available for millions of people who rely on them every day.”

About That Big Lie

According to Politico, the company fumbled on building a clear strategy for combating content aimed at delegitimizing election results in the United States. That failure, with Trump’s Big Lie powering a post-election insurrection that culminated in the Jan. 6 riot, proved to be a serious problem.

Many of the offending posts were flagged as “harmful non-violating narratives,” meaning they did not break the company’s community rules, documents reviewed by the outlet showed. But employees expressed outrage on internal message boards over efforts by company leadership on Jan. 6 to bypass common-sense changes “to better serve people like the groups inciting violence,” according to one employee, who added: “Rank and file workers have done their part to identify changes to improve our platform but have been actively held back.”

Human-Trafficking

According to CNN, internal Facebook communications described how women were trafficked on its platform, some of them enduring sexual abuse and kept from escaping while going without food or pay.

In 2018, Facebook employees flagged Instagram profiles that appeared to sell domestic laborers, but internal documents reviewed by the outlet from September 2019 showed little effort by the company to address the problem.

After a threat from Apple to remove the app from its store in 2019, Facebook made some efforts to remove the content, but the company continues to be plagued by domestic servitude content, the report found.

In February, internal researchers said in a report that labor-recruitment agencies communicated with victims via direct messages and that the social-media platform needed more “robust proactive detection methods” in order to prevent recruitment, CNN said.

In a letter to the United Nations on the subject last year, Facebook said it was working to develop technology to address “domestic servitude,” and also insisted that cases of servitude were “rarely reported to us by users.”

Read more at The Daily Beast.


From our friends at: uk.news.yahoo.com

Facebook attempts to turn the page

Facebook, for its part, has consistently tried to discredit Haugen and has said her testimony and the reports on the documents mischaracterize its actions and efforts.
“At the heart of these stories is a premise which is false,” a Facebook spokesperson said in a statement to CNN. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.”
In a tweet thread last week, the company’s vice president of communications, John Pinette, called the Facebook Papers a “curated selection out of millions of documents at Facebook” which “can in no way be used to draw fair conclusions about us.” Even that response is telling: if Facebook has more documents that would tell a fuller story, why not release them? (During her Senate testimony, Facebook’s Davis said Facebook is “looking for ways to release more research.”)
A trove of internal Facebook documents leaked by whistleblower Frances Haugen has kicked off a wave of coverage of the company, beginning with the Wall Street Journal’s “Facebook Files” and continuing as a consortium of other news organizations rolls out stories based on the same documents.
Instead, Facebook is now reportedly preparing to rebrand itself under a new name as early as this week, as the wave of critical coverage continues. (Facebook previously declined to comment on this report.) The move appears to be a clear attempt to turn the page, but a fresh coat of paint won’t fix the underlying issues laid out in the documents; only Facebook, or whatever it may soon be called, can do that.
Take the example of a report published by the Journal on September 16 that highlighted internal Facebook research about a violent Mexican drug cartel, the Cártel Jalisco Nueva Generación. The cartel was said to be using the platform to post violent content and recruit new members under the acronym “CJNG,” even though it had been designated internally as one of the “Dangerous Individuals and Organizations” whose content should be removed. Facebook told the Journal at the time that it was investing in artificial intelligence to bolster its enforcement against such groups.
Despite the Journal’s report last month, CNN recently identified disturbing content connected to the group on Instagram, including pictures of guns, and photo and video posts in which people appear to have been shot or beheaded. After CNN asked Facebook about the posts, a spokesperson confirmed that multiple videos CNN flagged were removed for violating the company’s policies, and a warning label was added to at least one post.


Haugen has suggested Facebook’s failure to fix such issues is partly because it prioritizes profit over societal good, and, in some cases, because the company lacks the capacity to put out its many fires at once.
“Facebook is incredibly thinly staffed … and this is because there are a lot of technologists that look at what Facebook has done and their unwillingness to accept responsibility, and people just aren’t willing to work there,” Haugen said in a briefing with the “Facebook Papers” consortium last week. “So they have to make very, very, very intentional choices on what does or does not get accomplished.”
Facebook has invested a total of $13 billion since 2016 to improve the safety of its platforms, according to the company spokesperson. (By contrast, the company’s annual revenue topped $85 billion in 2020 and its profit hit $29 billion.) The spokesperson also said Facebook has “40,000 people working on the safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community.”
“We have also removed over 150 networks seeking to manipulate public debate since 2017, and they have come from over 50 countries, with the majority coming from or focused outside of the US,” the spokesperson said. “Our track record shows that we crack down on abuse outside the US with the same intensity that we apply in the US.”
Still, the documents suggest that the company has far more work to do to root out the many harms they describe and to address the unintended consequences of Facebook’s extraordinary reach and integration into our everyday lives.

How Facebook Is Attempting To Retain Users

Facebook executives recently admitted that younger teenagers are abandoning the site for newer mobile messaging and social-sharing apps, while a study from earlier this year found that the social network lost 11 million active users overall in the U.S. and Britain. Facebook is reportedly weighing options to keep its existing users and win back those who have defected.

Zuckerberg’s public claims often contrast with internal research.

Haugen references Zuckerberg’s public statements at least 20 times in her SEC complaints, asserting that the CEO’s unique degree of control over Facebook means he bears ultimate responsibility for a litany of societal harms caused by the company’s relentless pursuit of growth.

The documents also show that Zuckerberg’s public statements are often at odds with internal company findings.

Zuckerberg testified before Congress last year that the company removes 94 percent of the hate speech it finds. In internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook. The two figures rest on different denominators: removing 94 percent of the hate speech the company detects still amounts to a small share of the total if most hate speech is never detected in the first place, which is what the internal estimate implies.

Facebook spokeswoman Dani Lever denied that Zuckerberg “makes decisions that cause harm” and dismissed the findings, saying they are “based on selected documents that are mischaracterized and devoid of any context.”

It isn’t clear whether the SEC is investigating Facebook or whether it would see enough material in the disclosures to warrant an investigation into whether the company may have misled investors. The SEC declined to comment. The commission isn’t required to act on whistleblower tips, and when it conducts investigations, it does so on a confidential basis as a matter of policy. In an annual report, the SEC said it received over 6,900 whistleblower tips in the year ending September 2020.

Several securities law experts said it wouldn’t be simple to prove wrongdoing.

“Regulators like clean cases and they like where somebody is on tape doing something wrong,” said Joshua Mitts, a securities law professor at Columbia University. Haugen’s allegations are hardly a “clean case,” he said.

Facebook pushback

Facebook’s public relations chief last week said Haugen’s disclosures were an “orchestrated ‘gotcha’ campaign” directed by her public relations advisors.

“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook’s vice president for communications, John Pinette, said in a tweet ahead of the release of the Haugen disclosures.

“Internally, we share work in progress and debate options. Not every suggestion stands up to the scrutiny we must apply to decisions affecting so many people,” Pinette said.

Haugen has gotten help from experienced lawyers and public relations advisers. A firm run by Bill Burton, a former Obama White House spokesperson, is handling media requests, and Haugen is represented by lawyers from Whistleblower Aid, a nonprofit organization.

The disclosures made by Haugen’s lawyers show a roiling internal debate at Facebook at the same time the company has been under an intense external spotlight, with congressional hearings, privacy investigations, antitrust lawsuits, and other scrutiny from outsiders.

And the upheaval may pose a bigger risk than any external scrutiny, because Facebook’s success depends on its ability to attract and retain some of the world’s leading software engineers and technologists. If the company can’t attract, retain, and motivate talented employees, it may lose its ability to compete effectively, it said in its most recent annual report in January.

A Facebook employee wrote on an internal message board on Jan. 6: “We have been dealing with questions we can’t answer from our friends, family, and industry colleagues for years. Hiring, in particular, has gotten harder over the years as Facebook’s ethical reputation continues to deteriorate (all while our technical reputation continues to increase).”

Facebook said in a statement that 83 percent of its employees say they’d recommend it as a great place to work and that it has hired more people this year than in any previous year.

Triggering a ‘social-civil war’

Another set of Haugen’s documents describes how the computer algorithm behind Facebook’s news feed, the formula that determines which posts people see and in what order, produced unintended effects over months and years.

In January 2018, Facebook announced that it would rewrite the algorithm, saying it would emphasize “meaningful social interactions” and give more weight to comments, reactions, and re-shares among friends, rather than posts from brands and businesses.

By the next year, the changes had reverberated through European politics.

Facebook was responsible for a “social-civil war” in online political discourse in Poland, a Facebook employee wrote, relaying a phrase from conversations with political operatives there. (The employee doesn’t name the political parties or the operatives involved in the “social-civil war,” or say which issues were at the forefront.) Extremist political parties in various countries celebrated the way the new algorithm rewarded their “justification methods” on topics such as migration, the employee wrote.

Studying the effects of the algorithm change became a priority for the many economists, statisticians, and other researchers at Facebook who study the platform, the documents show. A study posted internally in December 2019 said Facebook’s algorithms “are not neutral” but instead value content that will get a reaction, any reaction, with the result that “outrage and misinformation are more likely to be viral.”

“We know that many things that generate engagement on our platform leave users divided and depressed,” wrote the researcher, whose name was redacted.
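For readers who want to see the mechanism rather than just the conclusion, here is a minimal, purely illustrative Python sketch of how an interaction-weighted ranking score can end up favoring provocative posts. The interaction types, weights, and example posts are hypothetical assumptions for illustration; this is not Facebook’s actual formula or data.

```python
# Illustrative sketch only: a hypothetical interaction-weighted feed score.
# Weighting comments, reactions, and re-shares far above passive views means the feed
# "values" whatever provokes a response, regardless of why people respond.
WEIGHTS = {"view": 0.0, "like": 1.0, "reaction": 5.0, "comment": 15.0, "reshare": 30.0}

def engagement_score(interactions: dict) -> float:
    """Score a post as a weighted sum of the interactions it has received."""
    return sum(WEIGHTS.get(kind, 0.0) * count for kind, count in interactions.items())

# Two hypothetical posts shown to the same number of people:
posts = {
    "calm_update":  {"view": 10_000, "like": 300, "comment": 20, "reshare": 5},
    "outrage_bait": {"view": 10_000, "like": 150, "reaction": 400, "comment": 250, "reshare": 120},
}

for name, interactions in sorted(posts.items(), key=lambda p: engagement_score(p[1]), reverse=True):
    print(f"{name}: {engagement_score(interactions):,.0f}")
# outrage_bait: 9,500
# calm_update: 750
```

Under a weighting like this, the post that provokes the most comments and re-shares dominates the ranking even if the responses are angry ones, which is the dynamic the internal research describes.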

Possible repercussions

Some securities law experts said claims like Haugen’s would not necessarily trigger an SEC investigation.

“Do they really go to the core of what the SEC is required to police?” asked Charles Clark, a former assistant director of the SEC’s enforcement division, who said parts of the allegations didn’t appear to clearly violate securities law. “Some of what she’s complaining about is important to Congress and is important to the world at large but isn’t really tied to the mandate of the SEC.”

Clark added, however, that one of Haugen’s allegations, that Facebook may be inflating user counts and other metrics important to advertisers, “is the type of matter that the SEC has focused on for years.”

Securities law experts also don’t rule out action by the SEC. Harvey Pitt, a former SEC chair, said that he believes Haugen’s allegations are credible and that the commission should examine whether Facebook met its legal obligations in making disclosures to investors.



There is much more to come involving the Facebook Papers, the whistleblower, and a public relations nightmare that now touches the integrity of American democracy. You can be certain that Facebook is simply too big to fail; it always finds a way out of whatever trouble it gets into. We need to start asking why that is.

Please leave a like and share if you found this article enjoyable.
