Facebook is no stranger to public relations crises, whistleblowers, and even clashes with the democratic process. Lately, though, it has come under a different kind of scrutiny, one that has confirmed what many have suspected for years: Facebook has an agenda beyond its stated mission to “give people the power to build community and bring the world closer together.”
It now seems that, in addition to dropping the “It’s free and always will be” slogan from its homepage, it has also become clear, in the words of whistleblower Frances Haugen, that “Facebook misled investors and the public about its role in perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection.”
Leaked documents began appearing in the Wall Street Journal, and startling revelations began to catch the eyes of legislators all over the world.
The “Facebook Papers,” however, and the many stories certainly still to come from their introduction into the public sphere, touch on much deeper issues with Facebook as a whole. Facebook’s approach to combating hate speech and misinformation, managing international growth, protecting younger users on its platform, and even its ability to accurately measure the size of its enormous audience are all now under serious scrutiny.
As we watch this massive company dodge and weave away from such accusations, one thing remains very apparent: Facebook has become too big to fail! The question we have to ask is … is it actually capable of handling the “real-world” harms from its staggeringly large platforms?
From Facebook’s perspective, it must be rough when your own platform hosts so many negative things about you to embed.
I’ve spent the past several weeks reading the Facebook Papers, a gigantic collection of internal documents from Facebook unlike anything I’ve encountered. A few observations:
— Adrienne LaFrance (@AdrienneLaF) October 25, 2021
Facebook attempts to turn the page
Facebook, for its part, has repeatedly tried to discredit Haugen, saying her testimony and the reports on the documents mischaracterize its efforts and actions.
“At the heart of these stories is a premise which is false,” a Facebook spokesperson said in a statement to CNN. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.”
In a tweet thread recently, the company’s Vice President of Communications, John Pinette, called the Facebook Papers a “curated selection out of millions of documents at Facebook” which “can in no way be used to draw fair conclusions about us.” But even that response is telling: if Facebook has more documents that would tell a fuller story, why not release them? (During her Senate testimony, Facebook’s Davis said Facebook is “looking for ways to release more research.”)
A trove of internal Facebook documents leaked by whistleblower Frances Haugen has set off a wave of coverage of the company, starting with the Wall Street Journal’s “Facebook Files” and continuing now as a consortium of other news organizations publishes stories on the same documents.
Instead, Facebook is now reportedly preparing to rebrand itself under a new name as early as this week, as the wave of critical coverage continues. (Facebook previously declined to comment on this report.) The move appears to be a clear attempt to turn the page, but a fresh coat of paint won’t fix the underlying problems outlined in the documents; only Facebook, or whatever it may soon be called, can do that.
Take the example of a report published by the Journal on September 16 that highlighted internal Facebook research about a violent Mexican drug cartel, known as Cartél Jalisco Nueva Generación. The cartel was said to be using the platform to post violent content and recruit new members under the acronym “CJNG,” even though it had been designated internally as one of the “Dangerous Individuals and Organizations” whose content must be removed. Facebook told the Journal at the time that it was investing in artificial intelligence to bolster its enforcement against such groups.
Despite the Journal’s report last month, CNN recently identified disturbing content linked to the group on Instagram, including images of guns, and photo and video posts in which people appear to have been shot or beheaded. After CNN asked Facebook about the posts, a spokesperson confirmed that multiple videos CNN flagged were removed for violating the company’s policies, and at least one post had a warning added.
Facebook knew it was being used to incite violence in Ethiopia. It did little to stop the spread, documents show.
Haugen has suggested Facebook’s failure to fix such problems stems in part from its prioritizing profit over social good, and, in some cases, from the company lacking the capacity to put out its many fires at once.
“Facebook is very thinly staffed … and this is because there are a lot of technologists who look at what Facebook has done and its unwillingness to accept responsibility, and people just aren’t willing to work there,” Haugen said in a briefing with the “Facebook Papers” consortium recently. “So they have to make very, very, very deliberate choices on what does or does not get accomplished.”
Facebook has invested a total of $13 billion since 2016 to improve the safety of its platforms, according to the company spokesperson. (By contrast, the company’s annual revenue topped $85 billion in 2020, and its profit hit $29 billion.) The spokesperson also said Facebook has “40,000 people working on safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community.”
“We have also taken down over 150 networks seeking to manipulate public debate since 2017, and they have originated in over 50 countries, with the majority coming from or focused outside of the United States,” the spokesperson said. “Our track record shows that we crack down on abuse outside the US with the same intensity that we apply in the US.”
Still, the documents suggest that the company has far more work to do to eliminate the many harms they detail and to address the unintended consequences of Facebook’s extraordinary reach and integration into our daily lives.
How Facebook is trying to keep users
Facebook executives recently admitted that younger teens are abandoning the site for newer mobile messaging and social sharing apps, while a study from earlier this year found that the social network has lost 11 million active users overall in the U.S. and Britain. Facebook is weighing a number of options to keep its existing users and win back those who have defected.
Zuckerberg’s public claims frequently contradict internal research
Haugen references Zuckerberg’s public statements at least 20 times in her SEC complaints, asserting that the CEO’s unique degree of control over Facebook means he bears ultimate responsibility for a list of societal harms caused by the company’s relentless pursuit of growth.
The documents also show that Zuckerberg’s public statements are often at odds with internal company findings.
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds. In internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook.
Facebook spokeswoman Dani Lever denied that Zuckerberg “makes decisions that cause harm” and dismissed the findings, saying they are “based on selected documents that are mischaracterized and devoid of any context.”
It isn’t clear whether the SEC is investigating Facebook, or whether it would see enough in the disclosures to warrant an investigation into whether the company may have misled investors. In an annual report, the SEC said it received over 6,900 whistleblower tips in the fiscal year ending September 2020.
Several securities law experts said it would not be easy to prove wrongdoing.
“Regulators like clean cases and they like where somebody is on tape doing something wrong,” said Joshua Mitts, a securities law professor at Columbia University. Haugen’s accusations are hardly a “clean case,” he said.
Facebook’s public relations chief recently called Haugen’s disclosures an “orchestrated ‘gotcha’ campaign” directed by her public relations advisers.
“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook’s vice president for communications, John Pinette, said in a tweet ahead of the release of the Haugen disclosures.
“Internally, we share work in progress and debate options. Not every suggestion stands up to the scrutiny we must apply to decisions affecting so many people,” Pinette said.
Haugen has received support from experienced public relations and legal advisers. A firm run by Bill Burton, a former Obama White House spokesperson, is handling media requests, and Haugen is represented by lawyers from Whistleblower Aid, a nonprofit organization.
The disclosures made by Haugen’s lawyers highlight a roiling internal debate at Facebook at the very moment it has been under an intense external spotlight, with congressional hearings, privacy investigations, antitrust lawsuits, and other outside scrutiny.
And the upheaval may pose a larger risk than any external investigation, because Facebook’s success depends on its ability to attract and retain some of the world’s top software engineers and technologists. If the company can’t attract, retain, and motivate talented employees, it could lose its ability to compete effectively, it said in its most recent annual report in January.
A Facebook employee wrote on an internal message board on Jan. 6: “We have been fielding questions we can’t answer from our friends, family, and industry colleagues for years. Recruiting, in particular, has gotten harder over the years as Facebook’s ethical reputation continues to deteriorate (all while our technical reputation continues to improve).”
Facebook said in a statement that 83 percent of its employees say they’d recommend it as a great place to work, and that it has hired more employees this year than in any previous year.
Causing ‘social-civil war’
Another set of Haugen’s documents describes how the algorithm behind Facebook’s news feed, the formula that determines which posts people see and in what order, produced unintended consequences over months and years.
Facebook announced that it would rewrite the algorithm in January 2018, saying it would emphasize “meaningful social interactions” and give more weight to comments, reactions, and re-shares among friends, rather than posts from companies and brands.
By the next year, the changes had reverberated throughout European politics.
Facebook was responsible for a “social-civil war” in online political discourse in Poland, the person said, relaying a phrase from conversations with political operatives there. (The Facebook employee does not name the political parties or the operatives involved in the “social-civil war,” or say which issues were at the forefront.) Extremist political parties in various countries celebrated the way the new algorithm rewarded their “provocation strategies” on topics such as immigration, the Facebook employee wrote.
Studying the impact of the algorithm change became a priority for the many economists, statisticians, and others at Facebook who study the platform, the documents show. A study posted internally in December 2019 said Facebook’s algorithms “are not neutral” but instead value content that will get a reaction, any reaction, with the result that “outrage and misinformation are more likely to be viral.”
“We know that many things that generate engagement on our platform leave users divided and depressed,” wrote the researcher, whose name was redacted.
Some securities law experts said accusations like Haugen’s wouldn’t necessarily trigger an SEC investigation.
“Do they really go to the core of what the SEC is required to police?” asked Charles Clark, a former assistant director of the SEC’s enforcement division, who said parts of the allegations didn’t appear to clearly violate securities law. “Some of what she’s complaining about is important to Congress and is important to the world at large but isn’t really tied to the mandate of the SEC.”
Clark added, however, that one of Haugen’s accusations, that Facebook is potentially inflating user counts and other metrics crucial to advertisers, “is the kind of matter that the SEC has focused on for many years.”
Securities law experts also don’t rule out action by the SEC. Harvey Pitt, a former SEC chair, said that he finds Haugen’s claims credible and that the commission should examine whether Facebook met its legal obligations in its disclosures to investors.
There is much more to come involving the Facebook Papers, the whistleblower, and the public relations nightmare that now touches the integrity of American democracy. You can be certain that Facebook is simply too big to fail; it always finds a way out of whatever trouble it gets into. It is worth asking why that is.
Please leave a like and share if you found this article enjoyable.
Facebook froze as anti-vaccine comments swarmed users – Facebook in Big Tech Trouble
By DAVID KLEPPER and AMANDA SEITZ
WASHINGTON (AP) — In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.
By altering how posts about vaccines are ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.
“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study.
Instead, Facebook shelved some suggestions from the study. Other changes weren’t made until April.
When another Facebook researcher suggested disabling comments on vaccine posts in March until the platform could do a better job of tackling anti-vaccine messages lurking in them, that proposal was ignored.
Critics say the reason Facebook was slow to take action on the ideas is simple: The tech giant worried it might impact the company’s profits.
“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”
In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.
Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal rank-and-file employees regularly suggested solutions for countering anti-vaccine content on the site, to no avail. The Wall Street Journal reported on some of Facebook’s efforts to deal with anti-vaccine comments last month.
Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.
“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”
Typically, Facebook ranks posts by engagement — the total number of likes, dislikes, comments, and reshares. That ranking scheme may work well for innocuous subjects like recipes, dog photos, or the latest viral singalong. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement, and doubt.
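The difference between the two ranking schemes the article contrasts can be sketched in a few lines of Python. Everything here is hypothetical: the `Post` fields, the trust scores, and both scoring rules are illustrative stand-ins, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    reshares: int
    source_trust: float  # hypothetical 0..1 score, e.g. high for WHO/CDC

def engagement_score(p: Post) -> int:
    # Engagement-based ranking: every reaction counts the same,
    # whether it is approval, outrage, or doubt.
    return p.likes + p.comments + p.reshares

def rank_by_engagement(posts):
    return sorted(posts, key=engagement_score, reverse=True)

def rank_by_trust(posts):
    # Trustworthiness-based ranking, as in the internal experiment:
    # surface authoritative sources first, regardless of popularity.
    return sorted(posts, key=lambda p: p.source_trust, reverse=True)

posts = [
    Post("viral-rumor", likes=900, comments=400, reshares=300, source_trust=0.1),
    Post("WHO", likes=50, comments=10, reshares=20, source_trust=0.95),
]

print([p.author for p in rank_by_engagement(posts)])  # rumor outranks WHO
print([p.author for p in rank_by_trust(posts)])       # WHO outranks rumor
```

Under the first ordering, a divisive post with many angry comments wins; under the second, the authoritative source does. That is, in miniature, the trade-off the internal researchers were testing.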
To study ways to reduce vaccine misinformation, Facebook researchers changed how posts are ranked for more than 6,000 users in the U.S., Mexico, Brazil, and the Philippines. Instead of seeing posts about vaccines that were chosen based on their popularity, these users saw posts selected for their trustworthiness.
The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers and an 8% increase in content from authoritative public health organizations such as the WHO or U.S. Centers for Disease Control. Those users also had a 7% decrease in negative interactions on the site.
Employees at the company reacted to the study with exuberance, according to internal exchanges included in the whistleblower’s documents.
“Is there any reason we wouldn’t do this?” one Facebook employee wrote in response to an internal memo outlining how the platform could rein in anti-vaccine content.
Facebook said it did implement many of the study’s findings — but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.
In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”
The company also said it took time to consider and implement the changes.
Yet the need to act urgently couldn’t have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable — the elderly and sick. And public health officials were worried. Only 10% of the population had received their first dose of a COVID-19 vaccine. And a third of Americans were thinking about skipping the shot entirely, according to a poll from The Associated Press-NORC Center for Public Affairs Research.
Despite this, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine reluctant.
“That’s a huge problem and we need to fix it,” the presentation on March 9 read.
Even worse, company employees admitted they didn’t have a handle on catching those comments. And if they did, Facebook didn’t have a policy in place to take the comments down. The free-for-all was allowing users to swarm vaccine posts from news outlets or humanitarian organizations with negative comments about vaccines.
“Our ability to detect (vaccine hesitancy) in comments is bad in English — and basically non-existent elsewhere,” another internal memo posted on March 2 said.
Los Angeles resident Derek Beres, an author and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes immunizations on his accounts on Instagram, which is owned by Facebook. Last year, Beres began hosting a podcast with friends after they noticed conspiracy theories about COVID-19 and vaccines were swirling on the social media feeds of popular health and wellness influencers.
Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months’ time.
“The comments section is a dumpster fire for so many people,” Beres said.
Anti-vaccine comments on Facebook grew so bad that even as prominent public health agencies like UNICEF and the World Health Organization were urging people to take the vaccine, the organizations refused to use free advertising that Facebook had given them to promote inoculation, according to the documents.
Some Facebook employees had an idea. While the company worked to hammer out a plan to curb all the anti-vaccine sentiment in the comments, why not disable commenting on posts altogether?
“Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2.
The suggestion went nowhere.
Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would start labeling posts about vaccines that described them as safe.
The move allowed Facebook to continue to get high engagement — and ultimately profit — off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.
“They were trying to find ways to not reduce engagement but at the same time make it look like they were trying to make some moves toward cleaning up the problems that they caused,” he said.
It’s unrealistic to expect a multi-billion-dollar company like Facebook to voluntarily change a system that has proven to be so lucrative, said Dan Brahmy, CEO of Cyabra, an Israeli tech firm that analyzes social media networks and disinformation. Brahmy said government regulations may be the only thing that could force Facebook to act.
“The reason they didn’t do it is because they didn’t have to,” Brahmy said. “If it hurts the bottom line, it’s undoable.”
Bipartisan legislation in the U.S. Senate would require social media platforms to give users the option of turning off algorithms tech companies use to organize individuals’ newsfeeds.
Sen. John Thune, R-South Dakota, a sponsor of the bill, asked Facebook whistleblower Haugen to describe the dangers of engagement-based ranking during her testimony before Congress earlier this month.
She said there are other ways of ranking content — for instance, by the quality of the source, or chronologically — that would serve users better. The reason Facebook won’t consider them, she said, is that they would reduce engagement.
“Facebook knows that when they pick out the content … we spend more time on their platform, they make more money,” Haugen said.
Haugen’s leaked documents also reveal that a relatively small number of Facebook’s anti-vaccine users are rewarded with big pageviews under the tech platform’s current ranking system.
Internal Facebook research presented on March 24 warned that most of the “problematic vaccine content” was coming from a handful of areas on the platform. In Facebook communities where vaccine distrust was highest, the report pegged 50% of anti-vaccine pageviews on just 111, or 0.016%, of Facebook accounts.
“Top producers are mostly users serially posting (vaccine hesitancy) content to feed,” the research found.
On that same day, the Center for Countering Digital Hate published an analysis of social media posts that estimated just a dozen Facebook users were responsible for 73% of anti-vaccine posts on the site between February and March. It was a study that Facebook’s leaders in August told the public was “faulty,” despite the internal research published months before that confirmed a small number of accounts drive anti-vaccine sentiment.
Earlier this month, an AP-NORC poll found that most Americans blame social media companies, like Facebook, and their users for misinformation.
But Ahmed said Facebook shouldn’t just shoulder blame for that problem.
“Facebook has taken decisions which have led to people receiving misinformation which caused them to die,” Ahmed said. “At this point, there should be a murder investigation.”
Seitz reported from Columbus, Ohio.
See full coverage of the “The Facebook Papers” here: https://apnews.com/hub/the-facebook-papers
From our friends at: apnews.com