Monday, October 4, 2021

Throwing The (Face)Book At Them, Con't

Last night's 60 Minutes report on Facebook, featuring corporate whistleblower Frances Haugen revealing that the company is a willing and integral part of the GOP disinformation machine, is shocking only if you haven't been paying attention to Silly Con Valley techbros over the last decade.

She secretly copied tens of thousands of pages of Facebook internal research. She says evidence shows that the company is lying to the public about making significant progress against hate, violence and misinformation. One study she found, from this year, says, "we estimate that we may action as little as 3-5% of hate and about 6-tenths of 1% of V & I [violence and incitement] on Facebook despite being the best in the world at it."

Scott Pelley: To quote from another one of the documents you brought out, "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world."

Frances Haugen: When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.

'Ethnic violence' including Myanmar in 2018, when the military used Facebook to launch a genocide.

Frances Haugen told us she was recruited by Facebook in 2019. She says she agreed to take the job only if she could work against misinformation because she had lost a friend to online conspiracy theories.

Frances Haugen: I never wanted anyone to feel the pain that I had felt. And I had seen how high the stakes were in terms of making sure there was high quality information on Facebook.

At headquarters, she was assigned to Civic Integrity which worked on risks to elections including misinformation. But after this past election, there was a turning point.

Frances Haugen: They told us, "We're dissolving Civic Integrity." Like, they basically said, "Oh good, we made it through the election. There wasn't riots. We can get rid of Civic Integrity now." Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, "I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous."

Facebook says the work of Civic Integrity was distributed to other units. Haugen told us the root of Facebook's problem is in a change that it made in 2018 to its algorithms—the programming that decides what you see on your Facebook news feed.

Frances Haugen: So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you.

The algorithm picks from those options based on the kind of content you've engaged with the most in the past.

Frances Haugen: And one of the consequences of how Facebook is picking out that content today is it is -- optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions.

Scott Pelley: Misinformation, angry content-- is enticing to people and keep--

Frances Haugen: Very enticing.

Scott Pelley:--keeps them on the platform.

Frances Haugen: Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money.

Haugen says Facebook understood the danger to the 2020 election. So, it turned on safety systems to reduce misinformation—but many of those changes, she says, were temporary.

Frances Haugen: And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety.

And that really feels like a betrayal of democracy to me.

Facebook says some of the safety systems remained. But, after the election, Facebook was used by some to organize the January 6th insurrection. Prosecutors cite Facebook posts as evidence--photos of armed partisans and text including, "by bullet or ballot restoration of the republic is coming!" Extremists used many platforms, but Facebook is a recurring theme.

After the attack, Facebook employees raged on an internal message board copied by Haugen. "…Haven't we had enough time to figure out how to manage discourse without enabling violence?" We looked for positive comments and found this, "I don't think our leadership team ignores data, ignores dissent, ignores truth…" but that drew this reply, "welcome to Facebook! I see you just joined in November 2020… we have been watching… wishy-washy actions of company leadership for years now." "…Colleagues… cannot conscience working for a company that does not do more to mitigate the negative effects of its platform."

Scott Pelley: Facebook essentially amplifies the worst of human nature.
 
The profit motive for Facebook is to continue to spread GOP disinformation to willing consumers who eat up every last bit of it and spread it around. Facebook makes billions off clicks, ads, and targeted posts. They don't care who gets hurt as long as they make money.
 
In effect, Facebook wants an American authoritarian propaganda state. They'd have a captive audience of hundreds of millions and would make hundreds of billions.

They have no reason to stop the GOP disinformation, and every reason to continue to amplify it.
