
Interview. A year away from the 2024 elections, Facebook whistleblower Frances Haugen speaks out about what progress, if any, we’ve made in regulating social media.

Prisoners of the algorithms

On the eve of an American election season that promises to be more heated than ever, and marked by “deep disinformation,” social media platforms are reversing course on policing fake news. Instead of stepping up oversight after the alarm raised by interference in past elections and the post-election violence of January 2021, YouTube, Facebook, and X (formerly Twitter) have announced drastic cuts to their content moderation staffs.

This is partly the result of a right-wing campaign against the “censorship of conservatives” and an alleged war on freedom of expression and opinion, a justification Trump has also invoked in the trials over the insurrection. The campaign has been amplified by Elon Musk, who, under the banner of free speech, has turned Twitter into a platform catering to the alt-right. While Fox News has been forced to moderate its tone by costly defamation settlements, Trump, in a live interview with Tucker Carlson on X, was free to repeat false accusations of widespread election fraud by Biden.

The issue of social media remains unresolved because, as Frances Haugen explains in this Zoom interview with the foreign press in Los Angeles, the platforms’ business model is still built on data mining and algorithms, the engines of an “addictive capitalism” that relies on modifying user behavior in order to acquire and trade personal data.

An electrical engineer with an MBA from Harvard, Haugen launched a brilliant career straight out of school, starting at Google, where she contributed to a patent for indexing search results, then moving to Yelp, Pinterest, and finally Facebook, where in 2019 she joined the “civic integrity” team charged with limiting misinformation on the platform.

She ended up denouncing the program’s inadequacy, however, leaking thousands of internal documents to that effect, testifying before Congress, and writing a book, “The Power of One.” Haugen argues that the platforms’ top priority remains optimizing traffic and profits: they employ hundreds of people, she says, whose only job is to get you to spend more minutes on their app. “We need to stop considering these technologies as neutral or objective. That’s what they would like us to believe. Technology is never neutral; every technology has an ideology.”

Where do we stand with regulating the platforms?

It’s really hard to overstate how important the Digital Services Act is in the European Union. And what’s interesting is that other countries are taking up laws of that form. It’s looking like the Canadian government will take up an online safety law that is similar to the Digital Services Act this year.

The Digital Services Act is very different from what I think you’d hear if you stopped the average American on the street and said, “OK, we’re going to fix social media, what are we going to do?” Facebook has spent really huge sums of money trying to set up a narrative that the only way you can solve these problems is content moderation, like censorship. Most people, I think, if you stopped them on the street, would’ve said, “Hey, I don’t want us to pass a law. Because I believe in freedom of speech.”

The European Union came in from a very different angle and said, “Hey, the real problem here is actually power imbalances.” All the things that we’re complaining about are all downstream of the fact that we have this huge power imbalance where Facebook can see what’s going on and we can’t. So, at a minimum, they have to tell us if they know there’s a risk, they have to tell us how they’re going to try to reduce that risk, and they have to give us access to enough data that we know that they’re actually making progress on it.

In the case of the United States, I think the thing that’s changed is the Surgeon General came out and did an advisory on social media. That’s a really, really big deal. Just for context, there have been fewer than 15 Surgeon General advisories since the 1960s. And they’re all things that we would consider to be kind of major moments in public health history. It’s things like cigarettes cause cancer, seat belts save lives. Stuff that we kind of shrug our shoulders at and say, “Duh,” today. But those were all issues that had ambiguity around them before the Surgeon General came out and said, “Hey, this is the period at the end of the sentence.” Within two to three years of those past declarations, there’s usually some kind of big movement, either at the state level or at the federal level.

Meanwhile, the platforms themselves aren’t standing still, as seen with Meta’s launch of Threads to compete with Twitter.

When Mark Zuckerberg comes out and says, “We don’t want Threads to be about the news, we don’t want it to be about politics. We want it to be a happy place. We want to replace Twitter,” I think that should give us pause because political discussion is sometimes not pleasant. To give you an example of a product decision they’ve made on Threads, you can’t set Threads to just show you content from your friends. You have to just accept whatever Meta is going to give you. You have to put yourself in their hands and have the algorithm drive the experience.

One of the core things I try to explain in my book is why, once we step away from human-scale social media systems, where you see things because a human specifically said, “I want to put this in front of you,” and move to having a computer make those choices for you and focus your attention, you absolutely have to have transparency. Because now whoever controls the algorithm controls the conversation, and I wish they would acknowledge more that they are intentionally building an experience where they have even more power and control than they had before.

What’s your opinion of Mark Zuckerberg?

I have a huge amount of empathy for Mark Zuckerberg, because he never got to grow up. I really think Mark has been done a disservice by the fact that he has been the unilateral holder of control of Facebook since he was 19 years old. Even compared to Larry and Sergey at Google … they may have unilateral control together, but they still have to negotiate: what should we do? What does this mean? How should we run the company?

In Mark’s case, it is an entire system that is set up to make him happy. The people around him are the people who make him feel secure, who validate that he is a misunderstood genius who just can’t get a break.

I get flak sometimes for saying that I think there should be a “Free Mark” movement, like the Free Britney movement. … I like to joke that if I ever get over the PTSD I got from writing this book and foolishly decide to write another book, I want to dedicate it to Mark and say, “Mark, I fully believe in your capacity for greatness and I will not stop until you go pursue it.” Because he has functionally limitless money. He’s a smart guy, he could go solve malaria, and instead he launched Threads. What are we doing here?

What was the effect of TikTok on social media?

When TikTok came through, they showed there was another way to do it, which was if you had a really good algorithm, you didn’t actually need a social network, you didn’t need a social graph, you didn’t need people to know the people that they were getting stuff from. Because as long as you gave them high quality stuff that entertained them, people would come back for more. And so now it’s much easier to grow your system. …

Humans are really good at governing themselves at a certain scale. … We know how to govern the size of a conversation up to [a few tens of thousands of people]. If you do a million-person conversation, you have to use an algorithm. And the problem with algorithms is every single algorithm has a bias. … And unless you have mandated legal requirements around transparency, you are now at the mercy of whoever holds the algorithm. …

A big part of why I wrote this book is that we need, as a society, to understand that opaque systems and transparent systems are very, very different. And we have to update our laws to address the fact that that’s a power imbalance. And if we’re going to live in a world where AIs direct our attention and make choices for us, we as consumers should have the right to know what we’re consenting to.

So we’re not moving in the right direction?

In the book I try to talk about what values we want to see exhibited in social media. Because right now we have an agnostic framework when it comes to these things. We say they’re private companies; they’re allowed to do whatever they want to do. We’re not recognizing that these are now the primary places where many people get their socialization. … And yet we let these private companies, sometimes controlled by a single person, run these vital pieces of social infrastructure. … As we put more and more of our economy, more and more of our society, in the hands of computers to direct, what are the checks and balances that go alongside that? We will only get more economic concentration, more concentration of power, unless we say human beings matter. And we need to see technology acknowledge that and honor that.
