Frances Haugen compares Facebook’s practices to those of Google and Twitter

Deborah Yao, Editor

March 15, 2022



Frances Haugen made waves in 2021 when she stepped forward to reveal the extent to which Facebook allegedly prioritized profits over materially curbing misinformation on its platform. The former Facebook product manager said this torrent of fake news has real, harmful consequences to people and society.

She laid the blame at the feet of founder and CEO Mark Zuckerberg: “He knows he has tools he can use today to stop misinformation.”

But using those tools could mean reducing the amount of content and user engagement on the platform, which translates to lower profits that would disappoint shareholders and Wall Street. “The system today is more profitable,” she said at the SXSW 2022 conference in Austin, Texas.

Asked if AI will solve this problem, she said “at some point, AI will be able to,” but the issue is teaching computers about nuance. “It’s hard for us to teach nuance to computers,” she said, such as how to discern what is hate speech and what content could inflame people.

Haugen said she used to be skeptical about decentralization – the core feature of Web 3.0, which envisions myriad centers of control instead of a few dominant mega-platforms – and about DAOs (decentralized autonomous organizations) as a possible solution.

Now, she thinks “this is doable” if social media carried only content from family and friends.

Facebook vs. Google and Twitter

Haugen, who used to work at Google, said the tech giant does not have Facebook’s problem of misinformation because it is more open about its data and activities.

For example, users can download Google search results to discover what is included and what is excluded. The company has full-time engineers on search who write blog posts about how it works.

“Facebook is different. Facebook is a closed system,” she said. “We can’t easily download all the results. That means a lot of personalization. … We don’t know if the experience we are having on Facebook is the same as everyone else’s. Facebook uses this to its advantage.”

Twitter, which has also been accused of being a platform for inflammatory content, at least has a tool that reduces the amount of misinformation on its platform, Haugen asserted.

“They put a human in the loop. If you want to share a link on Twitter, you have to click on it,” she said. “That little piece of friction lowers misinformation by 10% to 15%.”

Haugen mentioned another solution Facebook could easily deploy: “If you require someone to cut and paste instead of click to share, it’s like having a third-party fact checker.”

But these changes could lessen content and user engagement, which would hurt Facebook’s metrics, such as monthly or daily active users, when it is time to report quarterly earnings.

Haugen made a point of saying that employees at Facebook are not “bad.” Rather, “it’s about product choices that give the most reach to the most extreme ideas.”

About the Author

Deborah Yao

Editor

Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford grad who has worked at Amazon, the Wharton School and the Associated Press.
