Fidji Simo, the former head of the Facebook app who reported directly to CEO Mark Zuckerberg, defended the social network at the start of an interview at the WSJ Tech Live event this afternoon. The exec was there to discuss her new role as Instacart CEO and her vision for the future of food delivery, but was asked to comment on the recent Facebook whistleblower’s testimony and the scrutiny it has since drawn.
Simo said she understood the scrutiny, given Facebook’s impact on people’s lives. But she also worried that, at this point, Facebook will never be able to do enough to appease its critics, given the complexity of the issues it is grappling with as one of the world’s largest social networks.
“They are spending billions of dollars in keeping people safe. They are doing the most in-depth research of any company I know to understand their impact,” she argued, still very much on Facebook’s side, despite her recent departure. “And I think my worry is that people want ‘yes’ or ‘no’ answers to this question, but really these questions require a lot of nuance,” she added.
While the whistleblower, Frances Haugen, suggested that Facebook’s decision to prioritize user engagement through its algorithms ultimately put profits over people, Simo cautioned that the choices weren’t quite as binary as they have been described to date. She explained that making changes based on the research Facebook had invested in wasn’t just a matter of turning a dial and “all of a sudden, magically problems disappear — because Facebook is fundamentally a reflection of humanity,” she said.
Instead, Simo said the real issue at Facebook is that every change the company makes can now have significant societal implications. It has to work out how to improve the potentially problematic areas of its business without inadvertently affecting other things along the way.
“When we discuss trade-offs, it’s usually trade-offs between two types of societal impacts,” she noted.
As an example, Simo pointed to what would seem like a fairly straightforward adjustment: determine which posts make Facebook users angry, then show people fewer of those.
As Haugen testified, Facebook’s algorithms have been designed to reward engagement. That means posts with “likes” and other interactions spread more widely and are ranked higher in people’s News Feeds. But, she also said, engagement doesn’t just come from likes and positive reactions. Engagement-based algorithms will ultimately prioritize clickbait and posts that make people angry. This, in turn, can boost the spread of posts eliciting stronger reactions, like misinformation or even toxic and violent content.
Simo, however, said it’s not as simple as it sounds to just dial down the anger across Facebook, as doing so would lead to another type of societal impact.
“You start digging in and you realize that the biggest societal movements were created out of anger,” she said. That led the company to question how it could make a change that could impact people’s activism.
(This isn’t quite how that situation unfolded, according to a report by the WSJ. Instead, when the algorithm was tweaked to prioritize personal posts over professionally produced content, publishers and political parties adjusted their posts toward outrage and sensationalism. And Zuckerberg resisted some of the proposed fixes to this problem, the report said.)
“That’s just a random example,” Simo said of the “anger” problem. “But literally, on every issue, there is always a trade-off that is another type of societal impact. And I can tell you for having been in these rooms for many, many years, it’s really never about like, ‘oh, are we doing the right thing for society, versus the right thing for Facebook and for profits’…the debate was really between some kinds of societal impact and another kind — which is a very hard debate to have as a private company.”
This, she added, was why Facebook wanted regulations.
“It’s not surprising that Facebook has been calling for regulation in this space for a very long time because they never want to be in a position of being the ones deciding which implications, which ramifications, which trade-offs they need to make between one type of societal impact and another type of societal impact. The governments are better positioned to do that,” she said.
Despite the mounting evidence that Facebook understood, through its own internal research, that parts of its business negatively impact society, Simo didn’t chalk up her departure from the social network to anything going on at Facebook itself.
Instead, she said she simply wasn’t learning as much after 10 years with the company, and that Instacart presented a great opportunity to learn “a different set of things.”