Former head of Facebook app Fidji Simo defends company following whistleblower testimony – TechCrunch


Fidji Simo, the former head of the Facebook app who reported directly to CEO Mark Zuckerberg, defended the social network at the start of an interview at the WSJ Tech Live event this afternoon. The exec was there to discuss her new role as Instacart CEO and her vision for the future of food delivery, but was asked to comment on the recent Facebook whistleblower’s testimony and the attention it has since raised.

Simo said she understood the scrutiny given Facebook’s impact on people’s lives. But she also worried that, given the complexity of the issues Facebook is grappling with as one of the world’s largest social networks, the company will never be able to do enough to appease its critics at this point.

“They are spending billions of dollars in keeping people safe. They are doing the most in-depth research of any company I know to understand their impact,” she argued, still very much on Facebook’s side, despite her recent departure. “And I think my worry is that people want ‘yes’ or ‘no’ answers to this question, but really these questions require a lot of nuance,” she added.

While the whistleblower, Frances Haugen, suggested that Facebook’s decision to prioritize user engagement through its algorithms was ultimately putting profits over people, Simo cautioned the choices weren’t quite as binary as they have been described to date. She explained that making changes based on the research Facebook had invested in wasn’t just a matter of turning a dial and “all of a sudden, magically problems disappear — because Facebook is fundamentally a reflection of humanity,” she said.

Image Credits: Instacart

Instead, Simo said that the real issue at Facebook is that every change the company makes can have significant societal implications at this point. It has to work to determine how it can improve upon the potentially problematic areas of its business without inadvertently affecting other things along the way.

“When we discuss trade-offs, it’s usually trade-offs between two types of societal impacts,” she noted.

As an example, Simo used what would seem like a fairly straightforward adjustment to make: determine which posts make Facebook users angry, then show people fewer of those posts.

As Haugen had testified, Facebook’s algorithms have been designed to reward engagement. That means posts with “likes” and other interactions spread more widely and are distributed higher up in people’s News Feeds. But she also said engagement doesn’t just come from likes and positive reactions. Engagement-based algorithms will ultimately prioritize clickbait and posts that make people angry. This, in turn, can help to boost the spread of posts eliciting stronger reactions, like misinformation or even toxic and violent content.
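The dynamic Haugen described can be sketched in a few lines. This is a hypothetical illustration, not Facebook’s actual ranking system: the scoring function, field names and sample numbers are all invented. The point it shows is that when every interaction counts equally toward a post’s score, a post that provokes angry reactions and heated comments outranks a calmly received one.

```python
# Hypothetical sketch of an engagement-based feed ranker (not Facebook's
# actual algorithm). Every reaction counts toward the score, so
# anger-driven engagement is rewarded just like positive engagement.

def engagement_score(post):
    """Sum all interactions; angry reactions count the same as likes."""
    return post["likes"] + post["angry"] + post["comments"] + post["shares"]

def rank_feed(posts):
    """Order posts by raw engagement, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm",    "likes": 50, "angry": 2,   "comments": 5,  "shares": 1},
    {"id": "outrage", "likes": 10, "angry": 120, "comments": 60, "shares": 40},
]

feed = rank_feed(posts)
# The outrage-bait post ranks first purely on reaction volume.
```

Simo’s point, discussed below, is that the obvious fix (down-weighting the `angry` term) is itself a value judgment with side effects, since anger also fuels legitimate activism.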

Simo, however, said it’s not as simple as it sounds to just dial down the anger across Facebook, as doing so would lead to another type of societal impact.

“You start digging in and you realize that the biggest societal movements were created out of anger,” she said. That led the company to question how it could make a change that could impact people’s activism.

(This isn’t quite how that situation unfolded, according to a report by the WSJ. Instead, when the algorithm was tweaked to prioritize personal posts over professionally produced content, publishers and political parties adjusted their posts toward outrage and sensationalism. And Zuckerberg resisted some of the proposed fixes to this problem, the report said.)

“That’s just a random example,” Simo said of the “anger” problem. “But literally, on every issue, there is always a trade-off that is another type of societal impact. And I can tell you for having been in these rooms for many, many years, it’s really never about like, ‘oh, are we doing the right thing for society, versus the right thing for Facebook and for profits’…the debate was really between some kinds of societal impact and another kind — which is a very hard debate to have as a private company.”

This, she added, was why Facebook wanted regulations.

“It’s not surprising that Facebook has been calling for regulation in this space for a very long time because they never want to be in a position of being the ones deciding which implications, which ramifications, which trade-offs they need to make between one type of societal impact and another type of societal impact. The governments are better positioned to do that,” she said.

Given the increasing amount of evidence coming out that Facebook itself understood, through its own internal research, that there were areas of its business that negatively impact society, Simo didn’t chalk up her departure from the social network to anything that was going on with Facebook itself.

Instead, she said she just wasn’t learning as much after 10 years with the company, and Instacart presented her with a great opportunity to learn “a different set of things.”


Facebook tests a new ‘Professional’ mode for creator profiles – TechCrunch


Meta (formerly Facebook) today is introducing a new “Professional” mode for user profiles, designed to be used by creators looking to monetize their followings on the social network. The new mode, which is initially available to select creators in the U.S., will present creators with additional money-making opportunities and expanded insights that were previously available only to Facebook Pages.

Among these will be the ability for creators to participate in the new Reels Play bonus program, where some creators are able to earn up to $35,000 per month based on the views for their short-form video content. However, access to this program, for the time being, is invite-only — meaning Meta will determine which creators qualify to earn bonuses.

While Meta didn’t share what other monetization options will be available in the days ahead, it did note that it will also make professional-level insights available to these creators, similar to what Page owners have access to. This includes access to post, audience and profile insights. For example, creators will now be able to see the total number of shares, reactions and comments on their posts, and view their follower growth over time. This allows them to make better, more informed decisions about the content they post and how it resonates with their audience.

Image Credits: Meta

While many creators are already using Facebook profiles instead of Pages to attract fans and followers, Meta warns that those who decide to opt into this new experience will be opening themselves up to being more of a public figure on the social network. That means anyone can follow them and see the public content posted to their feed, though they’ll still be able to mark posts as either public or friends-only, as they could on a private profile.

Meanwhile, creators who are using Facebook Pages will be opted into the new Pages experience instead. This will provide access to a Professional Dashboard that will serve as a central destination for admins to review the Page’s performance and access professional tools and insights, the company notes. Facebook is also testing a two-step composer on Pages, which allows creators to schedule posts and cross-post into a group.

The changes come at a time when Meta is heavily investing in its creator user base, as it sees the potential in a new revenue stream that comes from things like creator subscriptions and virtual tips, aka “Stars” — the latter of which it just made available yesterday outside the app stores through a new website where it no longer has to pay commissions to Apple and Google. The company earlier said it was planning to lure in creators with $1 billion in payments, like the Reels bonuses among other things, as the competition for creator talent heats up with TikTok and other top social apps, like YouTube, Twitter, Snapchat and others.

Meta notes that the new Professional mode is still in testing with select creators in the U.S. for now, but will roll out more broadly in the future, including to the EMEA region.


Reddit to roll out personalized end-of-year recaps with stats about users’ habits – TechCrunch


Reddit is launching a new personalized Spotify Wrapped-like recap feature for all users tomorrow. The new recaps will include a variety of stats, including a summary of the time you spent on the platform, a look at the content that you interacted with or contributed, topics you engaged with and communities you’ve viewed or joined. Reddit notes that users will be able to hide their username and avatar if they want when sharing the recap across other social media apps.

“In previous years, Reddit Recap focused on aggregated trends across the platform. This year we wanted to add a fun, personalized in-product experience to remind users of their contributions and belonging on the platform,” Reddit said in a statement. “Every Redditor has a unique role to play on Reddit, and so we referenced user browsing and engagement data from January 1st, 2021 to November 30th, 2021 to help shape the stories about how they fit in.”

End-of-year recaps have become increasingly popular thanks to Spotify’s annual Wrapped feature that is widely shared across social media each year. Given its success, it’s no surprise that other companies like Apple, YouTube, Snapchat and now Reddit are looking to mimic the popular feature with their own versions.

In addition to the launch of recaps, Reddit has released data about the most popular themes on the platform in 2021. The company notes that cryptocurrency, gaming, sports, weddings, health and fitness, food and drink, and movies and television were the most popular categories. In terms of cryptocurrency, the top five most-viewed crypto communities this year were r/dogecoin, r/superstonk, r/cryptocurrency, r/amcstock and r/bitcoin. So far this year, Reddit has seen 6.6 million mentions of “crypto” across its platform.

As for gaming, the top five most viewed communities in 2021 were r/genshinimpact, r/leagueoflegends, r/gaming, r/rpclipsgta and r/ffxiv. For the sports category, the top five communities were r/nba, r/soccer, r/nfl, r/squaredcircle and r/mma. In terms of the weddings category, the top five communities were r/weddingplanning, r/engagementrings, r/bridezillas, r/wedding and r/weddingsunder10k.

Regarding health and fitness, the top five communities were r/lifeprotips, r/sports, r/progresspics, r/fitness and r/loseit. As for the food and drink category, the top five were r/food, r/cooking, r/keto, r/kitchenconfidential and r/starbucks. Lastly, the top five communities in the movies and television category were r/movies, r/marvelstudios, r/starwars, r/moviedetails and r/dc_cinematic.

Reddit also revealed that users created 366 million posts in 2021, a 19% year-over-year increase. So far this year, the company has seen 2.3 billion total comments, a 12% year-over-year increase, and 46 billion total upvotes, a 1% year-over-year increase.


Instagram announces plans for parental controls and other safety features ahead of congressional hearing – TechCrunch


On Wednesday, Instagram head Adam Mosseri is set to testify before the Senate for the first time on the issue of how the app is impacting teens’ mental health, following the recent testimonies from Facebook whistleblower Frances Haugen, which have positioned the company as caring more about profits than user safety. Just ahead of that hearing, Instagram has announced a new set of safety features, including its first set of parental controls.

The changes were introduced through a company blog post, authored by Mosseri.

Not all the features are brand new, and some are smaller expansions on earlier safety features the company already had in the works.

However, the bigger news today is Instagram’s plan to launch its first set of parental control features in March. These features will allow parents and guardians to see how much time teens spend on Instagram and will allow them to set screen time limits. Teens will also be given an option to alert parents if they report someone. These tools are an opt-in experience — teens can choose not to send alerts, and there’s no requirement that teens and parents have to use parental controls.

The parental controls, as described, are also less powerful than those on rival TikTok, where parents can lock children’s accounts into a restricted experience, block access to search, and control their child’s visibility on the platform and who can view their content, comment or message them. Screen time limits, meanwhile, are already offered by the platforms themselves — that is, Apple’s iOS and Google’s Android mobile operating systems offer similar controls. In other words, Instagram isn’t doing much here in terms of innovative parental controls, but notes it will “add more options over time.”

Another new feature was previously announced. Instagram earlier this month launched a test of its new “Take a Break” feature, which allows users to remind themselves to take a break from using the app after either 10, 20 or 30 minutes, depending on their preference. This feature will now officially launch in the U.S., U.K., Ireland, Canada, Australia and New Zealand.

Image Credits: Instagram

Unlike on rival TikTok, where videos that push users to get off the app appear in the main feed after a certain amount of time, Instagram’s “Take a Break” feature is opt-in only. The company will begin to suggest to users that they set these reminders, but it will not require they do so. That gives Instagram the appearance of doing something to combat app addiction, without going so far as to actually make “Take a Break” enabled by default for its users, or like TikTok, regularly remind users to get off the app.

Another feature is an expansion of earlier efforts around distancing teens from contact with adults. Already, Instagram began to default teens’ accounts to private, and to restrict targeted advertising and unwanted adult contact — the latter by using technology to identify “potentially suspicious behavior” from adult users, then preventing them from being able to interact with teens’ accounts. It has also restricted other adult users from being able to contact teens who didn’t already follow them, and sends the teen notifications if the adult is engaging in suspicious behavior, while giving them tools for blocking and reporting.

Now it will expand this set of features to also switch off the ability for adults to tag or mention teens who don’t follow them, and to include their content in Reels Remixes (video content), or Guides. These will be the new default settings, and will roll out next year.

Image Credits: Instagram

Instagram says it will also be stricter about what’s recommended to teens in sections of the app like Search, Explore, Hashtags and Suggested Accounts.

But in describing the action it’s taking, the company seems to have not yet made a hard decision on what will be changed. Instead, Instagram says it’s “exploring” the idea of limiting content in Explore, using a newer set of sensitive content control features launched in July. The company says it’s considering expanding “Limit Even More,” the strictest setting, to include not just Explore, but also Search, Hashtags, Reels and Suggested Accounts.

Image Credits: Instagram

It also says if it sees people are dwelling on a topic for a while it may nudge them toward other topics, but doesn’t share details on this feature, as it’s under development. Presumably, this is meant to address the issues raised about teens who are exploring potentially harmful content, like content that could trigger eating disorders, anxiety or depression. In practice, the feature could also be used to direct users to more profitable content for the app — like posts from influencers who drive traffic to monetizable products, like Instagram Shopping, LIVE videos, Reels and others.

Image Credits: Instagram

Instagram will also roll out tools this January that allow users to bulk delete photos and videos from their account to clean up their digital footprint. The feature will be offered as part of a new hub where users can view and manage their activity on the app.

Image Credits: Instagram

This addition is being positioned as a safety feature: as users get older, they may better understand what it means to share personal content online, and they may have regrets over their older posts. However, a bulk deletion option is really the sort of feature that any content management system (that’s behaving ethically) should offer its users — meaning not just Instagram, but also Facebook, Twitter and other social networks.

The company said these are only some of the features it has in development and noted it’s still working on its new solution to verify people’s ages on Instagram using technology.

“As always, I’m grateful to the experts and researchers who lend us their expertise in critical areas like child development, teen mental health and online safety,” Mosseri wrote, “and I continue to welcome productive collaboration with lawmakers and policymakers on our shared goal of creating an online world that both benefits and protects many generations to come.”

In response to Meta’s announcement, Sen. Marsha Blackburn (R-TN) issued the following statement:

Meta is attempting to shift attention from their mistakes by rolling out parental guides, use timers, and content control features that consumers should have had all along. This is a hollow “product announcement” in the dead of night that will do little to substantively make their products safer for kids and teens. But my colleagues and I see right through what they are doing. We know that Meta and their Silicon Valley allies will continue pushing the envelope out of selfishness and greed until they can no longer do so.
