WhatsApp Moves Wallpaper Option, Adds New Skin Tones for Select Emojis in Latest Beta for Android

WhatsApp is receiving a new update for its beta build that bumps up the version number to 2.19.366. Along with various tweaks, this update mainly fixes a serious bug that caused the previous version of WhatsApp beta for Android, 2.19.335, to crash for many users. It has long been rumoured that dark mode will soon make its way to WhatsApp, but the feature is still nowhere to be found in this update.

The new update for WhatsApp beta is live on the Google Play Store, and if you wish to become a tester, head over to this link to download and install the latest beta version of WhatsApp on your device.

As mentioned above, the new update for WhatsApp beta bumps up the version number to 2.19.366 and fixes an issue that caused WhatsApp to crash for many people. As for the visual changes, a report by WhatsApp features tracker WABetaInfo mentions that the Wallpaper option, previously found under chat settings, has now been moved to a separate Display section. Additionally, WhatsApp has added new skin tones for six different emojis: ‘woman in manual wheelchair’, ‘man in manual wheelchair’, ‘woman in motorized wheelchair’, ‘man in motorized wheelchair’, ‘woman with probing cane’, and ‘man with probing cane’.

So, if you want to test the new version for yourself, just head over to the Google Play Store and download the latest WhatsApp beta update on your Android smartphone. If you later want to opt out of the beta, head over to the WhatsApp beta listing on the Play Store and tap Leave the program, then uninstall the beta app from your phone and install the public version from the Play Store.

Facebook rolls out new tools for Group admins, including automated moderation aids – TechCrunch

Facebook today introduced a new set of tools aimed at helping Facebook Group administrators get a better handle on their online communities and, potentially, help keep conversations from going off the rails. Among the more interesting new tools is a machine learning-powered feature that alerts admins to potentially unhealthy conversations taking place in their group. Another lets the admin slow down the pace of a heated conversation, by limiting how often group members can post.

Facebook Groups are today a significant reason why people continue to use the social network. There are now “tens of millions” of groups, managed by over 70 million active admins and moderators worldwide, Facebook says.

The company for years has been working to roll out better tools for these group owners, who often get overwhelmed by the administrative responsibilities that come with running an online community at scale. As a result, many admins give up the job and leave groups to run somewhat unmanaged — thus allowing them to turn into breeding grounds for misinformation, spam and abuse.

Facebook last fall tried to address this problem by rolling out new group policies to crack down on groups without an active admin, among other things. Of course, the company’s preference would be to keep groups running and growing by making them easier to operate.

That’s where today’s new set of features comes in.

A new dashboard called Admin Home will centralize admin tools, settings and features in one place, as well as present “pro tips” that suggest other helpful tools tailored to the group’s needs.

Another new feature, Admin Assist, will let admins automatically moderate comments in their groups by setting up criteria that restrict comments and posts proactively, rather than forcing admins to go back and delete them after the fact, which can be problematic, especially once a discussion is underway and members are invested in the conversation.

For example, admins can now restrict people from posting if they haven’t had a Facebook account for very long or if they have recently violated the group’s rules. Admins can also automatically decline posts that contain specific promotional content (perhaps MLM links! Hooray!) and automatically share feedback with the post’s author about why such posts aren’t allowed.

Admins can also take advantage of suggested preset criteria from Facebook to help with limiting spam and managing conflict.
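
To make the idea concrete, here is a minimal sketch of how such auto-moderation criteria might be expressed in code. This is only an illustration under assumed thresholds and field names; it is not Facebook's actual Admin Assist API, and the blocked phrases, account-age cutoff, and violation cooldown are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical data model; none of these names come from Facebook's tools.
@dataclass
class Member:
    account_created: datetime
    last_rule_violation: Optional[datetime]

@dataclass
class Post:
    author: Member
    text: str

MIN_ACCOUNT_AGE = timedelta(days=30)       # assumed threshold
VIOLATION_COOLDOWN = timedelta(days=14)    # assumed threshold
BLOCKED_PHRASES = ["join my team", "dm me to earn"]  # assumed promo patterns

def should_decline(post: Post, now: datetime) -> Optional[str]:
    """Return feedback to send the author if the post should be auto-declined, else None."""
    if now - post.author.account_created < MIN_ACCOUNT_AGE:
        return "New accounts cannot post in this group yet."
    if (post.author.last_rule_violation is not None
            and now - post.author.last_rule_violation < VIOLATION_COOLDOWN):
        return "Posting is paused after a recent rule violation."
    if any(phrase in post.text.lower() for phrase in BLOCKED_PHRASES):
        return "Promotional content is not allowed in this group."
    return None
```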

One notable update is a new moderation alert type dubbed “conflict alerts.” This feature, currently in testing, will notify admins when a potentially contentious or unhealthy conversation is taking place in the group, Facebook says. This would allow an admin to quickly take an action — like turning off comments, limiting who could comment, removing a post, or however else they would want to approach the situation.

Conflict alerts are powered by machine learning, Facebook explains. Its machine learning model looks at multiple signals, including reply time and comment volume, to determine whether engagement between users has led, or might lead, to negative interactions, the company says.

This is sort of like an automated expansion on the Keyword Alerts feature many admins already use to look for certain topics that lead to contentious conversations.
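
Facebook has not published how its model works, but a toy heuristic built on the same two signals (reply time and comment volume) might look like the sketch below. The scoring and the alert threshold are illustrative assumptions, not Facebook's implementation.

```python
from datetime import datetime

def conflict_score(comment_times: list[datetime]) -> float:
    """Toy heuristic: fast replies plus high comment volume suggest a heated thread.

    This is an illustrative stand-in for the kind of signals described above,
    not Facebook's actual machine learning model.
    """
    if len(comment_times) < 2:
        return 0.0
    gaps = [(b - a).total_seconds() for a, b in zip(comment_times, comment_times[1:])]
    avg_gap_seconds = sum(gaps) / len(gaps)
    volume = len(comment_times)
    # Shorter average gaps and more comments both push the score up.
    return volume / max(avg_gap_seconds, 1.0)

# A hypothetical alerting layer might notify admins above an assumed threshold:
# if conflict_score(times) > 0.5: notify_admins(thread)
```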

A related new feature will allow admins to limit how often specific members can comment, or how often comments can be added to the posts admins select.

When enabled, members can leave 1 comment every 5 minutes. The idea here is that forcing users to pause and consider their words amid a heated debate could lead to more civilized conversations. We’ve seen this concept enacted on other social networks as well, such as Twitter’s nudges to read articles before retweeting, or those that flag potentially harmful replies and give you a chance to revise your post.
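
A minimal sketch of how such a slow-mode limit could be enforced per member is shown below; the five-minute interval comes from the announcement, while the class and method names are hypothetical.

```python
from datetime import datetime, timedelta

SLOW_MODE_INTERVAL = timedelta(minutes=5)  # one comment every 5 minutes when enabled

class SlowMode:
    """Illustrative per-member rate limiter for the slow-mode idea (not Facebook's code)."""

    def __init__(self) -> None:
        self._last_comment: dict[str, datetime] = {}

    def can_comment(self, member_id: str, now: datetime) -> bool:
        """Return True and record the timestamp if the member may comment now."""
        last = self._last_comment.get(member_id)
        if last is not None and now - last < SLOW_MODE_INTERVAL:
            return False
        self._last_comment[member_id] = now
        return True
```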

Facebook, however, has largely embraced engagement on its platform, even when it’s not leading to positive interactions or experiences. Though small, this particular feature is an admission that building a healthy online community sometimes means people shouldn’t be able to immediately react and comment with whatever thought first pops into their head.

Additionally, Facebook is testing tools that allow admins to temporarily limit activity from certain group members.

If used, admins will be able to decide how many posts (between 1 and 9) a given member may share per day, and how long that limit should stay in effect (12 hours, 24 hours, 3 days, 7 days, 14 days, or 28 days). Admins will also be able to decide how many comments (between 1 and 30, in increments of 5) a given member may leave per hour, with the same duration options.
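
As an illustration of how those settings could be modeled and validated, here is a small sketch; the class name and validation helper are assumptions, while the numeric ranges and durations come from the description above.

```python
from dataclasses import dataclass

# Durations listed above, expressed in hours.
ALLOWED_DURATIONS_HOURS = (12, 24, 3 * 24, 7 * 24, 14 * 24, 28 * 24)

@dataclass
class MemberActivityLimit:
    """Hypothetical model of the per-member limits described above (not Facebook's API)."""
    posts_per_day: int       # between 1 and 9
    comments_per_hour: int   # between 1 and 30
    duration_hours: int      # one of ALLOWED_DURATIONS_HOURS

    def validate(self) -> None:
        if not 1 <= self.posts_per_day <= 9:
            raise ValueError("posts_per_day must be between 1 and 9")
        if not 1 <= self.comments_per_hour <= 30:
            raise ValueError("comments_per_hour must be between 1 and 30")
        if self.duration_hours not in ALLOWED_DURATIONS_HOURS:
            raise ValueError("duration_hours must be one of the listed durations")

# Example: limit a member to 3 posts per day for 7 days.
limit = MemberActivityLimit(posts_per_day=3, comments_per_hour=10, duration_hours=7 * 24)
limit.validate()
```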

Along these same lines of building healthier communities, a new member summary feature will give admins an overview of each member’s activity in their group, letting them see how many times a member has posted and commented, had posts removed, or been muted.

Facebook doesn’t say how admins are meant to use this new tool, but one could imagine admins taking advantage of the detailed summary to do the occasional cleanup of their member base by removing bad actors who continually disrupt discussions. They could also use it to locate regular contributors without violations and elevate them to moderator roles, perhaps.

Admins will also be able to tag their group rules in comment sections, disallow certain post types (e.g. Polls or Events), and submit an appeal to Facebook to re-review decisions related to group violations, if in error.

Of particular interest, though a bit buried amid the slew of other news, is the return of Chats, which was previously announced.

Facebook had abruptly removed Chat functionality back in 2019, possibly due to spam, some had speculated. (Facebook said it was for product infrastructure reasons.) As before, Chats can have up to 250 people, including active members and those who opted into notifications from the chats. Once this limit is reached, other members will not be able to engage with that specific chat room until existing active participants either leave the chat or opt out of notifications.

Now, Facebook group members can start, find and engage in Chats with others within Facebook Groups instead of using Messenger. Admins and moderators can also have their own chats.

Notably, this change follows on the heels of the growth of messaging-based social networks like IRL, a new unicorn (thanks to its $1.17B valuation), as well as the growth seen by other messaging apps like Telegram and Signal, and by other alternative social networks.

Along with this large set of new features, Facebook also made changes to some existing features, based on feedback from admins.

It’s now testing pinned comments and has introduced a new “admin announcement” post type that notifies group members of important news (if they receive notifications for that group).

Plus, admins will be able to share feedback when they decline group members.

The changes are rolling out across Facebook Groups globally in the coming weeks.

Spotify launches its live audio app and Clubhouse rival, Spotify Greenroom – TechCrunch

In March, Spotify announced it was acquiring the company behind the sports-focused audio app Locker Room to help speed its entry into the live audio market. Today, the company is making good on that deal with the launch of Spotify Greenroom, a new mobile app that allows Spotify users worldwide to join or host live audio rooms, and optionally turn those conversations into podcasts. It’s also announcing a Creator Fund which will help to fuel the new app with more content in the future.

The Spotify Greenroom app itself is based on Locker Room’s existing code. In fact, Spotify tells us, current Locker Room users will see their app update to become the rebranded and redesigned Greenroom experience, starting today.

Where Locker Room had used a white-and-reddish orange color scheme, the new Greenroom app looks very much like an offshoot from Spotify, having adopted the same color palette, font and iconography.

To join the new app, Spotify users will sign in with their current Spotify account information. They’ll then be walked through an onboarding experience designed to connect them with their interests.

For the time being, the process of finding audio programs to listen to relies primarily on users joining groups inside the app. That’s much like how Locker Room had operated, where its users would find and follow favorite sports teams. However, Greenroom’s groups are more general interest now, as it’s no longer only tied to sports.

In time, Spotify tells us, the plan is for Greenroom to leverage Spotify’s personalization technology to better connect users with content they would want to hear. For example, it could notify you when a podcaster you already follow on Spotify goes live on Spotify Greenroom. Or it could leverage its understanding of the podcasts and music you listen to in order to make targeted recommendations. These are longer-term plans, however.

As for Spotify Greenroom’s feature set, it’s largely on par with other live audio offerings — including those from Clubhouse, Twitter (Spaces) and Facebook (Live Audio Rooms). Speakers in the room appear at the top of the screen as rounded profile icons, while listeners appear below as smaller icons. There are mute options, moderation controls, and the ability to bring listeners on stage during the live audio session. Rooms can host up to 1,000 people, and Spotify expects to scale that number up later on.

Listeners can also virtually applaud speakers by giving them “gems” in the app — a feature that came over from Locker Room, too. The number of gems a speaker earned displays next to their profile image during a session. For now, there’s no monetary value associated with the gems, but that seems an obvious next step as Greenroom today offers no form of monetization.

It’s worth noting there are a few key differentiators between Spotify Greenroom and similar live audio apps. For starters, it offers a live text chat feature that the host can turn on or off whenever they choose. Hosts can also request the audio file of their live audio session after it wraps, which they can then edit to turn into a podcast episode.

Perhaps most important is that the live audio sessions are being recorded by Spotify itself. The company says this is for moderation purposes. If a user reports something in a Greenroom audio room, Spotify can go back and look into the matter to determine what actions may need to be taken. This is an area Clubhouse has struggled with, as its users have sometimes encountered toxicity and abuse in the app in real time, including racism and misogyny. Recently, Clubhouse said it had to shut down a number of rooms for antisemitism and hate speech, as well.

Spotify says the moderation of Spotify Greenroom will be handled by its existing content moderation team. Of course, how quickly Spotify will be able to react to boot users or shut down live audio rooms that violate its Code of Conduct remains to be seen.

While the app launching today is focused on user-generated live audio content, Spotify has larger plans for Greenroom. Later this summer, the company plans to make announcements around programmed content, something it says is a huge priority, alongside the launch of other new features. This will include programming related to music, culture, and entertainment, in addition to the sports content Locker Room was known for.

The company also says it will be marketing Spotify Greenroom to artists through its Spotify for Artists channels, in hopes of seeding the app with more music-focused content. And it confirmed that monetization options for creators will come further down the road, too, but isn’t talking about what those may look like in specific detail for the moment.

In addition, Spotify is today announcing the Spotify Creator Fund, which will help audio creators in the U.S. generate revenue for their work. The company, however, declined to share any details on this front, such as the size of the fund, how much creators would receive, the time frame for distributions, or the selection criteria. Instead, it’s only offering a sign-up form for those who may be interested in hearing more about this opportunity in the future. That may make it difficult for creators to weigh their options, when there are now so many.

Spotify Greenroom is live today on both iOS and Android across 135 markets around the world. That’s not quite the global footprint of Spotify itself, which is available in 178 markets. It’s also only available in English for the time being, though Spotify plans to expand language support as the app grows.

Biden admin will share more info with online platforms on ‘front lines’ of domestic terror fight – TechCrunch

The Biden administration is outlining new plans to combat domestic terrorism in light of the January 6 attack on the U.S. Capitol, and social media companies have their own part to play.

The White House released a new national strategy on countering domestic terrorism Tuesday. The plan acknowledges the key role that online platforms play in bringing violent ideas into the mainstream, going as far as calling social media sites the “front lines” of the war on domestic terrorism.

“The widespread availability of domestic terrorist recruitment material online is a national security threat whose front lines are overwhelmingly private-sector online platforms, and we are committed to informing more effectively the escalating efforts by those platforms to secure those front lines,” the White House plan states.

The Biden administration committed to more information sharing with the tech sector to fight the tide of online extremism, part of a push to intervene well before extremists can organize violence. According to a fact sheet on the new domestic terror plan, the U.S. government will prioritize “increased information sharing with the technology sector,” specifically online platforms where extremism is incubated and organized.

“Continuing to enhance the domestic terrorism-related information offered to the private sector, especially the technology sector, will facilitate more robust efforts outside the government to counter terrorists’ abuse of Internet-based communications platforms to recruit others to engage in violence,” the White House plan states.

In remarks timed with the release of the domestic terror strategy, Attorney General Merrick Garland asserted that coordinating with the tech sector is “particularly important” for interrupting extremists who organize and recruit on online platforms and emphasized plans to share enhanced information on potential domestic terror threats.

In spite of the new initiatives, the Biden administration admits that domestic terrorism recruitment material will inevitably remain available online, particularly on platforms that don’t prioritize its removal (as was the case for most social media platforms prior to January 2021) and on end-to-end encrypted apps, many of which saw an influx of users when social media companies cracked down on extremism in the U.S. earlier this year.

“Dealing with the supply is therefore necessary but not sufficient: we must address the demand too,” the White House plan states. “Today’s digital age requires an American population that can utilize essential aspects of Internet-based communications platforms while avoiding vulnerability to domestic terrorist recruitment and other harmful content.”

The Biden administration will also address vulnerability to online extremism through digital literacy programs, including “educational materials” and “skills-enhancing online games” designed to inoculate Americans against domestic extremism recruitment efforts, and presumably against disinformation and misinformation more broadly.

The plan stops short of naming domestic terror elements like QAnon and the “Stop the Steal” movement specifically, though it acknowledges the range of ways domestic terror can manifest, from small informal groups to organized militias.

A report from the Office of the Director of National Intelligence in March observed the elevated threat to the U.S. that domestic terrorism poses in 2021, noting that domestic extremists leverage mainstream social media sites to recruit new members, organize in-person events and share materials that can lead to violence.
