
Facebook Spotted Testing Feature That Lets Users Ban Comments With Chosen Words, Phrases

Facebook is reportedly testing a new feature that will help users combat abuse, bullying, and harassment on the platform. Online harassment has long plagued most social media websites, and Facebook, for its part, has taken a number of steps to fight it; the upcoming feature appears to be the latest addition. According to a developer, Facebook users will soon be able to block comments containing particular words, phrases, or emoji from appearing on their personal timelines.

According to a post by developer Jane Wong on Twitter, the new Facebook feature will allow you to select a particular word, a group of words, or even emoji that you do not want to appear in your timeline. Wong also shared a screenshot of the feature, which is currently under development. Per the image, when you block certain words, Facebook will notify you that the users who post the comments, as well as their friends, will still be able to see them.
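To make the described behaviour concrete, the sketch below illustrates keyword-based filtering in which a matching comment is hidden only from the person who blocked the term, while the commenter and their friends still see it. This is a hypothetical illustration; the names, data structures, and logic are assumptions for explanation and do not reflect Facebook's actual implementation.

```python
# Hypothetical sketch of keyword-based comment filtering as described above.
# These names do not reflect Facebook's real code; they only illustrate the idea.

BLOCKED_TERMS = {"spam", "scam", "🙄"}  # words, phrases, or emoji chosen by the user


def contains_blocked_term(comment_text: str, blocked_terms: set[str]) -> bool:
    """Return True if the comment contains any blocked word, phrase, or emoji."""
    lowered = comment_text.lower()
    return any(term.lower() in lowered for term in blocked_terms)


def is_comment_visible(comment_text: str, viewer: str, timeline_owner: str) -> bool:
    """Hide a comment containing a blocked term only from the timeline owner's view."""
    if viewer != timeline_owner:
        # Per the screenshot Wong shared, the commenter and their friends still see it.
        return True
    return not contains_blocked_term(comment_text, BLOCKED_TERMS)


# Example: the timeline owner does not see the comment, but another viewer does.
print(is_comment_visible("this looks like a scam", viewer="owner", timeline_owner="owner"))  # False
print(is_comment_visible("this looks like a scam", viewer="bob", timeline_owner="owner"))    # True
```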

The new comment filtering feature is quite similar to the one that Facebook-owned Instagram already has. Instagram has been allowing users to filter comments, using default keywords and custom keywords, for a long time now. Similarly, Twitter also has a feature called Muted Words that lets users ban particular words from their timelines.

Facebook may let users pick the words that they want to filter out, or it may suggest certain words or phrases itself. As of now, it is not clear whether the feature will be available for the news feed as well. It is worth noting that Facebook is still testing the feature and has not rolled it out yet, and the company has not shared any official information about it.




Facebook ordered not to apply controversial WhatsApp T&Cs in Germany – TechCrunch


The Hamburg data protection agency has banned Facebook from processing the additional WhatsApp user data that the tech giant is granting itself access to under a mandatory update to WhatsApp’s terms of service.

The controversial WhatsApp privacy policy update has caused widespread confusion around the world since being announced — and has already been delayed by Facebook for several months after a major user backlash saw rival messaging apps benefiting from an influx of angry users.

The Indian government has also sought to block the changes to WhatsApp’s T&Cs in court — and the country’s antitrust authority is investigating.

Globally, WhatsApp users have until May 15 to accept the new terms (after which the requirement to accept the T&Cs update will become persistent, per a WhatsApp FAQ).

The majority of users who have had the terms pushed on them have already accepted them, according to Facebook, although it hasn’t disclosed what proportion of users that is.

But the intervention by Hamburg’s DPA could further delay Facebook’s rollout of the T&Cs — at least in Germany — as the agency has used an urgency procedure, allowed for under the European Union’s General Data Protection Regulation (GDPR), to order the tech giant not to share the data for three months.

A WhatsApp spokesperson disputed the legal validity of Hamburg’s order — calling it “a fundamental misunderstanding of the purpose and effect of WhatsApp’s update” and arguing that it “therefore has no legitimate basis”.

“Our recent update explains the options people have to message a business on WhatsApp and provides further transparency about how we collect and use data. As the Hamburg DPA’s claims are wrong, the order will not impact the continued roll-out of the update. We remain fully committed to delivering secure and private communications for everyone,” the spokesperson added, suggesting that Facebook-owned WhatsApp may be intending to ignore the order.

We understand that Facebook is considering its options to appeal Hamburg’s procedure.

The emergency powers Hamburg is using can’t extend beyond three months but the agency is also applying pressure to the European Data Protection Board (EDPB) to step in and make what it calls “a binding decision” for the 27 Member State bloc.

We’ve reached out to the EDPB to ask what action, if any, it could take in response to the Hamburg DPA’s call.

The body is not usually involved in making binding GDPR decisions related to specific complaints — unless EU DPAs cannot agree over a draft GDPR decision brought to them for review by a lead supervisory authority under the one-stop-shop mechanism for handling cross-border cases.

In such a scenario the EDPB can cast a deciding vote — but it’s not clear that an urgency procedure would qualify.

In taking the emergency action, the German DPA is not only attacking Facebook for continuing to thumb its nose at EU data protection rules, but throwing shade at its lead data supervisor in the region, Ireland’s Data Protection Commission (DPC) — accusing the latter of failing to investigate the very widespread concerns attached to the incoming WhatsApp T&Cs.

(“Our request to the lead supervisory authority for an investigation into the actual practice of data sharing was not honoured so far,” is the polite framing of this shade in Hamburg’s press release).

We’ve reached out to the DPC for a response and will update this report if we get one.

Ireland’s data watchdog is no stranger to criticism that it indulges in creative regulatory inaction when it comes to enforcing the GDPR — with critics accusing commissioner Helen Dixon and her team of failing to investigate scores of complaints and, in the instances where probes have been opened, taking years to complete them — and opting for weak enforcement in the end.

The only GDPR decision the DPC has issued to date against a tech giant (against Twitter, in relation to a data breach) was disputed by other EU DPAs — which wanted a far tougher penalty than the $550k fine eventually handed down by Ireland.

GDPR investigations into Facebook and WhatsApp remain on the DPC’s desk. A draft decision in one WhatsApp data-sharing transparency case was sent to other EU DPAs in January for review, but a resolution has yet to see the light of day almost three years after the regulation began being applied.

In short, frustrations about the lack of GDPR enforcement against the biggest tech giants are riding high among other EU DPAs — some of whom are now resorting to creative regulatory actions to try to sidestep the bottleneck created by the one-stop-shop (OSS) mechanism which funnels so many complaints through Ireland.

The Italian DPA also issued a warning over the WhatsApp T&Cs change, back in January — saying it had contacted the EDPB to raise concerns about a lack of clear information over what’s changing.

At that point the EDPB emphasized that its role is to promote cooperation between supervisory authorities. It added that it will continue to facilitate exchanges between DPAs “in order to ensure a consistent application of data protection law across the EU in accordance with its mandate”. But the always fragile consensus between EU DPAs is becoming increasingly fraught over enforcement bottlenecks and the perception that the regulation is failing to be upheld because of OSS forum shopping.

That will increase pressure on the EDPB to find some way to resolve the impasse and avoid a wider breakdown of the regulation — i.e. if more and more Member State agencies resort to unilateral ‘emergency’ action.

The Hamburg DPA writes that the update to WhatsApp’s terms grants the messaging platform “far-reaching powers to share data with Facebook” for the company’s own purposes (including for advertising and marketing) — such as by passing WhatsApp users’ location data to Facebook and allowing for the communication data of WhatsApp users to be transferred to third parties if businesses make use of Facebook’s hosting services.

Its assessment is that Facebook cannot rely on legitimate interests as a legal base for the expanded data sharing under EU law.

And if the tech giant is intending to rely on user consent it’s not meeting the bar either because the changes are not clearly explained nor are users offered a free choice to consent or not (which is the required standard under GDPR).

“The investigation of the new provisions has shown that they aim to further expand the close connection between the two companies in order for Facebook to be able to use the data of WhatsApp users for their own purposes at any time,” Hamburg goes on. “For the areas of product improvement and advertising, WhatsApp reserves the right to pass on data to Facebook companies without requiring any further consent from data subjects. In other areas, use for the company’s own purposes in accordance with the privacy policy can already be assumed at present.

“The privacy policy submitted by WhatsApp and the FAQ describe, for example, that WhatsApp users’ data, such as phone numbers and device identifiers, are already being exchanged between the companies for joint purposes such as network security and to prevent spam from being sent.”

DPAs like Hamburg may be feeling buoyed to take matters into their own hands on GDPR enforcement by a recent opinion by an advisor to the EU’s top court, as we suggested in our coverage at the time. Advocate General Bobek took the view that EU law allows agencies to bring their own proceedings in certain situations, including in order to adopt “urgent measures” or to intervene “following the lead data protection authority having decided not to handle a case.”

The CJEU ruling on that case is still pending — but the court tends to align with the position of its advisors.

 


Facebook is testing pop-up messages telling people to read a link before they share it – TechCrunch


Years after popping open a Pandora’s box of bad behavior, social media companies are trying to figure out subtle ways to reshape how people use their platforms.

Following Twitter’s lead, Facebook is trying out a new feature designed to encourage users to read a link before sharing it. The test will reach 6 percent of Facebook’s Android users globally in a gradual rollout that aims to encourage “informed sharing” of news stories on the platform.

Users can still easily click through to share a given story, but the idea is that by adding friction to the experience, people might rethink their original impulses to share the kind of inflammatory content that currently dominates on the platform.
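As a rough illustration of this kind of friction, the sketch below prompts a user who tries to share a link they never opened, while still letting them share anyway if they choose. The names and flow are hypothetical assumptions for illustration, not Facebook’s or Twitter’s actual code.

```python
# Hypothetical sketch of a "read before you share" prompt; illustrative names only.

opened_links: set[str] = set()  # links the user has actually clicked through to


def open_link(url: str) -> None:
    """Record that the user clicked through to the article."""
    opened_links.add(url)


def share_link(url: str, confirm_unread_share) -> bool:
    """Share a link, but add friction if the user never opened it."""
    if url not in opened_links:
        # The prompt does not block sharing; the user can still choose to share.
        if not confirm_unread_share(url):
            return False
    print(f"Shared: {url}")
    return True


# Example: the user is prompted because they never opened the article,
# and chooses to share it anyway.
share_link("https://example.com/story", confirm_unread_share=lambda url: True)
```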

Twitter introduced prompts urging users to read a link before retweeting it last June and the company quickly found the test feature to be successful, expanding it to more users.

Facebook began trying out more prompts like this last year. Last June, the company rolled out pop-up messages to warn users before they share any content that’s more than 90 days old in an effort to cut down on misleading stories taken out of their original context.

At the time, Facebook said it was looking at other pop-up prompts to cut down on some kinds of misinformation. A few months later, Facebook rolled out similar pop-up messages that noted the date and the source of any links they share related to COVID-19.

The approach demonstrates Facebook’s preference for a passive strategy of nudging people away from misinformation and toward its own verified resources on hot-button issues like COVID-19 and the 2020 election.

While the jury is still out on how much of an impact this kind of gentle behavioral shaping can make on the misinformation epidemic, both Twitter and Facebook have also explored prompts that discourage users from posting abusive comments.

Pop-up messages that give users a sense that their bad behavior is being observed might be where more automated moderation is headed on social platforms. While users would probably be far better served by social media companies scrapping their misinformation and abuse-ridden existing platforms and rebuilding them more thoughtfully from the ground up, small behavioral nudges will have to do.


State AGs tell Facebook to scrap Instagram for kids plans – TechCrunch


In a new letter, attorneys general representing 44 U.S. states and territories are pressuring Facebook to walk away from new plans to open Instagram to children. The company is working on an age-gated version of Instagram for kids under the age of 13 that would lure in young users who are currently not permitted to use the app, which was designed for adults.

“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” the coalition of attorneys general wrote, warning that an Instagram for kids would be “harmful for myriad reasons.”

The state attorneys general call for Facebook to abandon its plans, citing concerns around developmental health, privacy, and Facebook’s track record of prioritizing growth over the well-being of children on its platforms. In the letter, they delve into specific worries about cyberbullying, online grooming by sexual predators, and algorithms that showed dieting ads to users with eating disorders.

Concerns about social media and mental health in kids and teens are something we’ve been hearing more about this year, as some Republicans join Democrats in coalescing around those issues, moving away from the claims of anti-conservative bias that defined tech politics during the Trump years.

Leaders from both parties have been openly voicing fears over how social platforms are shaping young minds in recent months amidst calls to regulate Facebook and other social media companies. In April, a group of Congressional Democrats wrote Facebook with similar warnings over its new plans for children, pressing the company for details on how it plans to protect the privacy of young users.

In light of all the bad press and attention from lawmakers, it’s possible that the company may walk back its brazen plans to boost business by bringing more underage users into the fold. Facebook is already in the hot seat with state and federal regulators in just about every way imaginable. Deep worries over the company’s future failures to protect yet another vulnerable set of users could be enough to keep these plans on the company’s back burner.
