
Meta updates its teen safety measures in response to fresh complaints.

Meta said on Tuesday that it is expanding its youth safety efforts by introducing new settings for teenage Facebook and Instagram users, including content restrictions and the hiding of search results for terms associated with self-harm and suicide.

The parent company of Instagram and Facebook said the new settings will complement its existing suite of more than thirty parental control and well-being features designed to protect young users.

Tuesday's announcement follows months of heightened scrutiny of Meta over its platforms' possible effects on teenage users.

During a Senate hearing in November, Arturo Bejar, a former Facebook employee turned whistleblower, testified that CEO Mark Zuckerberg and other senior Meta officials disregarded years of warnings about the risks their platforms, particularly Instagram, posed to minors. Bejar raised particular concerns about strangers sexually harassing teenagers on Instagram.

In the same month, unsealed court filings in a lawsuit against the company made public internal documents suggesting that Zuckerberg repeatedly blocked teen well-being initiatives.

A few weeks later, court records unsealed in an unrelated lawsuit alleged that Meta had deliberately declined to close most accounts held by children under thirteen while collecting their personal data without parental consent.

In December, the attorney general of New Mexico filed a second complaint against Meta, alleging that the company was providing a “breeding ground” for child predators.

The current pressure comes two years after another whistleblower, Frances Haugen, called Facebook’s handling of youth safety into question by disclosing a trove of internal documents. The “Facebook Papers” caused a stir among lawmakers and the public, prompting Meta and other social media companies to strengthen their safeguards for young users.

Meta said in a blog post on Tuesday that it wants “teens to have safe, age-appropriate experiences on our apps.”

Meta said it will begin removing posts dealing with self-harm, eating disorders, nudity, and restricted goods from teenagers’ feeds and stories, even when the content is shared by someone they follow.

It also said that all teenagers on Facebook and Instagram will be placed by default into its most restrictive content recommendation settings, which make it harder to encounter potentially sensitive content while searching or exploring the apps. Previously, this policy applied only to newly registered teenagers.

The company is also expanding the list of search terms related to eating disorders, self-harm, and suicide for which it hides results and instead directs users to support resources. Terms such as “self-harm thoughts” and “bulimic” will be added to that list in the coming weeks, according to Meta.

Meta said it will continue to offer resources from organizations such as the National Alliance on Mental Illness when someone posts content “related to their struggles with self-harm or eating disorders.”

Meta also said it will prompt teenagers to review their privacy and safety settings.

Teens will have the option to “turn on recommended settings” with a single tap, which will automatically adjust their settings to limit who can tag, mention, or message them, as well as who can repost their content or “remix” and share their reels.

The settings will also “help hide offensive comments,” Meta said.

Updating teenagers’ privacy settings could help address concerns raised by New Mexico Attorney General Raúl Torrez and whistleblower Bejar, both of whom have warned that adult strangers can easily message and approach underage users on Facebook and Instagram.

Tuesday’s updates supplement Meta’s existing teen safety and parental control features, which let parents monitor how much time their children spend on the company’s apps, remind teens to take breaks during long scrolling sessions, and alert parents when a teen reports another user.
