Meta to Hide Posts About Suicide and Eating Disorders From Teens’ Instagram and Facebook Feeds


Meta announced on Tuesday that it will begin removing harmful content from teens’ Instagram and Facebook feeds, including posts about suicide, self-harm, and eating disorders.

The Menlo Park, California-based social media company said in a blog post that, in addition to already declining to recommend such “age-inappropriate” content to teens, it will now also stop showing it in their feeds, even when it is shared by an account they follow.

“We want teens to have safe, age-appropriate experiences on our apps,” Meta said in the post.

Teen users’ accounts will also be placed on the most restrictive settings on Instagram and Facebook, provided they did not misrepresent their age when signing up, and they will be barred from searching for potentially harmful terms.

“For instance, consider someone who posts about their continuing struggle with suicidal ideation. This is an important story that can help de-stigmatize these concerns, but it’s a complex subject that may not be appropriate for many young people,” Meta remarked. “Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content.”

The company is currently facing lawsuits from dozens of states alleging that it harmed children and fueled the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.

“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram,” said Josh Golin, executive director of Fairplay, a children’s online advocacy group. “If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?”
