Meta Grapples with Youth Safety Concerns Amidst Growing Pressure for Action

Meta, the parent company of Facebook, Instagram, and WhatsApp, is divided internally over how to restrict content deemed harmful to young people. The company has been under fire for allowing damaging content, including material about eating disorders and self-harm, to proliferate on its platforms.

The issue was part of a broader debate within Meta about the company's duty to protect children at a time when it was promoting virtual reality, an emerging technology whose effects the company understood very little. According to one person familiar with the discussions, Meta leaders argued that parents should have the final say on whether their child uses a VR headset. In September, the company changed its age restriction to let parents create user accounts for children between the ages of 10 and 12.

Meta has responded to these criticisms by restricting some of the riskiest content; for example, it has banned pro-anorexia and self-harm hashtags. Critics argue, however, that these measures do not go far enough.

Some Meta employees believe the company should take further steps to protect young users, such as using AI to detect and remove harmful content. Others think the company should simply enforce its existing rules, which prohibit posting content that is illegal or suggestive of violence.

Tensions over the issue have escalated in recent months as lawmakers, regulators, and the public have put mounting pressure on Meta to act to protect younger users.

In October 2023, a group of state attorneys general opened an inquiry into how Meta handles youth safety on its platforms. The central question is whether Meta's failure to shield younger users from harmful content violates state consumer protection laws.

Beyond the state inquiry, regulators in the US and Europe are watching Meta closely. The Federal Trade Commission (FTC) is investigating several of the company's practices, including how it handles user data, and the European Union is considering new legislation that would hold digital companies responsible for content uploaded to their platforms.

Pressure on Meta to do more to protect young users will likely grow in the coming months. The company faces a difficult balancing act between its business interests and its users' safety, and it remains unclear how it will ultimately resolve the issue.
