
YouTube Alters Algorithms to Shield Teens from Idealized Body Content

Thursday 05 - 12:00

YouTube is overhauling its recommendation system to limit teenagers' exposure to videos that promote idealized body images and fitness standards. This shift comes in response to concerns that such content, when viewed repeatedly, can negatively impact young viewers' self-esteem and body image.

Under the new guidelines, YouTube will continue to permit 13- to 17-year-olds to access content related to fitness and body image. However, the platform's algorithms will no longer recommend additional videos on these topics based on initial viewing. The decision aims to prevent young users from spiraling into "rabbit holes" of similar content that could exacerbate issues related to body image.

Dr. Garth Graham, YouTube's global head of health, noted that while such content does not violate YouTube's existing guidelines, repeated exposure to it could distort a teenager's self-perception. "As teens develop their self-identity and standards, persistent exposure to idealized images can foster unrealistic self-expectations, potentially leading to negative self-beliefs," he explained.

The change is informed by insights from YouTube’s youth and families advisory committee, which highlighted that while individual videos might seem benign, their cumulative effect could be problematic. The new restrictions will target content that idealizes physical features or body weights, such as beauty routines that promise to alter appearance or exercise regimens aimed at achieving specific looks. Content that encourages social aggression, including intimidation, will also be restricted.

This update, which has been rolled out globally including in the UK and the US, aligns with new regulatory frameworks such as the Online Safety Act in the UK. That act mandates that tech companies safeguard children from harmful content and scrutinize how their algorithms might contribute to children's exposure to damaging material. It specifically addresses the risk posed by algorithms that can inundate children with harmful content in a short period.

Allison Briscoe-Smith, a clinician and YouTube advisor, emphasized the importance of these "guardrails" for teenagers. "Excessive exposure to content that promotes unhealthy standards can perpetuate harmful messages, affecting how teens view themselves and their place in the world."

The changes reflect a growing recognition of the impact of digital content on mental health and underscore the need for platforms to adopt more protective measures for young users.
