November 25, 2024

Meta, the social media giant behind the encrypted messaging app WhatsApp, is facing criticism from educators, lawmakers, and activists for its recent move to lower the minimum age for users from 16 to 13.

The decision has raised alarms about the risks to children, including cyberbullying, sleep disturbance, and exposure to harmful content.

Campaigners have labelled Meta’s decision as “highly irresponsible,” urging the company to reconsider its stance amidst growing concerns about the wellbeing of young users. Experts and politicians have joined forces to denounce the move, with Smartphone Free Childhood, a prominent advocacy group, leading the charge.

WhatsApp, the second most popular platform among children according to Ofcom, has often escaped the level of scrutiny directed at other social media platforms such as TikTok or Instagram. However, experts warn that closed messaging groups on WhatsApp can be just as harmful, if not more so.

The government is now contemplating imposing restrictions on the purchase of smartphones by those under 16, reflecting the increasing concern over children’s online safety.

Daisy Greenwell, co-founder of Smartphone Free Childhood, lambasted WhatsApp for prioritising profits over children’s safety, calling the decision to lower the age limit “completely tone deaf.” She highlighted the platform’s potential as a gateway to riskier social media apps and its role in exposing children to cyberbullying and inappropriate content.

Vicky Ford, a member of the education select committee, condemned Meta’s unilateral decision, expressing concerns about the encryption on WhatsApp making it difficult to remove illegal content promptly.

Recent studies have shed light on the prevalence of cyberbullying within WhatsApp groups, with educators reporting instances of abuse occurring even in the early hours, impacting children’s sleep patterns and mental health.

Dr Kaitlyn Regehr from University College London warned that closed groups on platforms like WhatsApp often facilitate the sharing of harmful content, which can exacerbate issues such as misogyny and hate speech among young users.

While Meta defends its decision by citing global standards, critics argue that the move sends the wrong message about online safety to parents and educators. With calls for stricter regulations on social media use by minors gaining momentum, the debate over children’s digital wellbeing continues to intensify.

In a separate development, Meta has announced plans to introduce a nudity filter on Instagram to combat the rising threat of “sextortion,” aiming to protect users from exploitation and blackmail.
