- The UK government is contemplating a social media ban for children under 16, emphasising platforms’ responsibility to protect young users.
- A new study is set to explore social media’s impact on children’s mental health, adding to existing concerns about online safety.
- The research will contribute to enforcing the Online Safety Act by providing evidence for regulatory measures.
- There is international momentum to regulate social media, with Australia already planning similar bans.
- The UK government aims to embed safety into platform design and ensure transparency.
The UK government, under the guidance of Tech Secretary Peter Kyle, is considering implementing a social media ban for children under 16 if platforms fail to adhere to their duty of care. The ongoing global concerns about social media’s impact on children, particularly related to mental health issues like depression and self-harm, are driving these considerations.
A newly launched government review aims to examine more closely how social media affects young people’s wellbeing. This research initiative is intended to build on the 2019 UK Chief Medical Officers’ report, which identified links between excessive device use and mental health issues in children, though it stopped short of establishing causation.
Former Prime Minister Rishi Sunak previously considered an under-16 smartphone ban, illustrating sustained governmental interest in this issue. The current Labour government continues to explore ways to regulate children’s digital engagement, aligning with international efforts such as those in Australia, which has committed to banning social media for this age group.
Peter Kyle stated, “This research will help build the evidence base we need to keep children safe online,” highlighting the study’s role in informing policy. These efforts are underpinned by the Online Safety Act, which strengthens Ofcom’s authority to penalise social media firms that host harmful content and mandates increased safety measures to protect young users from harmful material.
Since the enforcement of the Online Safety Act, Ofcom has gathered input to optimise its enforcement strategy, with Kyle providing guidance on key priorities. These include integrating safety in platform design, ensuring corporate transparency, adapting regulations to emerging threats, and creating a resilient and inclusive digital space.
Past incidents, such as leaked internal Meta reports suggesting the company was aware of Instagram’s negative effects on teenage girls, have heightened calls for stricter regulation. Companies including Meta and TikTok have since made changes, such as improved parental controls and age verification systems. Notably, Ofcom fined TikTok for providing inaccurate information about its parental controls, signalling regulators’ willingness to act decisively.
The UK government’s investigation into social media’s effects on children marks a pivotal step towards strengthening online safety for younger users.