For the first time, tech companies are being made legally responsible for online harm against children. But do the new rules go far enough in protecting young people, and what do community practitioners (CPs) and parents need to know? Journalist Jo Waters reports.

Protecting children and young people from dangerous and inappropriate online content is the ambitious aim of the Online Safety Act (OSA) (Department for Science, Innovation & Technology (DSIT), 2023). It’s a long-awaited government move to clamp down on tech companies that fail to regulate access to their content.
The new rules for child online safety drawn up by Ofcom – the regulator of the OSA – were set to come into force on 25 July (Ofcom, 2025a). The OSA extends and applies to the whole of the UK, with a few specific variations in some countries (DSIT, 2023). From 25 July 2025 onwards, tech companies behind sites and apps used by UK children – in areas such as social media, search and gaming – will be legally responsible if their content harms children (see What the Online Safety Act says, at the end of this feature).
Ofcom (2025a) has laid down more than 40 practical measures for tech firms to meet their duties under the OSA. This followed consultation and research involving tens of thousands of children, parents, companies and experts.
Key rules include stricter age checks for apps and adjustments to algorithms so that harmful content is filtered out of children’s feeds – for example, preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Services must also take quick action when dangerous content is identified, and must act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
If tech companies don’t comply with these new rules, Ofcom has the power to impose substantial fines (up to 10% of global revenue), and, in extreme cases, even ban them from operating in the UK.
With 60% of 13- to 17-year-olds encountering ‘potentially harmful content’ online, according to Ofcom’s own research (Ofcom, 2025b), and nine in 10 children owning a phone by the time they are 11 (UK Parliament, 2024a), the OSA clearly needs to fulfil its objectives. What’s more, 20% of mobile device owners are as young as three to five (Ofcom, 2025c). The OSA, however, has already been criticised.
‘WE KNOW THAT ALGORITHMS WILL BOMBARD YOUNG PEOPLE STRUGGLING WITH THEIR MENTAL HEALTH, EVEN IF THEY ONLY SEARCHED FOR CONTENT ONCE OR TWICE’
ARE THE NEW RULES ENOUGH?
Dame Rachel de Souza, the children’s commissioner for England, says that while the OSA is a significant step forward in protecting children online, it does not go far enough. She argues that while it places stronger duties on tech companies to protect young users and address harmful content, it leaves gaps, especially around enforcement timelines and how companies will be held accountable.
‘Tech companies have proven time and time again that keeping children safe on their platforms is not a priority – they prioritise profits over safety, leaving children and parents to police the harmful content young users are frequently exposed to,’ says Rachel.
‘The act is a starting point, but not the only solution. Parents must stay engaged and talk openly about children’s online experiences, while trusted adults like community practitioners must also play a key role in spotting harm and supporting families.’
Katie Freeman-Tayler, head of research and policy at Internet Matters – a not-for-profit funded by internet providers – is optimistic that the OSA will be a positive step for families, especially for vulnerable children, who, she highlights, have more negative experiences online.
‘While the OSA will not eliminate all risks online, it should reduce children’s exposure to harmful content,’ says Katie. ‘Many platforms recommend content based on what their users view, search or click on. Under the act, those systems must not actively push harmful content to children. For example, if a child watches a video that shames a certain body shape, the platform should not keep showing more of the same. The aim is to reduce the volume of such content children are likely to see.’
Charlotte Aynsley, head of policy at the UK Safer Internet Centre (a partnership of three internet safety charities), welcomes the OSA but urges caution. She says there is some criticism that Ofcom’s code doesn’t go far enough in forcing platforms to adopt ‘safer by design’ principles. ‘It’s important to recognise that the act doesn’t deal with individual pieces of content; it deals with systems and processes,’ says Charlotte, who is also an independent safeguarding consultant. ‘The act is trying to ensure that systems and processes are risk assessed, rather than the content itself.’
THE YOUNG LIFE ONLINE
37% of three- to five-year-olds use social media – up from 29% last year
31% of 8- to 12-year-olds who go online have seen something they find worrying or nasty
52% of 11- to 14-year-old boys are aware of and have engaged with influencers in the ‘manosphere’ (based on extreme misogyny)
99% of children spend time online
Ofcom, 2025a and 2025c; UK Parliament, 2024a.
CLEAR CHALLENGES
‘Our overall view on the OSA is that it’s a really important piece of child protection legislation,’ says Rani Govender, the NSPCC’s policy manager for child safety.
‘Up until we had the act, we were basically relying on tech companies self-regulating themselves and implementing safeguards for children, but that clearly failed with really disturbing consequences,’ says Rani.
‘We saw increasing levels of child sexual abuse online taking place year on year, and children exposed to really dangerous content which can have a really significant impact on their wellbeing and their mental health.’
Rani says that Ofcom has started implementing the OSA quickly, introducing core duties on services to tackle illegal harm, and content which is legal, but harmful for children (as the OSA outlines). ‘We have seen some really important changes coming forward and a shift in responsibility firmly on services to keep their sites safe,’ she says.
Challenges remain though, Rani highlights, in both the OSA and Ofcom’s approach to enforcing the legislation. ‘Ofcom does have strong enforcement powers, which we really support, including fining tech companies. [But] we need to make sure that the government back the regulators to be really strong with these enforcement powers,’ says Rani.
The NSPCC has also detected a gap in the OSA, she reveals: ‘We don’t think there is enough focus on children having age-appropriate experiences, particularly around the issue of enforcing minimum age limits,’ says Rani. ‘Often websites will have a minimum age of 13 and they are not enforcing these effectively at the minute. This is allowing young children to be in spaces where they shouldn’t be.’
At the time of speaking, Rani highlighted that Ofcom had not yet recommended any measures requiring tech companies to protect children from live streaming events, but she was hopeful this would be covered in later updates. As of 30 June, Ofcom has now included proposals for such measures, with publication expected by next summer (see updates).
The NSPCC would also like to see formal, established and funded mechanisms in place to ensure children’s voices can be listened to and fed into the decision-making process. Addictive functions are one of the issues young people raise. ‘One of the central gaps that is emerging is the challenge around the so-called “addictive nature” of these services,’ explains Rani. ‘We know that services design their platforms intentionally in a way to retain and increase user engagement, with functions such as endless scrolling and persistent notifications to get you to reopen an app.
‘At the moment, these are not within Ofcom’s remit, but we know from what children and young people raise with us that this really impacts their experience of these apps and their wellbeing.’
THE GREATEST DANGERS
Some of the most disturbing online content promotes suicide. Andy Burrows is chief executive of the Molly Rose Foundation, which was set up in memory of Molly Russell who died by suicide aged 14 in 2017 after viewing thousands of images promoting suicide. He says the charity is ‘bitterly disappointed’ by how Ofcom has implemented the OSA. ‘Where we need to see ambition from the regulator, we have timidity and a set of sticking-plaster approaches,’ he says.
Andy says it is clearly significant that we do now have online safety legislation on the statute books. But he also says that the first set of measures Ofcom states it wants companies to take – covering both illegal harms such as child sexual abuse and harmful content for children – have substantial omissions.
‘A really obvious example is live streaming and video chat, which is very high risk for children across a range of harms, not least grooming,’ says Andy. ‘It seems inexplicable that Ofcom themselves describe these as “high risk” but then fail to require platforms to do anything about it.’ (Ofcom has since made additional proposals on live streaming, for next summer.)
Similarly to the NSPCC, Andy says the charity is also concerned about a lack of control on auto scrolling, where young people are fed a never-ending stream of recommended posts and videos. He says these are designed to keep a young person on the platform for as long as possible, so they don’t want to leave and miss something.
‘I think what is of greatest concern to us is Ofcom’s failure to go far enough in controlling the algorithmic spread of harmful content. We know that Molly Russell was recommended 2,000 pieces of suicide and self-harm content in the six months before she died, and that precious little has changed in the seven years since her death. At the moment, we are losing one child aged 10 to 19 every week to suicide where technology plays a role, and that’s likely an underestimate.
‘We know that algorithms will bombard young people struggling with their mental health, even if they only searched for content once or twice.’
PARENTS NEED TO BE VIGILANT
With critics saying the OSA has more to do to protect children and young people, what should parents do in the meantime, and can you help advise them?
‘We wouldn’t allow our children to sit in a room with 200 strangers, but somehow we feel it’s okay to let them online where strangers can potentially access their details and message them’, says CPHVA Executive member Rhian Ogden.
Rhian says parents need to be made more aware of the risks of letting their child join private messaging groups, playing games online or having uncontrolled access to online content.
‘I think if parents are better informed about the potential risks, they might make an informed decision on how early to give a child access to a smartphone or the internet,’ says Rhian, who is also a lecturer in child nursing at the University of Leeds.
CPHVA Executive member Kate Phillips agrees it’s important for parents to learn about parental controls (see Resources) that the OSA says apps and digital platforms need to have to protect children.
Kate says CPs can advise parents on having open, honest and age-appropriate conversations with their children about online safety. These can cover the importance of monitoring what content they are viewing, as well as setting up passwords and other parental controls.
‘I also think parents need to be vigilant when their children are reaching a more independent age and have their own smartphone devices and access to messaging platforms,’ says Kate, who is also an HV and lecturer in child nursing at the University of Leeds.
‘There should be clear boundaries set between that child and parent,’ Kate continues. ‘Young people* should be aware parents will have visible access to their electronic devices so that parents can be fully informed of what is being sent, received and accessed. This is particularly important if there has been a change in their child’s behaviour or mood, or a decline in their school performance, which might be an indication that they are experiencing potential harm or abuse.’
*up to 18, or beyond depending on their awareness of risk and perceived level of vulnerability.
HOW YOU CAN HELP
CPHVA Executive member Rhian Ogden, a lecturer in child nursing at the University of Leeds, says the online safety issues targeted by the OSA are a good start but tackle only the ‘tip of an iceberg of issues’.
‘This is all totally unprecedented,’ says Rhian. ‘We’ve never had to manage a generation of children and young people who have such an online presence and are as exposed as they are.’
Rhian says CPs need to be aware of the OSA so they can advise young people and parents of younger children about the new safety rules. ‘They might hear something from somebody they are working with that raises concerns and can then signpost them on how to make a complaint and/or get material taken down, for instance.
‘CPs can also make young people aware of the law within the OSA. If they are involved in creating, distributing or sharing inappropriate content that has been sent to them on a social messaging app – such as nude images – that is a criminal offence and can be investigated by police.’
Rhian says that it can be overwhelming for parents to navigate. ‘Health professionals can help with this (see Parents need to be vigilant, above) but we need to be educated, and up to date to be able to signpost them to the right information.’
Kate Phillips, a health visitor and lecturer in child nursing at the University of Leeds and fellow CPHVA Executive member, agrees. ‘CPs are in a key position to give advice to parents, but the potential risks are so new and evolving. I think CPs need to be upskilled and educated, so that they can give fully informed evidence-based information. This should be a priority for the governing health departments in each UK country.’
Kate expands: ‘Parental awareness and health literacy around the issue of online safety should be improved and prioritised, and this public health approach should be embedded as part of the different healthy child programmes implemented throughout the four nations, to ensure online safety discussions form part of SCPHN preventive work.’
‘CPs ARE IN A KEY POSITION TO GIVE ADVICE TO PARENTS, BUT THE POTENTIAL RISKS ARE SO NEW AND EVOLVING. CPs NEED TO BE UPSKILLED AND EDUCATED’
WHAT ABOUT NEW THREATS?
Dame Melanie Dawes, Ofcom chief executive, says of the OSA: ‘These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.
‘Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act, they will face enforcement,’ Melanie adds.
But Andy fears the online situation is getting worse with the rapid introduction of artificial intelligence (AI) and deepfakes – as AI products are being pushed out without even basic AI safeguarding, he says.
‘Ultimately, we think the government needs to step in and fix this. There is a preventable harm that is happening on its watch. Ofcom need to go much further and faster,’ he adds. ‘We want to see a commitment from the prime minister for a new OSA in 2026.’
Charlotte Aynsley from the UK Safer Internet Centre agrees that harm changes over time and says it’s important to keep a watching brief on AI. ‘We have to be vigilant, we can’t think we’ve done this – tick – and don’t need to worry about it anymore. The act is a good start, but we’d like to see the next iteration and for it to go further.’
WHAT THE ONLINE SAFETY ACT (OSA) SAYS
Measures include:
Safer feeds Personalised recommendations are children’s main pathway to encountering harmful content online. Any provider that operates a recommender system and poses a medium or high risk of harmful content must configure its algorithms to filter out harmful content from children’s feeds.
Solid age checks The riskiest services must use highly effective age assurance to identify which users are children. This means they can protect them from harmful material, while preserving adults’ rights to access legal content.
Fast action All sites and apps must have processes in place to review, assess, and quickly tackle harmful content when they become aware of it.
Support for children Sites and apps are required to give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, block and mute accounts and disable comments. There must be supportive information for children who have encountered or searched for harmful content.
Easier complaints Children will find it straightforward to report content or complain, and providers should respond with appropriate action.
Strong governance All services must have a named person accountable for children’s safety, and a senior body should annually review the management of risk to children.
Source: Ofcom, 2025a
Also see implementation plans
WHAT DOES META SAY?
The social technology company’s apps include Facebook, Instagram and WhatsApp.
In December 2024, Meta submitted written evidence in response to the Science, Innovation and Technology Committee inquiry into social media, misinformation and harmful algorithms (UK Parliament, 2024b). Their response included:
‘We have been supportive of the act as we believe the framework largely avoids many of the mistakes seen in some other jurisdictions where policymakers have tried to too tightly prescribe what must be done in relation to certain specific harms or focus down on individual instances of harmful content online.
‘Our experience is that the more prescriptive the regulatory requirements and the greater the focus on individual pieces of content, the slower and more complex the decision-making for content reviews.’
RESOURCES
For CPs and to signpost young people
- Report/Remove – Childline’s online tool that allows children and young people to have nude images of themselves removed if they have been shared online
- Professionals Online Safety Helpline – UK Safer Internet Centre
- NSPCC Resources for health professionals to keep children safe online
- Talking to children about online safety
- Parental controls and support from Ofcom
- Molly Rose Foundation
This feature was compiled before the Online Safety Act came into force.
SHARE YOUR EXPERIENCES
If you have insights into helping young people and parents navigate the online world, get in touch with editor Aviva Attias aviva@communitypractitioner.co.uk
REFERENCES
Department for Science, Innovation & Technology. (2023) Online Safety Act 2023. See: legislation.gov.uk/ukpga/2023/50/contents (accessed 25 June 2025).
Ofcom. (2025a) New rules for a safer generation of children online. See: ofcom.org.uk/online-safety/protecting-children/new-rules-for-a-safer-generation-of-children-online (accessed 25 June 2025).
Ofcom. (2025b) Protecting children from harms online. See: bit.ly/4l4uhLA (accessed 25 June 2025).
Ofcom. (2025c) Top trends from our latest look at the UK’s media lives. See: ofcom.org.uk/media-use-and-attitudes/media-habits-adults/top-trends-from-our-latest-look-at-the-uks-media-lives (accessed 25 June 2025).
UK Parliament. (2024a) The impact of smartphones and social media on children. See: commonslibrary.parliament.uk/research-briefings/cdp-2024-0103/ (accessed 27 June 2025).
UK Parliament. (2024b) Written evidence submitted by Meta (SMH0037). See: committees.parliament.uk/writtenevidence/132928/pdf/ (accessed 27 June 2025).
Image | Freepik