The regulator Ofcom has today announced 40 practical measures for tech firms to meet their duties under the Online Safety Act.
These will apply to sites and apps used by UK children in areas such as social media, search and gaming. This follows consultation and research involving tens of thousands of children, parents, companies and experts.
The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
The guidelines also address personalised recommendations, which are children’s main pathway to encountering harmful content online. Any provider that operates a recommender system and poses a medium or high risk of showing harmful content must configure its algorithms to filter that content out of children’s feeds.
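By way of illustration only, a minimal sketch (not anything prescribed by Ofcom) of how a provider might strip flagged items from a child’s feed could look like the following, where the category labels, the Item fields and the is_child flag are all assumed names rather than a real taxonomy or API:

    from dataclasses import dataclass

    # Illustrative labels for the kinds of content Ofcom identifies as harmful
    # to children; a real provider would use its own classification scheme.
    HARMFUL_CATEGORIES = {"suicide", "self_harm", "eating_disorder", "pornography",
                          "violent", "hateful", "abusive", "dangerous_challenge"}

    @dataclass
    class Item:
        item_id: str
        categories: set[str]  # labels attached by the provider's own content classifiers

    def filter_feed_for_child(ranked_items: list[Item], is_child: bool) -> list[Item]:
        """Drop items carrying harmful-content labels before serving a child's feed."""
        if not is_child:
            return ranked_items
        return [item for item in ranked_items
                if not (item.categories & HARMFUL_CATEGORIES)]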
The measures will also see the introduction of effective age checks.
The riskiest services must use highly effective age assurance to identify which users are children. This allows them to protect children from harmful material while preserving adults’ rights to access legal content. That may involve preventing children from accessing the entire site or app, or only some parts or kinds of content. If services have minimum age requirements but are not using strong age checks, they must assume younger children are on their service and ensure they have an age-appropriate experience.
All sites and apps must have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
Sites and apps are also required to give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts, and to disable comments on their own posts. There must also be supportive information for children who may have encountered, or searched for, harmful content.
Children must also find it straightforward to report content or complain, and providers should respond with appropriate action. Terms of service must be clear enough for children to understand, every service must have a named person accountable for children’s safety, and a senior body should annually review the management of risks to children.
Ofcom Chief Executive Melanie Dawes said the changes would mean safer social media feeds for children, “with less harmful and dangerous content, protections from being contacted by strangers, and effective age checks on adult content”.