Two-thirds of teens and young adults have recently encountered at least one potentially harmful piece of content online, but only around one in six go on to report it, Ofcom has found.

The findings come as the Government’s Online Safety Bill continues to make its way through Parliament. Ofcom has said that it will enforce these new laws, and has already started regulating video-sharing platforms established in the UK – such as TikTok, Snapchat and Twitch.

The regulator has joined forces with social media influencer Lewis Leigh and behavioural psychologist Jo Hemmings to launch a new campaign. The social media campaign aims to reach young people on the sites and apps they use regularly, highlighting the importance of reporting posts they may find harmful.

Ofcom’s Online Experiences Tracker reveals that most younger people, aged between 13 and 24, believe the overall benefits of being online outweigh the risks. But around the same proportion have encountered potentially harmful content.

Younger people told Ofcom that the most common potential harms they came across online were: offensive or ‘bad’ language (28%); misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%); and trolling (17%).

A significant number of young people also encountered bullying, abusive behaviour and threats; violent content; and hateful, offensive or discriminatory content targeted at a group or individual based on their specific characteristics.

But the research reveals a worrying gap between those who experience harm online and those who flag or report it to the services. Fewer than one in five young people take action to report potentially harmful content when they see it.

Younger participants said the main reason for not reporting was that they didn’t see the need to do anything (29%), while one in five (21%) did not think it would make a difference. Over one in ten said they didn’t know what to do, or whom to inform.

User reporting is one important way to ensure more people are protected from harm online. For example, TikTok’s transparency report shows that of the 85.8 million pieces of content removed in the last quarter of 2021, nearly 5% were removed as a result of users reporting or flagging them. In the same period, Instagram reported 43.8 million content removals, of which about 6.6% resulted from user reports or flags.

“With young people spending so much of their time online, the exposure to harmful content can unknowingly desensitise them to its hurtful impact. People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world.

“What is clear from the research is that while a potential harm experienced just once may have little negative impact, when experienced time and time again, these experiences can cause significant damage. Worryingly, nearly a third of 13- to 17-year-olds didn’t report potentially harmful content because they didn’t consider it bad enough to do something about. This risks a potentially serious issue going unchallenged.

“That is why I’m working with Ofcom to help encourage people to think about the content they or their children are being exposed to online, and to report it when they see something harmful, so that the online world can be a safer space for everyone,” said behavioural and media psychologist Jo Hemmings.

Ofcom’s new campaign, which launches today, aims to help address this lack of reporting. It will feature TikTok influencer Lewis Leigh, who rose to fame during lockdown with his viral TikTok videos showing him teaching his nan, Phyllis, dance moves.

The campaign aims to show young people that, by taking a moment to stop, think and flag problematic content rather than scrolling past, they can make an important difference in helping to keep their online communities safer.
