Ofcom is the UK’s communications watchdog, responsible for online safety, and the regulation it enforces under the Online Safety Act potentially affects thousands of companies and platforms around the world. The Act is now law, and Ofcom’s guidance under it particularly affects the friend-finding industry: it says companies should not include underage users in ‘suggested friends’ functions on social apps, as part of the fight against child sexual exploitation and abuse.
Ofcom has damning figures showing that one in ten 11-18 year olds have been sent naked or semi-naked images over online platforms. Online social spaces are places where many young people spend a great deal of time, so unfortunately bad actors and perpetrators naturally flock there too. To help prevent the grooming of young people, Ofcom’s guidance says ‘suggested friends’ functions involving underage users should be removed from apps, because they often help facilitate grooming.
This also highlights why so many of the friend-finding and social discovery apps we have reported on here put a huge emphasis on age verification during profile set-up. It can be annoying for users to jump through extra hoops to prove they are the age they say they are, but proper verification helps ensure that users are appropriately matched and only exposed to a suitable audience on these platforms. Ultimately, it’s about making social networks and platforms safe spaces where people can connect and share securely.
Ofcom’s regulations will affect thousands of platforms in the UK and abroad, including an estimated 20,000 small businesses that will need to comply. While compliance could prove difficult for some companies, it is important. Dame Melanie Dawes, Chief Executive of Ofcom, said:
“It isn’t the job of a regulator to be loved by everybody. That’s impossible. And it’s not what we ever aim for, but it is our job to be proportionate. And to make sure that what we require is evidenced and has been backed up by proper facts.”
Ofcom’s thousand-plus pages of regulation also include guidance on platform moderation teams, the accessibility of complaints and reporting, the removal of terrorist-backed accounts, and the identification of child abuse websites and images. Some have criticised the regulation, suggesting it does not push safety forward but merely enshrines what the major players in the space are already doing.
Social discovery apps nearly all share the same goal: helping people establish meaningful connections that boost their mental health. That’s why security and safety on these platforms have to be a major priority. Where they are lacking, these apps can unfortunately become sources of distress and trauma.