Ofcom given new social media powers to further protect children online

  • Last week the UK government officially announced that media regulator Ofcom will be given the power to fine social media platforms that fail to protect users from harmful content, in a bid to protect children online in particular.

    This was detailed in the government’s updated Online Harms White Paper. The paper was first published in April 2019, and its proposals were set out in more depth last Wednesday.

    Ofcom will be able to ensure companies remove illegal content from their sites and prevent it from reappearing, with a specific focus on child abuse imagery and terrorism.

    The regulatory body will also require social networks to take action against harmful content banned under their own terms and conditions. For example, if a site says that promotion of self-harm is banned, it will be required to take action to enforce that.

    According to the Guardian, “the government wants the ability to use penalties to encourage speedy enforcement, and discourage companies from deliberately turning a blind eye to their own platforms.”

    RELATED: Over half of NI kids have seen hateful content online

    The newspaper added that social networks have warned that users might notice the platforms “becoming more censorious” as the “requirements on them to take down content quickly runs the risk of encouraging them to remove false positives – material that is not actually infringing, but looks like it might be close.”

    Ofcom's own recent study found that 62% of children in Northern Ireland claim to have seen hateful content online, to which a spokesperson for NSPCC Northern Ireland responded: “It is sadly unsurprising that Northern Ireland parents feel the internet does more harm than good when social networks’ algorithms are designed to push even the most dangerous suicide and self-harm content at children."

    RELATED: NI stresses importance of cyber security on #SaferInternetDay2020

    Since the initial white paper release last April, the National Society for the Prevention of Cruelty to Children (NSPCC) estimates that 90 online child sexual offences have taken place every day. The charity has been campaigning for statutory regulation of social networks since 2017.

    Its CEO Peter Wanless commented: "The government has signalled they are willing to stand up to Silicon Valley and commit to landmark British regulation that could set a global standard in protecting children online.

    “For far too long the safety of children has been an inconvenience for social media companies who have left them exposed to harmful content, grooming and abuse. Tech giants will only do all that they should to stop groomers abusing children on their sites if the penalties for failure are game-changing.

    RELATED: ICO publishes new online Code of Practice to protect the privacy of children

    “Ministers must now move urgently to get a proactive duty of care onto the statute books that give Ofcom the powers to lift up the bonnet on social networks, impose hefty fines on rogue companies and hold named directors criminally accountable for putting children at risk.”

    In early 2019 the charity published a detailed blueprint for new social media laws, which included an independent regulator, safe accounts for children and detailed reporting on how the social platforms are keeping children safe.

    About the author

    Niamh is a Sync NI writer with a previous background of working in FinTech and financial crime. She has a special interest in sports and emerging technologies. To connect with Niamh, feel free to send her an email or connect on Twitter.
