Tech firms are set to face fines from Ofcom for showing potentially harmful videos online, in the Government's first official crackdown on social media.
The proposal would give Ofcom the power to impose multi-million pound fines on companies if it judges that platforms have failed to prevent youngsters from seeing 'harmful' content, including pornography, violence and child abuse.
The broadcasting watchdog is set to take charge of the matter from 19 September 2020. The role may not be required, however, if Brexit takes place in October, as the move is designed to meet the UK's obligations to the EU.
A spokesman for the Department for Digital, Culture, Media and Sport said: "The implementation of the AVMSD [Audiovisual Media Services Directive] is required as part of the United Kingdom’s obligations arising from its membership of the European Union and until the UK formally leaves the European Union all of its obligations remain in force. If the UK leaves the European Union without a deal, we will not be bound to transpose the AVMSD into UK law."
The regulator will be able to penalise firms that fail to establish robust age verification checks and parental controls that ensure young children are not exposed to video content that “impairs their physical, mental or moral development.”
The Telegraph originally reported that the proposal was "quietly" agreed before Parliament's summer break and would give Ofcom the power to fine tech firms up to 5% of their revenues and/or "suspend or restrict" them in the UK if they failed to comply with its rulings.
Ofcom's appointment is an interim measure, in place until a separate online harms regulator is established.
Ofcom is ready to accept the role, with a spokeswoman telling the BBC: "These new rules are an important first step in regulating video-sharing online, and we'll work closely with the government to implement them. We also support plans to go further and legislate for a wider set of protections, including a duty of care for online companies towards their users."
Daniel Dyball, the Internet Association's executive director, said: "Any new regulation should be targeted at specific harms, and be technically possible to implement in practice - taking into account that resources available vary between companies."
That hope for proportionate intervention was seconded by TechUK, the industry group that represents the technology sector.
It is often argued within the tech industry that the sheer volume of video posted daily across platforms makes manual, individual review impractical. YouTube has previously tried to tackle the issue with YouTube Kids, an app designed to let children watch videos in a more contained environment. Even so, the system remains susceptible to flaws.
“We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly,” the YouTube Kids landing page says. “But no system is perfect and inappropriate videos can slip through.”
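The layered approach YouTube describes, with automated filters up front and user flags feeding human reviewers, can be illustrated with a minimal sketch. Everything below (the function names, keyword list and flag threshold) is a hypothetical illustration of such a pipeline, not YouTube's actual implementation.

```python
# Hypothetical sketch of a layered moderation pipeline: an automated
# filter screens uploads first, and viewer reports escalate videos to
# human review. All names and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    title: str
    user_flags: int = 0  # count of "inappropriate" reports from viewers


BLOCKED_KEYWORDS = {"violence", "gore"}  # illustrative filter list
FLAG_THRESHOLD = 3                       # reports before human review (assumed)


def automated_filter(video: Video) -> bool:
    """First layer: a cheap keyword screen over the title."""
    return not any(word in video.title.lower() for word in BLOCKED_KEYWORDS)


def needs_human_review(video: Video) -> bool:
    """Second layer: escalate once enough viewers have flagged the video."""
    return video.user_flags >= FLAG_THRESHOLD


def moderate(video: Video) -> str:
    if not automated_filter(video):
        return "blocked"
    if needs_human_review(video):
        return "queued_for_review"
    return "allowed"  # may still be wrong: "no system is perfect"


if __name__ == "__main__":
    print(moderate(Video("a1", "Fun craft ideas")))           # allowed
    print(moderate(Video("b2", "Cartoon violence clip")))     # blocked
    print(moderate(Video("c3", "Odd video", user_flags=5)))   # queued_for_review
```

As the sketch suggests, the automated layer catches only what it is programmed to recognise, which is why inappropriate videos can "slip through" until viewers flag them for a human.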
Andy Burrows, the NSPCC's head of child safety online policy, welcomed the news: "Crucially, this is a real chance to bring in legislative protections ahead of the forthcoming Online Harms Bill and to finally hold sites to account if they put children at risk."
Public demand for the prosecution of social media bosses over child safety breaches has been building for some time. A poll conducted by the NSPCC in April 2019 found that more than three quarters of British adults said directors of tech giants should be prosecuted if they breached the proposed new statutory duty of care requiring firms to protect children from online harms.
A more recent NSPCC survey published in July 2019 revealed that nine in ten children also agreed that tech firms have a legal responsibility to keep them safe online.
Source: BBC News