Online Harms Response: Balancing The Net – Speech, Systems & Specifics

Our Policy Director Claire Levens shares her insights on the UK Government’s recent consultation response to the Online Harms White Paper, which sets out plans to protect users, especially children, from a range of harms.

Published in the lull between Safer Internet Day and the Cabinet Reshuffle, the Government’s response to the Online Harms White Paper is a placeholder. It’s a product of its time – light-touch in places, reflecting the complexity of the issues the white paper addressed, the opportunity cost of Brexit and the Ministerial musical chairs within DCMS over the last few years.

What are the plans for the new regulator?

Perhaps the most substantive part of the document is the nod to OFCOM as the future online harms regulator. The government is ‘minded’ to give this role to OFCOM, the paper says. We think this is the right decision for a number of reasons, not least that OFCOM already has many of the relationships it will need to succeed in this role.

Of course, the success criteria are yet to be defined, and the regulator’s ability to meet them will be determined almost exclusively by the level of staff expertise it can call on and the funding it will have. The scope and scale of the regulator are worthy of a blog in themselves…

How will the regulator balance freedom of speech and harm?

Now we know who is regulating, we need to know what they are regulating. Clearly there is a balance to be struck between freedom of speech and harm. Where the harm is illegal, it is in some ways easier to deal with. If content has an element of child sexual exploitation or radicalisation within it, it has to be taken down, and fast. That is the right, good and proper way.

The challenge is, and has always been, what to do about content that is legal but harmful – the Online Harms White Paper sets out this point very early on. The approach DCMS has taken is to focus on the systems and processes of the companies that search for and show user-generated content. In other words, you have to enforce your own terms and conditions. This too is right, good and proper. Some companies are working on this – others clearly have more to do.

What will the new regulations require companies to do?

Today’s publication clarifies this further – the new regulatory framework will not require the removal of specific pieces of legal content. Rather, the focus will be on companies ensuring content is compliant with their own, self-set community standards. This is useful, as it protects freedom of speech – there is no regulator determining what can and can’t be said. Consequently, appropriateness will be determined by the platforms, which will have the responsibility to remove non-compliant content; if you are the parent of a child upset by content that is deemed compliant, there will be little you can do.

Why is investment in education and behaviour change just as important?

Let’s be clear, that is a policy choice, and one that, in and of itself, can only ever be partially successful. Why? Because unless and until there is a comprehensive and concerted effort to change our behaviour online – so that keyboard warriors are not at liberty to threaten our politicians and bullies cannot make a vulnerable child’s life a living hell – it’s unlikely we’ll make a meaningful difference.

We have to spend the time and invest the money in understanding how to educate children, families and professionals in how to engage well online. We have to stop kidding ourselves that one-off assemblies in schools, delivered by often under-qualified people, are a suitable model for digital wellbeing. We have to inspire and engage parents, teachers and professionals to engage in the online lives of the children in their care, and to make online harm as socially unacceptable as drink driving.

Of course, we need the regulator to focus on what the tech companies are doing, and of course they need to do more. But we need a three-pronged focus here, otherwise we miss the most challenging element – managing our own behaviour. It’s incredibly hard, subject to ongoing mistakes and will no doubt offend some people. It’s also a critical part of the solution.

Internet Matters looks forward to continuing to work with OFCOM in its expanded role. There’s much to do.

Resources

Read the Online Harms White Paper – Initial consultation response
