What online safety standards will be introduced?
- A new statutory ‘duty of care’ to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.
- Further stringent requirements on tech companies to ensure child abuse and terrorist content is not disseminated online.
- Giving a regulator the power to force social media platforms and others to publish annual transparency reports on the amount of harmful content on their platforms and what they are doing to address this.
- Making companies respond to users’ complaints, and act to address them quickly.
- Codes of practice, issued by the regulator, which could include measures such as requirements to minimise the spread of misleading and harmful disinformation with dedicated fact checkers, particularly during election periods.
- A new “Safety by Design” framework to help companies incorporate online safety features in new apps and platforms from the start.
- A media literacy strategy to equip people with the knowledge to recognise and deal with a range of deceptive and malicious behaviours online, including catfishing, grooming, and extremism.
How will the new online safety standards be enforced?
As part of the Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport and the Home Office, a new independent regulator will be introduced to ensure companies meet their responsibilities.
The Government says the regulator will have effective enforcement powers – and we look forward to seeing the details in the consultation paper. Internet Matters will be responding to the consultation and we will publish our response here.
Why are these laws being proposed?
Prime Minister Theresa May said:
“The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.”
Making the internet safe for all
Digital Secretary Jeremy Wright said:
“We want the UK to be the safest place in the world to go online, and the best place to start and grow a digital business, and our proposals for new laws will help make sure everyone in our country can enjoy the internet safely.”
Education and awareness still key to supporting parents
Carolyn Bunting, CEO, Internet Matters, said:
“We support the government’s desire to make the UK the safest place to be online. The internet simply wasn’t built with children in mind, so it is vital that government plays a greater role in determining and setting standards for the services that children commonly use, and that industry responds quickly and effectively.
“Proactive regulation and better technical solutions, whilst welcomed, are just one part of the solution. We have to help parents to have greater awareness and understanding of their child’s digital wellbeing. It would be unfair to leave those parents or guardians to figure it out for themselves. Instead, we must make available as many accessible, simple resources for parents as possible, based on expert advice, making it as easy as possible for them to understand.”
Tech to be used as part of the solution to a safer internet
Recognising that the internet can be a tremendous force for good, and that technology will be an integral part of any solution, the new plans have been designed to promote a culture of continuous improvement among companies. The new regime will ensure that online firms are incentivised to develop and share new technological solutions, like Google’s “Family Link” and Apple’s Screen Time app, rather than just complying with minimum requirements.