The UK government is taking a hard line when it comes to online safety, appointing what it claims is the world’s first independent regulator to keep social media companies in check.
Companies that fail to meet the requirements will face substantial fines, and senior directors proven to have been negligent in their responsibilities could be held personally liable. Offending companies may also find access to their sites blocked.
The new measures, designed to make the internet a safer place, were announced jointly by the Home Office and the Department for Digital, Culture, Media and Sport. The introduction of the regulator is the central recommendation of the highly anticipated government white paper, published early Monday morning in the UK.
The regulator will be tasked with ensuring social media companies are tackling a range of online problems, including:
• Inciting violence and spreading violent content (including terrorist content)
• Encouraging self-harm or suicide
• The spread of disinformation and fake news
• Cyberbullying
• Children accessing inappropriate material
• Child exploitation and abuse content
As well as applying to the major social networks, such as Facebook, YouTube and Twitter, the requirements will also have to be met by file-hosting sites, online forums, messaging services and search engines.
“For too long these companies have not done enough to protect users, especially children and young people, from harmful content,” said Prime Minister Theresa May in a statement. “We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.”
The government has yet to decide whether to hand the job to an existing regulator or create a brand-new one purely for this purpose. Initially the regulator will be funded by the tech industry, and the government is considering a levy on social media companies to cover its costs over the longer term.
“The era of self-regulation for online companies is over,” said the government’s Digital Secretary Jeremy Wright in a statement. “Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.”