Social media companies are failing in their duty of care to protect children from harm, the UK House of Commons culture committee has found; it is demanding a statutory code of conduct and a powerful regulator.
After a year-long inquiry, the committee says social media giants such as Facebook have behaved like “digital gangsters” and that new laws are needed to replace self-regulation, which has failed to stop children from being harmed or their privacy from being abused.
Writing for The Daily Telegraph on the eve of a visit to tech bosses in the US, Jeremy Wright, the UK Culture Secretary, declared: “The era of self regulation is over.”
MPs have proposed a “clear, legal liability” for tech companies to act against harmful and illegal content on their platforms. It is the second Commons committee to back such laws, following the science and technology committee. Royal colleges, the Children’s Commissioner, the Church of England, leading charities such as the NSPCC and teachers’ leaders have all supported the campaign.
The culture committee wants its plans included in the Government’s White Paper next month which will detail how ministers could regulate social media.
Wright says “the time has come for the tech companies to be properly accountable” and confirmed he and Sajid Javid, the UK Home Secretary, were “seriously considering” a statutory duty of care.
“We will be making clear that we won’t stand by to see people unreasonably and unnecessarily exposed to harm. If it wouldn’t be acceptable offline then it should not be acceptable online,” he writes.
Earlier, asked on the BBC’s Andrew Marr Show whether tech giants should face criminal sanctions, he said: “We will consider all possible options. It’s important that those companies understand there are meaningful sanctions available to us if they don’t do what they should.”
The culture committee, which oversees the tech industry, said a compulsory Code of Ethics would set out what constituted harmful content, from self-harm to cyberbullying, and would not just cover illegal material such as child abuse imagery or terrorist content.
Companies would be obliged to take down harmful content promptly and to have internal systems in place to flag and remove such harms.
A regulator would have powers to take legal action if the code was breached, impose large fines and raid the firms to gather evidence.
This could include checking data firms had on individuals, inspecting security mechanisms and examining algorithms to ensure they were operating responsibly. Damian Collins, the committee’s Conservative chairman, said: “We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end.
“The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.”
The committee also proposed that the social media companies should be redefined in law to make them legally liable for content identified as harmful after it had been posted by users.
“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” it said.
The committee proposes a new category of tech company, which “tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher’.”