The UK government said on Monday that, under a series of new online safety proposals, social media companies will be held responsible for harmful content posted on their platforms.
The plan, set out in a policy paper, also calls for the creation of an independent regulator to address a range of harmful content, from incitement to violence and suicide to the spread of disinformation and cyberbullying.
The move follows similar measures in Australia and Singapore to combat fake news and compel social media companies to play a role in curbing the spread of harmful content online.
Social networking sites such as Facebook, as well as messaging services like WhatsApp, search engines such as Google, and public forums like Reddit, would be in the front line of the UK government's plans.
It plans to oblige these companies to take reasonable measures to protect their users from encountering harmful or illegal content, including incitement to violence, dissemination of violent material (including terrorist content), encouragement of self-harm or suicide, the spread of disinformation, and online harassment.
Content related to terrorism and to the sexual exploitation and abuse of children will be subject to the most stringent measures. Social platforms will also have to make it easier for users to report content and file complaints, and will be required to publish annual transparency reports.
The issue gained additional urgency with Facebook's failure to immediately stop the March 15th livestream of attacks on two mosques in New Zealand, in which a self-identified white supremacist killed 50 people.
British Prime Minister Theresa May warned technology companies that they have “not done enough” to protect users, and said her government intends to place a legal responsibility on companies to ensure people's safety.
“For a long time, these companies have not done enough to protect users, especially children and young people, from harmful content,” she said in a statement.
“It’s time to do things differently. Online companies must start taking responsibility for their platforms and help restore public trust in the technology.”
The regulator would also be able to issue codes of conduct that could compel companies to meet certain requirements, such as employing fact-checkers, particularly during election periods.
“The era of online company self-regulation is over,” said digital secretary Jeremy Wright, adding that he wants the department to be part of the solution.
“Those who fail to do this will face tough action,” he declared.