Under the Online Safety Bill, the UK government wants to be able to fine companies up to £18 million or ten percent of global annual turnover if they fail to remove harmful content, or fail to remove it in time. The regulator would also be able to block non-compliant sites.
With the bill, the British government wants to 'protect children' and combat the 'worst abuses on social media, such as racist hate crimes'. The law should also give regulators more powers to stop financial fraud on social media and dating apps. The government states that the law guarantees freedom of expression and freedom of the press, and that it 'will not lead to unnecessary censorship of sites and platforms'.
The proposal requires all sites and platforms covered by the new rules to take action to prevent illegal abuse. The government says that companies must take 'rapid and effective' action against hate crimes, intimidation and threats directed at individuals. The sites must also comply with their own standards and conditions. They also need to consider the risks their sites pose to 'the youngest and most vulnerable people' and protect children from 'inappropriate messages and harmful activities'.
The bill is also set to include provisions requiring companies to report child abuse images. This should, among other things, allow police forces to investigate cases of child abuse more effectively.
If tech companies fall seriously short of the new law's requirements, senior executives could be held personally liable for failing to comply with Ofcom's information requests. This is not yet included in the current legislative proposal, but could be added later if the current measures prove insufficient.
The government stresses that the bill will safeguard freedom of expression and that pluralistic online conversations will remain possible. To that end, platforms and sites must put in place safeguards that protect freedom of expression. The government cites the example of human moderators who, in complex cases where context matters, must decide whether something should be removed. In addition, people must be able to appeal effectively against decisions made by sites and platforms. Users will also be able to complain to Ofcom about a platform's decision.
So-called Category 1 services, the largest platforms, must also publish 'up-to-date reports' on their impact on freedom of expression. In these reports they must demonstrate that they have taken steps to counter any negative effects on freedom of expression.
These Category 1 sites and platforms are also required by law to protect 'democratically important' content. They must not discriminate against particular political viewpoints and must treat all political opinions equally. Their terms regarding political discussion must be clearly stated, and Ofcom will ensure that they comply with their own terms.
Content on journalistic sites is excluded from the law, including comments under news articles. Articles shared on sites and platforms are also exempt, and Category 1 services have the additional responsibility of ensuring that UK users can access journalistic content shared on their platforms. Journalists must be able to appeal rapidly against any removal of their posts. The bill makes no distinction between citizen journalism and professional journalism.
Companies and services that do not comply with the new legislation risk fines of up to £18 million, around 21 million euros, or 10 percent of annual worldwide turnover, whichever is higher. The full bill will be published later on Wednesday and will then go to the British parliament.