Joint Committee’s extensive recommendations to improve Online Safety Bill
In case you missed it, the Joint Committee responsible for reviewing the controversial Online Safety Bill recently published its recommendations.
The Joint Committee has undertaken the gargantuan task of listening to the concerns raised by a wide cross section of society, considering how the Online Safety Bill can be improved and making specific recommendations to amend the Bill – all within a tight timeframe.
Its starting point, the current Online Safety Bill, was frankly a bit of a hodgepodge, making the task all the more difficult. However, the Joint Committee has recommended extensive, important, pragmatic and detailed changes to the Bill that should make it fit for purpose, provided those recommendations are taken forward.
The Joint Committee has made extensive recommendations, including bringing very harmful paid advertisements within the scope of the Online Safety Bill, which will be regulated by Ofcom. It has also made recommendations to improve the democratic credentials of the new regime, diluting the Secretary of State’s (and the government of the day’s) power to amend the rules on a whim, while also recommending that Ofcom be given even greater powers. It has underscored the importance of protecting the freedom of the press. However, I very much hope that the views of ordinary citizens will not be eclipsed by journalists and news outlets pushing their own agendas. The press continues to play an important role in our democratic society, but so do ordinary people!
More commentary will follow, but in the meantime, the Joint Committee’s recommendations are summarised as follows:
- Ofcom should draw up mandatory Codes of Practice for internet service providers. For example, it should write a Code of Practice on risk areas like child exploitation and terrorism. It should also be able to introduce additional Codes as new features or problem areas arise, so the legislation doesn’t become outdated as technology develops.
- Ofcom should require service providers to conduct internal risk assessments recording reasonably foreseeable threats to user safety, including the potential harmful impact of algorithms, not just content.
- The new regulatory regime must contain robust protections for freedom of expression, including an automatic exemption for recognised news publishers, and acknowledge that journalism and public interest speech are fundamental to democracy. [Note: while I agree with this principle, I do not want a two-tier system in which ordinary people are given significantly weaker rights than journalists – including the right to respond or provide a counter-argument via social media to articles published by journalists!]
- Paid-for advertising should be brought within the scope of the Bill, with the aim of tackling harmful advertising such as scam adverts and fraud.
- Service providers should be required to create an Online Safety Policy for users to agree to, similar to their terms and conditions of service.
- Big tech companies must face sanctions if they fail to comply with the Online Safety Act, once it is passed, and with Ofcom as the UK regulator.
- Ofcom’s powers to investigate, audit and fine the companies found in breach of the new rules should be increased.
The Committee also believes the Bill should be clearer about what specifically is illegal online, and that it should not be up to the tech companies to determine this. It therefore agrees with the Law Commission’s recommendations to add new criminal offences to the Bill.
They recommend that:
- Cyberflashing be made illegal.
- Deliberately sending flashing images to people with photosensitive epilepsy with the intention of inducing a seizure be made illegal (known as Zach’s law).
- Pornography sites be subject to legal duties to keep children off them, regardless of whether they host user-to-user content.
- Content or activity promoting self-harm be made illegal, as it already is for suicide.
Further, the report recommends that individual users should be able to make complaints to an ombudsman when platforms fail to comply with the new law. It also recommends that a senior manager at board level, or reporting to the board, be designated the “Safety Controller.” In that role, they would be liable for a new offence: failure to comply with their obligations as regulated service providers where there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.
Side note
Another committee of MPs is still looking into the same Online Safety Bill, and will report later with its own findings and recommendations. I only hope that doesn’t muddy the waters if they end up suggesting a different approach to the definition of ‘online harms’, for example.
“The Committee has set out recommendations to bring more offences clearly within the scope of the Online Safety Bill, give Ofcom the power in law to set minimum safety standards for the services they will regulate, and to take enforcement action against companies if they don’t comply.”
– Damian Collins, MP, Chair of the Joint Committee on the draft Online Safety Bill