Parliamentary Committee Publishes Proposals To Tackle Online Harm And End Self Regulation

By Gabriel Princewill

The Joint Committee on the controversial Draft Online Safety Bill (OSB), set up to help establish the regulatory framework under which Ofcom will tackle “harmful” content online, has today published its long-awaited report.

MPs and peers, who welcome a more comprehensive approach to tackling illegal content including hate speech, argue that regulatory guidance from a public body will “provide an additional safeguard for freedom of expression in how providers fulfil this requirement”.

The committee broadly welcomed the government’s objective to supersede the industry’s self-regulating rules by enforcing compliance with a set of rules intended to hold tech giants accountable for the content they spread and monetise.

It intends to do this through a series of codes of practice drawn up in conjunction with the media regulator, Ofcom, empowering the watchdog with a major new oversight and enforcement role over internet content.

The draft Online Safety Bill (OSB), which will be enforced by Ofcom through a new Code of Practice (CoP), aims to address the pitfalls of the internet, but equally faces challenges in doing so effectively.

Embodied in its chief plans are multiple goals, including preventing the spread of fake news, harmful misinformation and conspiracy theories, bullying and terrorism, and safeguarding the interests of children, ensuring a safe internet for all whilst maintaining the right to free expression within reasonable bounds.

Regulating the internet in relation to fake news will necessitate a process of verifying published content and demonstrating where it is false, as well as a credible body to ascertain the facts of published news.

The same goes for misinformation, which at the best of times can be subjective, though ascertaining factual information should not be difficult for intelligent and objective individuals. The validation process will be an interesting one to observe for so noble a goal as the one purported.

Amongst the proposals, yet to be consolidated and finalised, is for Ofcom to set the standards by which big tech will be held accountable. The regulator’s powers to investigate, audit and fine the companies should be increased.

Mandatory Codes Of Practice

Ofcom will also be expected to draw up mandatory Codes of Practice for internet service providers, and to introduce additional Codes as new features or problem areas arise, so the legislation doesn’t become outdated as technology develops.

The broadcasting regulator will also be expected to require service providers to conduct internal risk assessments recording reasonably foreseeable threats to user safety, including the potential harmful impact of algorithms, not just content.

The new regulatory regime will contain robust protections for freedom of expression, acknowledging the fundamental role of journalism in a democratic society like the UK.

It will also include tackling harmful advertising, such as scam adverts, with paid-for advertising to be covered by the Bill.

Clear Definition Of Illegal Content

The Committee also believes the Bill should be clearer about what is specifically illegal online, and concurs with the Law Commission’s recommendations on adding new criminal offences to the Bill.

They include making cyberflashing illegal, and making it illegal to deliberately send flashing images to people with photosensitive epilepsy with the intention of inducing a seizure (known as Zach’s Law).

They also include imposing a duty on pornography sites to keep children off them, regardless of whether they host user-to-user content.

Content or activity promoting self-harm would also be made illegal, as it already is for suicide.

Under the plans, online content creators will have to have a registered ID and undergo a verification process to authenticate their identity.

Ombudsman

UK lawmakers further suggest individual internet users should be able to make complaints to an ombudsman when platforms fail to comply with the new law, and recommend that regulated platforms be required to have a senior manager at board level, or reporting to the board, who is designated the “Safety Controller”.

“In that role they would be made liable for a new offence: The failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users,” the committee suggests.

Companies that fail to comply with the rules could face fines of up to £18m, or 10% of their annual global turnover, whichever is higher. It is unclear, however, how this might apply to smaller sites.

End Of Self Regulation

In a statement accompanying the report, the chair of the Joint Committee on the Draft Online Safety Bill, Damian Collins (pictured), said: “The Committee were unanimous in their conclusion that we need to call time on the Wild West online. What’s illegal offline should be regulated online. For too long, big tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life.

“The Committee has set out recommendations to bring more offences clearly within the scope of the Online Safety Bill, give Ofcom the power in law to set minimum safety standards for the services they will regulate, and to take enforcement action against companies if they don’t comply.

“The era of self-regulation for big tech has come to an end. The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.”
