Ofcom Tells Tech Firms To Prepare For Regulation Geared Towards Safer Online Platforms

By Charlie Carmichael

Ofcom said today that it will focus on getting the ‘first phase’ of the new regulation up and running – protecting users from illegal content harms, including child sexual exploitation and abuse, and terrorist content.

This will include a draft Code of Practice on illegal content harms, explaining how services can comply with their duties to tackle them, and draft guidance on how Ofcom expects services to assess the risk of individuals coming across illegal content on their services and the associated harms.

Ofcom added that it will also publish a sector-wide risk assessment which will include risk profiles for different kinds of services that fall in scope of the regime.

Companies will have three months to complete their risk assessments related to illegal content, and must be ready to comply with their duties in this area from mid-2024, once the Code of Practice has been laid before Parliament.

Companies which run these sites or apps must be ready, as soon as the regulator’s first set of powers comes into force in early 2023, to explain their existing safety systems to Ofcom and, importantly, how they plan to develop them.

Ofcom said it will expect companies to be open about the risks they face and the steps they are taking to address them. The regulator will want to know how companies have evaluated those measures, and what more they might consider doing to keep users safe. It will also seek to understand users’ attitudes to those services, and consider evidence from civil-society organisations, researchers and expert bodies.

Platforms that fail to take appropriate steps to protect users from significant harm will be compelled to do so or face punitive measures.

Content that is harmful to children, and priority content that is legal but harmful to adults, could appear on social media services such as Instagram, Facebook and YouTube, including via third parties that may share content across those networks.

Ofcom said it would like to hear from companies that are likely to fall within the scope of the regime, as well as other groups and organisations with expertise in this area.

Video-Sharing Platforms

Ofcom said it will publish its first report on how video-sharing platforms such as TikTok, Snapchat, Twitch and OnlyFans are working to tackle harm, and will undertake and publish research on the drivers and prevalence of some of the most serious online harms in scope of the regime.

The focus of the Bill is not on Ofcom moderating individual pieces of content, but on the tech companies assessing risks of harm to their users and putting in place systems and processes to keep them safer online.

Ofcom said it will have powers to demand information from tech companies on how they deal with harms, and to take enforcement action when they fail to comply with their duties. The Bill will also ensure the tech companies are more transparent and can be held to account for their actions.

The regulator made it clear that it would not censor online content, because the Bill does not give Ofcom powers to moderate or respond to individuals’ complaints about individual pieces of content. The Government recognises that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, the regime will tackle the causes by ensuring companies design their services with safety in mind from the start.

Ofcom will examine whether companies are doing enough to protect their users from illegal content and content that is harmful to children, while recognising that no service in which users freely communicate and share content can be entirely risk-free. Under the draft laws, the duties placed on in-scope online services are limited by what is proportionate and technically feasible.

Interestingly, Ofcom said companies can host content that is legal but harmful to adults, provided they have clear terms of service. Under the Bill, services with the highest reach – known as ‘Category 1 services’ – must assess risks associated with certain types of legal content that may be harmful to adults.

Ofcom said this plan is based on its current understanding of the Bill as it stands, and the likely timing for the passage of legislation (including secondary legislation) under the Bill. At the time of publication, the Bill has passed Committee stage in the House of Commons and remains subject to amendment as it passes through the rest of the Parliamentary process.

All services in scope of the Bill have a duty to protect users from illegal content. They must assess, among other things, the risk of individuals coming across illegal content on their platforms, and how that risk is affected by the design of their service. Tech firms must also establish whether children, in significant numbers, can access any part of their service. Companies must put in place measures to mitigate and manage the risks of illegal content and, if they’re likely to be accessed by children, material which is harmful to children, as well as allowing their users to report content and complain.

Ofcom said it will consult publicly on the Code of Practice and guidance and expects to finalise them within a year, at which point firms should be ready to comply with these duties. The regulator expects to set out draft Codes of Practice and risk guidance on protecting adults from legal harms in early 2024.
