New Rules Impose Duty On Tech Firms To Protect Children From Social Media Harms

By Charlotte Webster

New rules are to be introduced for tech firms that allow users to post their own content or interact with others via social media.

Under the new rules, firms failing to protect people will face fines of up to ten per cent of turnover or the blocking of their sites, and the British government will reserve the power to hold senior managers liable.

All platforms will also have a duty of care to protect children using their services, but the laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech.

The legislation will define what harmful content is in scope, tackling illegal activity online and preventing children from being exposed to inappropriate material. It will also address other types of harm that spread online, from dangerous misinformation spreading lies about vaccines to destructive pro-anorexia content. The government's approach will empower people to manage their online safety and ensure that these companies will not be able to arbitrarily remove controversial viewpoints.

The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures. Alongside the objective of tackling harmful content, the legislation will protect freedom of expression and uphold media freedom.

Companies will be required to have accessible and effective complaints mechanisms so that users can object if they feel their content has been removed unfairly. British criminal law must also be fit for the digital age and provide the protections that victims deserve. The Law Commission is currently reviewing whether new offences are necessary to deal with emerging issues such as cyber-flashing and ‘pile-on’ harassment, and the government will carefully consider using the online harms legislation to bring the Commission's recommendations into law.

Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content.

Law Commission

The government is also progressing work with the Law Commission on whether the promotion of self-harm should be made illegal. A duty will be imposed on tech platforms to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet, with better protections in place to reduce the risk of harm.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal, but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom, the regulator, will be empowered to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. The regulator will also have the power to block non-compliant services from being accessed in the UK.

The legislation includes provisions to impose criminal sanctions on senior managers. This power would be introduced by Parliament via secondary legislation; reserving it in this way follows similar approaches in other sectors, such as financial services regulation.

Digital Secretary Oliver Dowden said:

“I’m unashamedly pro tech but that can’t mean a tech free-for-all. Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation. We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.

This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”

Home Secretary Priti Patel said:

“We are giving internet users the protection they deserve and are working with companies to tackle some of the abuses happening on the web.

We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms. Tech companies must put public safety first or face the consequences.”

Dame Melanie Dawes, Ofcom’s Chief Executive, said:

“We’re really pleased to take on this new role, which will build on our experience as a media regulator. Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression. We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with Parliament as it finalises the plans.”

Richard Pursey, Group CEO & Co-Founder of safety technology company SafeToNet, said:

“Online safety is a fundamental human right. That is why we are so proud to support the UK Government, who are leading the way in tackling online harm. The forthcoming legislation marks a pivotal moment for online safety, one that we hope will mean social platforms are made safe by design. This action can’t come soon enough: as our lives continue to become more digital, we and our children are increasingly exposed to online threats. The UK safety-tech industry is leading the way, with SafeToNet playing its part to make online harms a thing of the past.”

The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people’s rights online and empower adult users to keep themselves safe, while preventing companies from arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses.

Scope

The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly interact with others online.

It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.

The legislation will include safeguards for freedom of expression and pluralism online – protecting people’s rights to participate in society and engage in robust debate.

Online journalism from news publishers’ websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is re-shared on social media platforms.

Categorised approach

Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.

All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are not accessing platforms which are not suitable for them.

The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty of care in codes of practice.

A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.
