
HM Coroner vs the Online Safety Bill

The majority of HM Coroner’s recommendations are either already addressed in the Online Safety Bill or the subject of proposed amendments, or would simply conflict with existing data protection and privacy rights, rendering them unsuitable to pursue in the absence of a wider review; the use of advertising is the only real outlier. While algorithmic accountability is identified in the Bill as merely one of the matters to be taken into account when conducting risk assessments and complying with safety duties, we anticipate that this (together with algorithmic transparency) will be an area of early focus for the intended online safety regulator, Ofcom.
— Handley Gill

On 13 October 2022, HM Coroner Andrew Walker, the Coroner to the Molly Russell inquest, issued a report pursuant to Regulation 28 of the Coroners (Investigations) Regulations 2013 to the government, Pinterest, Meta (the operator of Facebook and Instagram), Snap (the operator of Snapchat) and Twitter. The report detailed his concerns that circumstances creating a risk of further deaths continued to prevail and required action to eliminate or reduce that risk, and made recommendations in accordance with Schedule 5, paragraph 7 of the Coroners and Justice Act 2009. The recipients of the report are obliged under Regulation 29 to write to HM Coroner by 08 December 2022 detailing what action has been or is proposed to be taken, or explaining why no action is proposed. HM Coroner must then provide a copy of those responses to the Chief Coroner, any interested persons and any others to whom they may be useful or of interest; this could include Ofcom, for example, which is intended to become the regulator of so-called online harms. The Chief Coroner can determine to publish those responses.

Molly Russell killed herself in November 2017 at the age of 14 in circumstances where - HM Coroner concluded - she had “died from an act of self-harm whilst suffering from depression and the negative effects of on-line content” and that the latter had “contributed to her death in a more than minimal way”.

HM Coroner recommended that:

1. The Government consider reviewing the provision of internet platforms to children, with reference to:

a. harmful on-line content;

b. separate platforms for adults and children;

c. verification of age before joining the platform;

d. provision of age specific content;

e. the use of algorithms to provide content;

f. the use of advertising;

g. parental guardian or carer control; and,

h. access to material viewed by a child and retention of material viewed by a child.

2. The Government consider establishing an independent regulatory body to monitor on-line platform content.

3. The Government consider enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful on-line content and the effective regulation of harmful on-line content.

4. Social media platforms give consideration to self-regulation addressing the matters highlighted to the Government.

Children and social media

By virtue of Article 8 UK GDPR, it is already the case that only those aged 13 or over in the UK can consent to the processing of their personal data in connection with online services, necessitating parental consent for anyone under that age (where consent is the lawful basis for processing). Many platforms therefore ban under-13s from using their services in their terms and conditions, although the additional obligations imposed under the Bill on platforms likely to be used by children could see moves to increase the age limit for users. While age verification is not an explicit requirement imposed on platform providers under the Online Safety Bill (as introduced), it will be a necessary compliance tool to meet the varying requirements imposed to protect children (as opposed to adults) from certain types of content, unless platforms choose to take a ‘lowest common denominator’ approach to compliance. This was explicitly recognised by the government when the Secretary of State at the Department for Digital, Culture, Media and Sport, Michelle Donelan MP, stated in an interview with The Times that “the reality is that they [social media companies] are going to have to assure themselves they know the age of their users”. The proposals in the Online Safety Bill therefore already address HM Coroner’s recommendations in relation to “separate platforms for adults and children”, “verification of age before joining the platform”, and “provision of age specific content”.

The Online Safety Bill also already contains measures to address harmful content which may be encountered by children on services likely to be accessed by them. These include duties to conduct children’s risk assessments (clause 10); to take appropriate and proportionate measures to mitigate and manage the risk of harm to children (clause 11(1)(a)); to mitigate the impact of harm to children using the service (clause 11(2)(b)); to take measures to prevent all children from encountering certain types of content which could be harmful to children and which is identified by the Secretary of State in forthcoming regulations (clause 11(3)(a)); to protect children in certain age groups from other content harmful to children (clause 11(3)(b)); to include appropriate provisions in their terms of service detailing how the duties will be met; to enforce the provisions of the terms of service consistently (clause 11(6)); and to make provision for content to be easily reported (clause 17(4)), in addition to wider obligations. While the Bill does not stipulate that there should be separate platforms for children and adults, in practice the effect of the provisions will be either to restrict the content that children are able to see by requiring age verification, or to restrict the content that all users are able to see, delivering a child-friendly platform.

While it is not yet clear what content could be designated in regulations laid by the Secretary of State, the government has now indicated that it intends to criminalise content encouraging or assisting self-harm, and this would therefore fall within the illegal content duties applicable to services whether they are used by children or only by adults. An amendment to this effect (NC16) had been tabled by a cross-party group of MPs led by David Davis MP to insert a new provision into the Suicide Act 1961 creating a new communications offence. As we anticipated, the government has not itself tabled an amendment to achieve the announced change, and it therefore appears that it will accept the cross-party amendment.

These measures would therefore address HM Coroner’s recommendations in relation to “harmful on-line content”, “separate platforms for adults and children”, and “the protection of children from the effects of harmful on-line content and the effective regulation of harmful on-line content”.

Algorithmic accountability

The obligation to conduct a children’s risk assessment and the safety duties protecting children require all aspects of a service to be considered, including algorithms (clauses 10(6)(b) and 11(4)(b)). To date, no further provisions addressing algorithms have been introduced either at Committee stage or during the ongoing Report stage of the Bill in the House of Commons.

In its Roadmap to Regulation, intended to support affected organisations to prepare to comply with the Bill, Ofcom highlighted that it would “want to know how firms respond to risks of harm and consider any trade-offs with other objectives such as user engagement. In our view, these issues should be discussed regularly by senior decision makers, and owned at the most senior levels”. And in an interview with the FT, Ofcom’s Chief Executive, Dame Melanie Dawes, indicated that Ofcom intended to examine algorithms that “amplify emotional reaction” to news and potentially lead users “into an echo chamber” and that this could include “requiring more transparency or giving the regulator the ability to shine a light [on] how these feeds and algorithms work”.

Advertising

As introduced, the Online Safety Bill only sought to tackle fraudulent advertising (Chapter 5). In its Roadmap to Regulation, Ofcom highlighted “the risk of illegal financial promotions delivered via user-generated content or paid-for fraudulent advertising on in-scope online services” as a priority for it and its fellow members of the Digital Regulation Cooperation Forum (DRCF): the Information Commissioner, the Competition and Markets Authority and the Financial Conduct Authority.

While Ofcom already has a duty, under s.11 Communications Act 2003, to promote media literacy, an amendment (NC37) has been tabled to the Bill by the Liberal Democrats to expand the scope of that duty, in so far as it relates to services covered by the Online Safety Bill, to include the development and deployment of technologies to signify adverts.

Parental control

The Bill grants powers to “affected persons” to take steps, such as complaining about content, and these include the parent or other adult with responsibility for the care of a child user (clause 17(6)(c)). However, clause 19(3) requires that all services have regard to data protection and privacy obligations when choosing and implementing their policies and processes and, since child users aged 13 and over do not require parental consent in accordance with Article 8(1) UK GDPR, these provisions would conflict with any requirement to grant parents or guardians oversight or other control over the accounts of their children. As recently as 25 November 2022, the Information Commissioner and Ofcom announced how they would work to “achieve maximum alignment and consistency between the data protection and online safety regimes” and, recognising the inherent tension in some aspects between the competing obligations, asserted that “Where there are tensions between privacy and safety objectives, we will provide clarity on how compliance can be achieved with both regimes”.

Retention of, and access to, data

The Bill does not make provision for records to be retained regarding the material viewed by any user, and no amendments have been tabled in the Commons to date to secure the introduction of such provisions. As in relation to parental control, such measures would conflict with the data protection rights and, potentially, reasonable expectations of privacy of child users where data was requested by third parties, even parents, guardians or other carers. The Information Commissioner’s guidance makes clear that “You should therefore only allow parents to exercise these rights on behalf of a child if the child authorises them to do so, when the child does not have sufficient understanding to exercise the rights him or herself, or when it is evident that this is in the best interests of the child. This applies in all circumstances, including in an online context where the original consent for processing was given by the person with parental responsibility rather than the child”. Young people are already entitled under data protection legislation to request access to copies of their own personal data, which would include any records retained of material they had viewed. In practice, extensive records are usually retained by platforms for advertising and content recommendation purposes.

As relates to obtaining access to data in the context of a criminal investigation, the October 2019 ‘Agreement between the Government of the United Kingdom of Great Britain and Northern Ireland and the Government of the United States of America on Access to Electronic Data for the Purpose of Countering Serious Crime’ (‘Data Access Agreement’) finally came into force on 03 October 2022, although this would only provide assistance in some cases.

Regulator

Clause 77 of the Bill would expand Ofcom’s duties as set out under s.3 Communications Act 2003 to include “the adequate protection of citizens from harm presented by content on regulated services, through the appropriate use by providers of such services of systems and processes designed to reduce the risk of such harm”, thereby establishing it as the independent regulator of content on online platforms.

Self-regulation

HM Coroner recommended that platforms look to establish self-regulation. Platforms would no doubt respond that, by deploying content moderation procedures and stipulating terms and conditions whose breach can lead to enforcement action such as denying amplification and ultimately banning accounts, they already engage in self-regulation. Some platforms have already gone further; Meta, in particular, has established the Oversight Board to review and make recommendations as to policies as well as individual content decisions. Furthermore, in circumstances where the government has effectively rejected self-regulation of online platforms, and the prospective regulator has issued guidance indicating that it expects affected organisations to begin preparing for compliance, it would be an expensive and unnecessary distraction to invest time and resources in seeking to develop a parallel system of regulation. That is not to say that there isn’t room for platforms to work together to develop codes of practice, systems and even technologies to meet the forthcoming obligations.

Find out more about our Online Safety, Online Harms, Content Moderation and Content Regulation services.