LEGAL, REGULATORY & COMPLIANCE CONSULTANTS

Handley Gill Limited

Our expert consultants at Handley Gill share their knowledge and advice on emerging data protection, privacy, content regulation, reputation management, cyber security, and information access issues in our blog.

Washed-up or fallen down the plughole?

While a number of significant Conservative commitments had to be dropped as a consequence of the snap General Election called on 22 May 2024, which gave Parliament just a few days to reach agreement in the ‘wash up’ prior to prorogation on 24 May, several measures or reports affecting digital markets, online safety, artificial intelligence (AI) and human rights were passed or published prior to Parliament’s dissolution. We anticipate that a number of the remaining provisions will either become a focus of the main political parties’ manifesto pledges or will find favour in the new Parliamentary session.
— Handley Gill Limited

Following the announcement of the snap General Election to take place on 04 July 2024, Parliament was prorogued with effect from 24 May 2024 (meaning Parliamentary business was suspended thereafter) and dissolved with effect from 30 May 2024.

The brief period between the announcement of the election on 22 May and prorogation is known as ‘wash up’, when political parties must negotiate to pass outstanding Bills, or parts of them, or the Bills fall. Prorogation also brings an end to the work of the various Parliamentary Committees. We consider which Bills have been washed up and which have fallen in the fields of cyber security, data protection, online safety, artificial intelligence (AI), digital markets, content regulation, reputation management, open justice, access to information, human rights and ESG, as well as which Parliamentary Committee reports were either rushed out or dropped.

WASHED UP

Media Act 2024

Digital Markets, Competition and Consumers Act 2024

Victims and Prisoners Act 2024

Joint Committee on Human Rights inquiry into ‘Human Rights and the proposal for a “Hillsborough Law”’

Education Committee inquiry into ‘Screentime: impact on education and wellbeing’

Commons Science, Innovation & Technology Committee inquiry into ‘Governance of artificial intelligence’

FALLEN

Data Protection and Digital Information Bill

Strategic Litigation Against Public Participation Bill

Criminal Justice Bill

Artificial Intelligence Regulation Bill

Commons Science, Innovation & Technology Committee inquiry into ‘Cyber resilience of the UK’s critical national infrastructure’

Commons Women & Equalities Committee inquiry into ‘non-consensual intimate image abuse’

European Affairs Committee inquiry into ‘UK-EU data adequacy’

Washed Up

Media Act 2024

The House of Lords completed its examination of the Media Bill on 23 May 2024, and Royal Assent was granted the following day enacting the Media Act 2024.

The Media Act addresses a range of issues, including by: beginning to rationalise the regulation of linear and video on demand (VOD) services; extending the application of principles applicable to linear broadcasting services to on demand and internet services; changing the nature of Channel 4 from a commissioning and broadcasting entity to a programme maker in its own right; extending Ofcom’s powers in relation to UK-based on demand programme services to certain services based outside the UK; addressing the regulation of radio and radio selection services; and, removing the threat of one-way costs shifting from news publishers which are not members of an approved regulator.

The Act relaxes the obligations on existing public service broadcasters by enabling output made available on video on demand services to be taken into account in addition to linear output, provided it is made available free of charge (notwithstanding the television licence fee) and made available for at least 30 days (other than in relation to sports coverage), which is achieved by section 1 amending parts of section 264 Communications Act 2003. The Act also introduces an obligation on Ofcom to ensure sufficient content is in, or mainly in, a recognised regional or minority language. However, the Act expands the scope of services capable of being designated a relevant service from the traditional linear broadcasters of BBC services, Channels 3, 4 and 5, Welsh Authority services and teletext to include digital services, on demand programme services, non-UK on demand programme services and other services, or parts of services, whose principal purpose is the provision of programmes over which there is general control over curation, provided over the internet to people in the UK.

The Act removes the programming quota of 25% for independent productions contained within section 277 Communications Act 2003 and instead makes provision for the Secretary of State to make regulations in this regard (section 8 Media Act 2024), potentially also enabling the calculation of the fulfilment of quotas to be altered. The Act grants the Secretary of State the power to make regulations imposing additional quotas on qualifying audiovisual content.

In relation to Channel 4, the quotas imposed to include schools programmes under section 296 Communications Act 2003 are abolished by section 16 of the Media Act. The Act amends the Broadcasting Act 1990, which established Channel 4, to include a new obligation on the Channel 4 Television Corporation to act in a manner which promotes its long term financial sustainability. Section 31 Media Act 2024 removes the restriction included at section 295 Communications Act 2003 which prevented Channel 4 from being involved in the making of programmes for its own channel.

The Act also extends the listed events regime, for sporting or other events of national interest, under Part IV Broadcasting Act 1996 which restricts the exclusive grant of rights to any one television programme provider to reception or internet access (sections 21-23 Media Act 2024).

Ofcom’s investigation and enforcement powers are expanded by sections 18-19 and 24 Media Act 2024, including the power to make information notices.

The obligations on Ofcom to secure a public teletext service under sections 218-223 Communications Act 2003 are abolished by section 26 Media Act 2024.

Obligations of prominence in relation to television selector services are extended by the Act to include designated internet programme services (see new section 362AJ Communications Act 2003) and to require them to be carried (see new section 362AK Communications Act 2003), and Ofcom is required to issue a related code of practice. Provision is also made for the resolution of disputes.

At present the provisions of Part 4A Communications Act 2003, which govern the regulation of on demand programme services, are restricted to services whose head office is in the UK. This excludes services such as Netflix from its remit, as it is headquartered in the Netherlands. Part 4 Media Act 2024 amends the Communications Act 2003, however, to provide for designated non-UK Tier 1 services to be regulated in the same way as those based in the UK. New obligations are also imposed on Ofcom, including to conduct a review of audience protection measures adopted by on demand programme services and designated Tier 1 services in respect of harmful content, through mechanisms such as age rating or other classification systems, content warnings, parental controls and/or age assurance measures (section 38 Media Act 2024).

Part 5 of the Act addresses the regulation of radio services including the introduction of powers for the Secretary of State to mandate local news and information to be included in local digital radio services. Part 6 governs radio selection services.

Importantly for newspapers and other relevant publishers, section 40 Crime and Courts Act 2013, which had not yet been brought into force but would have obliged publishers which were not members of a regulator approved by the Press Recognition Panel to pay the costs of certain legal claims brought against them even if they were successful, has been repealed by section 50 Media Act 2024, which will come into force on 24 July 2024. The only approved regulator was Impress; the Independent Press Standards Organisation (IPSO), of which many national newspapers are members, is not recognised, and a number of significant national publishers, including The Guardian and The Financial Times, have eschewed any form of external regulation.

Digital Markets, Competition & Consumers Act 2024

The Digital Markets, Competition and Consumers Bill had already reached the ‘ping pong’ stage of its passage when the general election was called and, with some final amendments by members of the Commons, the Bill was passed and received Royal Assent on 24 May 2024, becoming the Digital Markets, Competition and Consumers Act 2024.

The Act grants wider powers to the Competition and Markets Authority (CMA), including in relation to companies outside the UK which are carrying out business in relation to digital activities in the UK or which have a significant number of UK users and which it designates as having strategic market status as a consequence of the CMA estimating that it has UK turnover in excess of £1 billion or global turnover in excess of £25 billion, a position of strategic significance and substantial and entrenched market power (Part 1 Chapter 2 DMCCA 2024). Those entities subject to this new digital markets regime can be subjected to specific permitted conduct requirements by the CMA (section 19 DMCCA 2024), including to have effective processes for handling complaints by and disputes with users or potential users and to present to users or potential users any options or default settings in relation to the relevant digital activity in a way that allows those users or potential users to make informed and effective decisions in their own best interests about those options or settings. Measures directed to preventing a designated undertaking from using data unfairly are also permitted.

As such, there are clear overlaps between the powers of the CMA and those of Ofcom in implementing and enforcing its powers under the Online Safety Act 2023 and of the Information Commissioner’s Office under the UK GDPR, Data Protection Act 2018 and PECR. Indeed, there are provisions in the Act obliging the CMA to consult with the ICO or Ofcom where it proposes to exercise a relevant regulatory digital markets function and there are concurrent functions (in relation to Ofcom) or there is likely to be a material adverse effect on the ability of the ICO to exercise its functions (section 107 DMCCA 2024).

The CMA can also make pro-competition interventions (PCIs) in relation to designated undertakings to address adverse effects on competition relating to digital activities (section 46 DMCCA 2024).

The obligations on affected entities are potentially broader than under other regimes, as the Act includes a duty to preserve relevant information when it is known or suspected that a breach investigation or PCI is being or is likely to be carried out (section 80 DMCCA 2024).

Affected entities must nominate an officer with responsibility for securing and monitoring compliance and co-operating with the CMA (section 83 DMCCA 2024), and submit compliance reports to the CMA (section 84 DMCCA 2024).

The failure to comply with a conduct requirement can result in penalties on a fixed, daily or combined basis with fixed penalties of up to 10% of the total turnover of the undertaking or group, and daily penalties of 5% of the total daily turnover of the undertaking or group (section 86 DMCCA 2024). Penalties are also available for breaching investigative requirements, and offences are created including of giving false or misleading information.

The CMA is afforded absolute privilege for the purposes of the law of defamation in relation to anything done in the exercise of its functions (section 112 DMCCA 2024). The Act also extends the protected disclosure regime under the Public Interest Disclosure (Prescribed Persons) Order 2014 (S.I.2014/2418) to compliance with the digital markets regime.

On the day the Bill received Royal Assent, the CMA immediately published a consultation on its draft guidance for the digital markets competition regime.

The Act includes the so-called ‘Telegraph Takeover’ provisions, establishing at section 130 DMCCA 2024 and Schedule 7 powers to prevent foreign powers from gaining control or influence over newspaper enterprises. This provides for the Secretary of State to issue a foreign state intervention notice where a merger with a newspaper enterprise would be created which would involve a foreign power being able to control or influence the policy of – or increase their influence over the policy of - the person carrying on the newspaper enterprise, which obliges the CMA to prepare a report on whether a foreign state newspaper merger situation has been created or is in progress or contemplation. Where the CMA reports that a foreign state newspaper merger situation has been created or is in progress or contemplation, the Secretary of State must make an order containing measures considered reasonable and practicable to reverse or prevent the creation of the foreign state newspaper merger situation.

The Act expands the CMA’s wider powers and those of the courts, including in connection with infringements of consumer protection laws which harm the collective interests of consumers (Part 3 DMCCA 2024).

Courts are granted power to make consumer protection orders on the application of relevant enforcement bodies, including interim or final online interface orders, requiring the removal or modification of content on an online interface, the disabling or restriction of access to an online interface, the display of a warning to consumers accessing an online interface and even to delete a domain name and transfer its registration to the public designated enforcer (sections 161 – 162 DMCCA 2024).  Provision is also made for the giving and enforcement of undertakings. The Secretary of State is permitted to designate private enforcers who are entitled to apply to the courts for an enforcement order provided that neither they nor an associated undertaking would directly benefit (section 177 DMCCA 2024).

The Act prohibits unfair commercial practices in relation to consumers (section 225 DMCCA 2024), identified as misleading actions or omissions, aggressive practices, contravention of the requirements of professional diligence, the omission of material information from an invitation to purchase and/or a range of practices such as: falsely claiming to be a signatory to a code of conduct; displaying a trust or quality mark without authorisation; falsely claiming a practice or product has been approved, endorsed or authorised when it has not, or where the terms of any such approval, endorsement or authorisation are not being complied with; promoting products in the media through editorial content which is not clearly identified; publishing fake or concealed incentivised consumer reviews; making persistent unwanted solicitations; and making direct appeals to children to buy, or to persuade their parents or other adults to buy, advertised products.

The Act grants consumers rights of redress in certain circumstances and makes provision for the Secretary of State to make regulations granting consumers further rights of redress to unwind contracts or to discounts or damages, and establishes the right to bring a civil claim for the purposes of enforcement.

Offences are also created in relation to engaging in unfair commercial practices (section 237 DMCCA 2024), and for some offences where the conduct arose as a result of a mistake or accident or a third party, a defence is established of due diligence and innocent publication (section 238 DMCCA 2024).

The Act also regulates subscription contracts and dictates the information and reminders required to be provided to consumers (Part 4 Chapter 2 DMCCA 2024). Part 4 Chapter 3 governs traders’ consumer savings schemes and implements obligations to establish trust or insurance arrangements in the event of insolvency. Part 4 Chapter 4 establishes mechanisms for ADR in relation to consumer contract disputes by accredited or exempt ADR providers.

Section 331 of the Act provides that while the Act does not require or authorise the processing of personal data in breach of the data protection legislation, the existence of the duty or power should be taken into account in determining the lawfulness of processing, i.e. determining whether Article 6(1)(c) UK GDPR applies in connection with processing necessary for compliance with a legal obligation to which the controller is subject.

Victims and Prisoners Act 2024

Tucked away at section 31 of the Victims and Prisoners Act 2024, which introduces a victims’ code and other support for victims including the appointment of a standing advocate for victims of major incidents, is an amendment to Article 17 UK GDPR which, when brought into force, will introduce an additional ground granting the data subject the right to the erasure of their personal data under the so-called right to be forgotten. The new ground will be exercisable where the individual was the subject of an allegation made by a third party who had either been convicted of one of various stalking and harassment offences or made the subject of a stalking protection order and the data controller has investigated the allegation and determined that no further action is appropriate. 

Joint Committee on Human Rights Inquiry into 'Human Rights & the Proposal for a Hillsborough Law'

While recognising that the Criminal Justice Bill, which subsequently fell, would have introduced a duty of candour on the police, following the conclusion of its inquiry the Committee warned that public authorities’ “institutional defensiveness”, resulting in a focus on the protection of reputation and avoidance of liability, remained a risk to the effectiveness of public investigations and inquiries, and therefore recommended the introduction of a statutory duty of candour backed by criminal sanctions. The Committee also called for the establishment of a statutory guarantee of proportionate funding for participants in inquests and inquiries to support compliance with obligations under Articles 2 and 3 ECHR.

In government there were several occasions when the Conservative party flirted with the idea of withdrawing from the jurisdiction of the European Court of Human Rights, which would necessitate withdrawal from the Council of Europe. While the Bill of Rights Bill, which was intended to reform the UK’s human rights law regime, was abandoned in 2023, the UK’s ongoing commitment to the European Convention on Human Rights could prove to be an electoral battleground.

Education Committee Inquiry into 'Screentime: Impact on Education and Wellbeing'

The Education Committee noted that during the course of its inquiry into the impact of screentime on the education and wellbeing of young people it had “heard no evidence to suggest that 13 is an appropriate age for children to understand the implications of allowing platforms access to their personal data online”, called on the next government to “consult on raising the age of digital consent” and suggested that it “should recommend 16 as a more appropriate age”. The so-called digital age of consent is currently set at 13 by Article 8 UK GDPR; under the EU GDPR, this was an issue on which member states were permitted to set their own age of between 13 and 16.

The Committee also called for stronger enforcement of the digital age of consent. Article 8(2) UK GDPR requires that data controllers, such as online platforms, “make reasonable efforts to verify in such cases that consent is given”. These obligations are supplemented by the standards set out in the ICO’s Children’s Code aka the Age Appropriate Design Code. The Committee suggested that the age assurance obligations imposed as part of the safety duties on relevant user to user and search services under the Online Safety Act 2023 should be extended to the verification of the digital age of consent.

Following calls by the mother of murdered schoolgirl Brianna Ghey, the Committee called on the next government to “work alongside Ofcom to consult on additional measures regarding smartphones for children under 16 years old, including the possibility of a total ban of smartphones (internet-enabled phones) for children under 16 or parental controls installed as default on phones for under 16s”.

Commons Science, Innovation & Technology Committee Inquiry into 'Governance of Artificial Intelligence (AI)'

In its report on ‘Governance of artificial intelligence (AI)’ published on 28 May 2024, the Committee called on the new government to “stand ready to introduce new AI-specific legislation”, to identify the triggers that will justify the introduction of legislation and to “commit to laying before Parliament quarterly reviews of the efficacy of its current approach to AI regulation”. In particular, the Committee called for a domestic framework that sufficiently addressed its “Twelve Challenges of AI Governance”: bias; privacy; misrepresentation; access to data; access to compute; black box; open-source; intellectual property (IP) and copyright; liability; employment; international co-ordination; and, existential.  

The report called on regulators to focus on “here and now impacts” rather than existential risks, which it suggested should be the domain of the AI Safety Institute and the UK’s national security apparatus, with any change in the assessment of the likelihood or impact of such risks being addressed via international fora. While calling for regulators to strike an appropriate balance between privacy rights and the public interest in the benefits of specific AI models, the report also called for sanctions, up to and including prohibition, on those models which fail to meet legal obligations in this regard.

The report calls for AI developers and deployers to be accountable through obligations of routine disclosure of steps taken to account for bias in datasets and outputs. The Committee suggested that this should be accompanied by the publication by the government of guidance on liabilities for AI harms and legal infringements by developers, intermediaries and deployers.

While encouraging continued international liaison, the Committee rejected “a global AI governance regime”, arguing that this was neither realistic nor desirable.

The report recognised that the additional £10m in funding announced by the government for regulators would be insufficient to meet the challenges of AI, and the deep pockets of AI developers, and therefore called for additional funding for regulators as well as law enforcement to “respond to the growing use of AI models and tools to generate and disseminate harmful and illegal content”. The Committee also called for the government to “conduct and publish the results of its regulatory gap analysis as soon as is practicable”.

The Committee called for the expansion of the Algorithmic Transparency Recording Standard to all public bodies sponsored by Government departments with effect from 01 January 2025.

The Committee raised concern that some developers had failed to grant the AI Safety Institute access to their AI models, notwithstanding the agreement reached at the AI Safety Summit at Bletchley between world leaders and Amazon Web Services, Anthropic, Google, Google DeepMind, Inflection AI, Meta, Microsoft, Mistral AI, Open AI and xAI to collaborate on testing the next generation of Artificial Intelligence (AI) models against a range of critical national security, safety and societal risks. It called on the government to confirm which AI models had been subject to pre-deployment safety testing by the AI Safety Institute and which developers had refused access.

The report called on regulators to test not only model outputs but also to understand how those outputs were arrived at.

The Committee called on the CMA to tackle abuses of market power and, in order to tackle the challenge of access to high quality training data in sufficient volumes, suggested that the government should establish a National Data Bank of anonymized public data gathered from organisations including the NHS and the BBC. The Committee also suggested a financial settlement by AI developers for prior infringements of IP and copyright, and a licensing framework to govern future use.

The Committee welcomed the government’s amendment to the Criminal Justice Bill to introduce a new offence of maliciously making a sexually explicit ‘deepfake’ image without consent.

The Committee called on the government to address the threat posed by misinformation, particularly in the context of the General Election, and for enforcement action to be taken against online platforms which fail in their duties including to remove and prevent the spread of deep fake content which seeks to exert a malign influence on the democratic process.

Fallen

Data Protection & Digital Information Bill

While the government’s third attempt to reform the UK’s data protection legislation post-Brexit had progressed through the Commons and the House of Lords Committee Stage, and was shortly due to be considered at Report Stage in the Lords, the Bill appears to have been considered too controversial for political agreement to be reached, with significant numbers of amendments outstanding.

As introduced, the Bill would not have consolidated but would have amended the UK’s data protection legislation, which is currently found in the UK GDPR, Data Protection Act 2018 and Privacy and Electronic Communications Regulations. The Bill would have implemented a range of measures, including: fundamentally restructuring the Information Commissioner’s Office and granting new enforcement powers; abolishing the role of the Biometrics and Surveillance Camera Commissioner; narrowing the scope of the definition of what constitutes personal data; establishing statutory legitimate interests; clarifying and for some purposes relaxing the requirements of compatibility for further processing of personal data; reducing the threshold for controllers to reject data subject access requests on the grounds of being vexatious; relaxing the protections against solely automated processing producing legal or similarly significant effects; minimising burdens on data controllers, including by removing the obligations to appoint Data Protection Officers (DPOs) and, for organisations established outside the UK, to appoint a representative in the UK; removing the obligation to consult with the Information Commissioner’s Office in respect of high risk data processing activities; reducing the threshold for the Secretary of State to make regulations designating a third country or international organisation’s protections for personal data as being adequate; introducing the ability for codes of conduct to be taken into account in determining compliance of competent authorities such as police forces with their data protection obligations; and amending the rules which govern the need for cookie banners.

Further amendments were made to the Bill during its passage, including: to permit international law to be relied on as a lawful basis for processing personal data; to relax compliance obligations on political parties, elected representatives and election candidates; and to restrict the scope of searches required to be carried out in response to data subject access requests.

The government had also proposed significant amendments to the Bill: to introduce powers for the Secretary of State to obtain information for social security purposes (Gov_NC34); to amend the Online Safety Act 2023 to oblige Ofcom to require relevant providers to retain information relating to deceased children where requested by a coroner (Gov_NC35); and, to provide for indefinite retention of pseudonymised biometric data by law enforcement (Gov_NC37).

It is the powers in relation to social security information, termed a ‘Snooper’s Charter’ in relation to banking data, that are believed to have prevented agreement being reached on the Bill during wash-up.

While perhaps unlikely to sway many votes, other than those of DPOs and UK representatives, we do think that there is scope for more considered reform of data protection law, and would welcome the consolidation of existing legislation.

Although data protection reform has not yet crept into the parties’ headline policies, the amendment aimed at extending the Online Safety Act 2023 to grant Ofcom powers in relation to deceased children’s data on the request of a Coroner has been adopted by Labour’s Shadow Home Secretary Yvette Cooper, and Health Secretary Victoria Atkins affirmed her ongoing support for the policy but declined to pre-empt the Conservative Party’s manifesto in an interview with the BBC’s Laura Kuenssberg.

Artificial Intelligence (Regulation) Bill

Conservative Peer Lord Holmes of Richmond’s Artificial Intelligence (Regulation) Bill had passed Third Reading in the House of Lords in the form in which it was introduced and awaited a Commons sponsor to take it forward. We identified the central planks of the Bill – and what it didn’t cover – in a previous blog. Perhaps it is unsurprising, given the government’s fiercely “pro-innovation approach” to artificial intelligence (AI) and its reluctance to introduce new legislation to govern AI unless and until it no longer had confidence that “voluntary measures would be implemented effectively by all relevant parties and if we assessed that risks could not be effectively mitigated using existing legal powers” and that “we could mandate measures in a way that would significantly mitigate risk without unduly dampening innovation and competition”, that the Bill did not find favour during the wash-up.

At its 2023 Party Conference, Labour Party delegates considered the motion ‘Technology and AI in the workplace’ put forward by the unions Unite and the CWU, and passed the proposal to “develop a comprehensive package of legislative, regulatory and workplace protections to ensure that when in government the positive potential of technology is realised for all including the fair distribution of productivity gains”, resolving that a future Labour government should “ensure that a legal duty on employers to consult trade unions on the introduction of invasive automated or artificial intelligence technologies in the workplace is enshrined in law” and should “commit to working with trade unions to gain an understanding of the unscrupulous use of technology in the workplace and campaign against it”.

The regulation of AI could prove to be an area of debate.

Strategic Litigation Against Public Participation Bill

A private member's Bill introduced by Labour MP Wayne David in February 2024, the Bill had - perhaps unusually - obtained Government support, had cleared Commons Committee stage and was due to be considered at Report stage, but had not yet reached the House of Lords.

While the government had enacted sections 194-195 Economic Crime and Corporate Transparency Act 2023, requiring the Civil Procedure Rules to be amended to create a power to strike out claims before trial which could be categorised as SLAPPs (defined narrowly by reference to allegations of economic crime) and where the claimant had failed to show that it was more likely than not to succeed at trial, the efficacy of which had been the subject of a previous post, the Strategic Litigation Against Public Participation Bill would have enacted a broader definition of a SLAPP (Strategic Lawsuit Against Public Participation or Strategic Litigation Against Public Participation) encapsulating any publication on a matter of public interest, which included but was not limited to unlawful behaviour, false statements, public health and safety, the climate or the environment and public investigations or reviews. The Bill would similarly have required that the Civil Procedure Rules secure the power of the court to strike out such claims before trial.

The Bill did not introduce additional costs penalties for SLAPPs, nor did it address the conduct of individuals and their legal representatives in so-called pre-publication correspondence before litigation is commenced. By way of example, Baroness Michelle Mone admitted to lying to the media through her solicitors in relation to her connections to the company PPE Medpro, which secured hundreds of millions of pounds of contracts to provide protective equipment during the COVID pandemic after being introduced to the government by Mone, and to threatening that any publication linking her to the company would be defamatory. The company is now being sued by the government for breach of contract and unjust enrichment, which the company denies. In an interview with the BBC she subsequently admitted that “we've only done one thing, which was lie to the press to say we weren't involved”, but complained that this was “not a crime”, and accepted that she would benefit personally from the company’s government contracts. The Solicitors Regulation Authority (SRA) has published an update to its 2022 warning notice to solicitors on SLAPPs, which expands upon key areas of concern including making claims or assertions without merit, noting that it had received 70 complaints of SLAPPs and referred two instances to the Solicitors Disciplinary Tribunal.

Given the political imperative of courting newspaper editors, we anticipate that measures to tackle SLAPPs will be included in both major parties’ election manifesto pledges.

Criminal Justice Bill

While primarily focused on police powers and duties, sentencing and the management of offenders, the creation of certain new offences relating to nuisance begging, nuisance rough sleeping and anti-social behaviour, and restrictions on sex offenders, the Criminal Justice Bill, which had only reached Report stage in the House of Commons and had not been considered by the House of Lords, also included measures to address revenge porn and restrictions on protests.

Schedule 2 of the Bill proposed to introduce a series of new offences under the Sexual Offences Act 2003 related to the taking of intimate photographs or film without consent and the installation of equipment to enable the taking of such photographs or film. The government also introduced an amendment to create a new offence of creating a purported sexual image of an adult with the intention of causing the subject alarm, distress or humiliation, or for the purpose of the sexual gratification of the creator or another (Gov_NC86). This would have expanded upon the offence enacted by section 188 Online Safety Act 2023, which prohibited the sharing – or threat to share – of intimate images without consent.

Amendments introduced by the government included a new offence of concealing identity at protests in designated localities and other protest-related offences, as well as a new offence of climbing on war memorials (Gov_NC100).

Other measures included clause 11, which would have created an offence of encouraging or assisting serious self-harm, expanding the offence beyond the scope of the related communications offence enacted by section 184 Online Safety Act 2023 so as to implement the Law Commission’s recommendation in full; section 184 OSA 2023 would have been repealed.

Clause 16 of the Bill would have extended criminal liability to entities in circumstances where their senior managers had committed an offence, which would have served to address the lacuna whereby senior managers of global entities based outside the UK would not be impacted by the threat of fines or imprisonment.

Other amendments proposed to the Bill included an obligation on police officers to hand over personal mobile phones to appropriate authorities (NC6), the establishment of an Office for Whistleblowers in connection with the reporting of serious crime (NC113), and a prohibition on conversion practices in relation to sexual orientation or transgender identity (NC117).

Certain amendments, in particular those related to revenge porn, had cross-party support and we might therefore expect that these could find their way into the major parties’ manifesto pledges.

European Affairs Committee Inquiry into 'UK-EU Data Adequacy'

The House of Lords European Affairs Committee had extended the deadline for submitting evidence to its inquiry into UK-EU data adequacy, with the result that the new deadline fell the day after the dissolution of Parliament.

Lord Clement-Jones had proposed an amendment to the Data Protection and Digital Information Bill requiring the Secretary of State to carry out an assessment of the impact of the Bill, and of wider legal and regulatory changes, on the European Commission’s adequacy decisions in respect of the UK.

The European Commission’s adequacy decision in respect of the UK was explicitly and uniquely subject to a sunset clause and stated to be based on the lack of divergence from the GDPR and “adherence to the European Convention on Human Rights and submission to the jurisdiction of the European Court of Human Rights”. While recent adequacy decisions and their reviews have demonstrated that the conferring of adequacy under the GDPR is an act of political will, the fall of the Data Protection and Digital Information Bill, and the receding of the immediate threat of the UK’s withdrawal from the European Convention on Human Rights, should serve to allay – at least temporarily – concern that the European Commission might, on the basis of these significant changes, withdraw or fail to renew its adequacy decision under the GDPR in respect of the protections afforded by UK law for personal data.

However, the post-Brexit shift from the EU Charter of Fundamental Rights secured through The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, and the scope to interpret The Safety of Rwanda (Asylum and Immigration) Act 2024 as undermining the UK’s commitment to the rule of law, could support a challenge.

Commons Women & Equalities Committee Inquiry into 'Non-Consensual Intimate Image Abuse'

Despite the Committee having concluded its inquiry into ‘non-consensual intimate image abuse’, or so-called ‘revenge porn’, the prorogation of Parliament prevented the report from being published. Consequently, the Chair of the Commons Women & Equalities Committee wrote a letter to Cabinet Ministers and Ofcom calling for non-consensual intimate images (NCII) to be treated with “the same severity as child abuse material”, including measures requiring online platforms to promptly take down such content and for ISPs to block it. The Rt Hon Caroline Nokes also called on the Ministry of Justice to expand the scope of the Criminal Injuries Compensation Scheme “to allow claims from victims of sexual offences perpetrated online, specifically non-consensual intimate image abuse”.

Commons Science, Innovation & Technology Committee Inquiry into the Resilience of the UK's Critical National Infrastructure (CNI)

Identifying the UK as the third most cyber-attacked country after the USA and Ukraine, the Committee’s inquiry was intended to measure progress toward achieving the recently announced CNI resilience targets by 2025, and to investigate what support the sector needs to achieve those targets, as well as efforts to make computer hardware architecture more secure by design.

The Committee had published written evidence and heard oral evidence, which emphasised the importance of supply chain security, of implementing security by design and default, of the provenance of components, and of the need for the UK to develop an independent manufacturing base.

The threats posed by the current global geo-political situation seem unlikely to subside in the near future. With both the UK defence and health sectors, which form part of the UK’s CNI, having been impacted by cyber attacks – which, in the health sector, have resulted in delays to the delivery of essential services – cyber security should continue to be high on the agenda of the new government.

Should you require support understanding how new legislation and recommendations will affect you, contact us.