Peer Review V
“Alongside government amendments to the Data (Use and Access) Bill at Report stage, including amendments reflecting the higher protections to which children are entitled, and several concessions made to ward off divisions, the government was defeated on several amendments, including amendments providing protections for copyright against the data mining used to gather data for the training, validation and testing of AI models.”
The Government came under sustained pressure throughout the Grand Committee scrutiny stage of the Data (Use and Access) Bill in relation to its position on artificial intelligence, the protection of children, the scope of discretion to be afforded to the Secretary of State, the Information Commissioner’s performance on regulatory enforcement, and the Information Commission’s role and independence. In addition, peers seized the opportunity to pursue other policy objectives, including proposals to amend the Computer Misuse Act 1990, to criminalise sexually explicit deepfake content, to address the presumption of reliability of computer evidence, and to protect copyright against web scraping for the purpose of training AI models. Read about the previous Grand Committee scrutiny sessions on 03 December, 10 December, 16 December and 18 December 2024.
The Bill returned to the whole House for consideration at Report stage on 21 January 2025 and again on 28 January 2025, when the Government was defeated in relation to several amendments to the Bill.
Report Stage 21 January 2025
Conservative peer Lord Lucas was successful in securing several amendments to Part 2 of the Data (Use and Access) Bill, which relates to Digital Verification Services and would require the Secretary of State to implement various mechanisms “to secure the reliability of digital verification services”, comprising: a DVS trust framework setting out rules concerning the provision of digital verification services; supplementary codes; the establishment and maintenance of a DVS register of persons providing digital verification services; a code of practice on the disclosure of information by public authorities to persons on the DVS register about an individual who has requested digital verification services, for the purpose of enabling the delivery of those services; the designation of a trust mark for use by registered persons in the course of providing, or offering to provide, digital verification services; and the review, at least annually, of the trust framework and supplementary codes and of their operation:
Lord Lucas’ Amendment 6 amends clause 28 to require the Secretary of State to assess whether the personal data attributes collected, recorded and shared by public authorities, including the DVLA, HMRC, His Majesty’s Passport Office and the NHS, were reliably ascertained;
Lord Lucas’ Amendment 8 amends clause 45 to impose obligations on public authorities disclosing personal data to verify its accuracy and to accompany the disclosure with relevant metadata;
Viscount Camrose’s Amendment 11 amends Part 3 clause 56 to require the Secretary of State to provide guidance to relevant stakeholders on cyber security before permitting them to obtain information from the National Underground Asset Register (NUAR).
Viscount Colville’s Amendment 14 amends the government’s definition of scientific research at clause 67 of the Data (Use and Access) Bill to provide that the research must be in the public interest, which has implications for data re-use.
Various Government amendments were agreed:
Amendment 12 introduces technical amendments to clause 58, Part 3 of the Data (Use and Access) Bill, which would require the Secretary of State to establish and maintain a register of apparatus in streets in England and Wales, to be known as the National Underground Asset Register (‘NUAR’). The amendments create an offence where an undertaker (within the meaning of the New Roads and Street Works Act 1991) fails to upload relevant information to NUAR within the relevant time period, make provision for monetary penalties, enable the Secretary of State to make regulations on the availability of the information within NUAR, require undertakers to pay fees in connection with the Secretary of State’s functions under Part 3 of the Bill, and require undertakers to provide information to enable those fees to be determined; and,
Amendment 18, which amends clause 70, governing the Secretary of State’s powers to make regulations establishing new legitimate interests under the proposed Article 6(1)(ea), to make clear that children “merit special protection” and that this must be considered before making regulations.
While government Amendment 49, to introduce a soft opt-in for direct marketing by charities, was referenced, it was not considered until the subsequent session on 28 January.
The government made concessions to secure the withdrawal of Baroness Kidron’s Amendment 15 to clause 68, which would exempt children from deemed consent to the further use of their personal data for scientific research. In relation to her Amendment 22, which would augment the Age Appropriate Design Code (aka the Children’s Code) to impose an obligation on all data controllers and processors to have regard to children being entitled to a higher standard of protection, and would give direct effect to the UN Convention on the Rights of the Child, the government indicated that it was willing to work with peers to introduce a version of the amendment targeted at online services directed to or likely to be accessed by children, as well as to consider whether the regulator’s duties could be enhanced.
To secure the withdrawal of Lord Clement-Jones’ Amendment 33, which would have removed clause 80 and the government’s proposals to relax restrictions on automated decision-making, the government committed to requiring the data protection regulator to produce a “code of practice on AI and solely automated decision-making”, which would include “guidance about protecting data subjects, including children”. This would be “an early priority, following the Bill receiving Royal Assent” and would be achieved through secondary legislation to be approved by both Houses of Parliament.
Other matters pursued by peers but which were not ultimately pushed to a vote or were rejected by the House upon a vote included:
Baroness Kidron’s amendments pertaining to the establishment of data communities;
Lord Clement-Jones’ amendment to require the Secretary of State to lay the Digital Verification Services framework before Parliament (Amendment 7);
Lord Clement-Jones’ Amendment 17 to remove the proposed power for the Secretary of State to make regulations establishing new recognised legitimate interests, as a consequence of concern that the Government was “intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees”;
Lord Clement-Jones’ Amendment which would have required the government to conduct an assessment of the likely impact of the Data (Use and Access) Bill, other legislative changes and international commitments on the European Commission’s renewal of its adequacy decision in respect of the UK;
Viscount Camrose’s Amendment 26, rejected upon a vote, which attempted to import the five principles from the government’s AI regulation White Paper into clause 80, which otherwise relaxes protections for automated decision-making;
Baroness Harding’s Amendment 24, which sought to relax the application of the transparency requirements under Article 14 UK GDPR in relation to personal data on the Open Electoral Register which is combined with other data to create a profile used for direct marketing;
Viscount Camrose’s Amendment 32, which would have expanded the requirement for human intervention in automated decision-making by providing that it must be “carried out by a person with sufficient competency and authority”; and,
Lord Clement-Jones’ Amendment 34, and Baroness Freeman’s supplementary Amendment 35, which would impose obligations on public sector organisations using algorithmic or automated decision-making systems (elements of which are currently addressed by the government’s non-statutory Algorithmic Transparency Recording Standard (ATRS)).
Report Stage 28 January 2025
Consideration of the Data (Use and Access) Bill resumed at clause 90, concerning the new data protection regulator’s duties.
Once again, certain government amendments to the Data (Use and Access) Bill were agreed:
Amendment 40, which amends clause 90 to reflect, in relation to the regulator’s duties in carrying out its functions, that children merit special protection; and,
Amendment 49 to introduce a soft opt-in to direct marketing by email by charities.
The government conceded that it would “use powers under the Data Protection Act 2018 to require the ICO to publish a new code of practice addressing edtech issues” in order to secure the withdrawal of Baroness Kidron’s Amendment 44 calling for the introduction of a Code of practice on Children's Data and Education.
The government also committed to bringing forward further amendments at Third Reading to criminalise the creation of sexually explicit deepfakes, further to campaigning by Baroness Owen, addressing or otherwise taking into account Amendments 69 and 70.
The government was defeated in relation to Baroness Kidron’s Amendment 44A to clause 94, which requires that the Commissioner’s at least annual analysis of its performance using key performance indicators include its performance in relation to enforcement under clauses 96 to 104 of the Data (Use and Access) Bill.
Significantly, Baroness Kidron’s Amendments 61, 62, 63, 64 and 65 were agreed. Amendment 61 will require the Secretary of State to make regulations requiring the operators of web crawlers and general purpose AI models whose services have links to the UK, within the meaning of s.4(5) Online Safety Act 2023, to comply with UK copyright law. Amendment 62 would require the Secretary of State to make regulations requiring the operators of web crawlers and general purpose AI models to ensure that their crawlers are identifiable. Amendment 63 would require regulations to be made obliging the disclosure of the training data for AI models. Amendment 64 makes provision for the data protection regulator to enforce the relevant obligations and affords the regulator powers in order to do so. Amendment 65 would require the Secretary of State to conduct a review of the technical solutions to prevent, and to identify, the unauthorised scraping or other unauthorised use of copyright owners’ text and data, and to report within 18 months. While intended to provide protections for the creative sector and other industries against the changes to the UK’s copyright framework proposed by the government in its consultation on copyright and artificial intelligence, which is still ongoing, the amendments effectively lay the foundations for the establishment of a data mining/web scraping exception. The government’s agreement to them while its consultation remains ongoing may suggest that the government intends to forge ahead with its proposals, which would risk the perception that the consultation exercise is a sham and breaches the Gunning principles.
The government was again defeated in relation to Lord Lucas’ Amendment 67 to require the government to make regulations establishing the definitions and associated metadata for core personal data attributes.
Other matters pursued by peers but which were not ultimately pushed to a vote or were rejected by the House upon a vote included:
Lord Holmes’ Amendment 38 which sought to restrict the scope of the regulator’s obligation to have regard to innovation to the existing obligation at section 108 of the Deregulation Act 2015;
Lord Holmes’ Amendment 47, which sought to use the Bill to amend the Computer Misuse Act 1990 to introduce defences to the criminal offences under the Act where the actions were necessary for the prevention or detection of crime or otherwise in the public interest. We have previously set out our significant concerns regarding proposals to introduce defences to the CMA;
Lord Clement-Jones’ Amendment 46 which would require a review to be carried out of the impact of transferring jurisdiction for data protection disputes from the courts to tribunals;
Lord Lucas' Amendment 48A which would introduce definitions of the terms ‘service message’ and ‘regulatory communication’;
Lord Clement-Jones’ Amendment 48B which would override the Information Commissioner’s ‘Consent or Pay’ guidance and ban cookie paywalls;
Baroness Kidron’s Amendment 51 to clause 123, amending the Online Safety Act 2023, which would oblige the Secretary of State, rather than merely giving the power, to make regulations, within 12 months, requiring providers of regulated services to provide information for purposes related to the carrying out of independent research into online safety matters; the government did, however, commit to consulting on a researcher access framework as soon as possible after Ofcom reported, which was expected in July 2025;
Lord Bassam’s proposal at Amendment 57 to introduce an annual private copy levy on accessing and storing online digital content, which would give effect to the April 2024 recommendation of the Culture, Media and Sport Committee;
Baroness Kidron’s Amendment 58 to establish the concept of sovereign data assets and a licensing regime in respect of them;
Lord Holmes’ Amendment 49 to require the Secretary of State to launch a review to consider the introduction of standards for the input and output of data of large language AI models (LLMs) operating and generating revenue in the UK (Lord Holmes had previously introduced an Artificial Intelligence (Regulation) Bill);
Lord Holmes’ Amendment 66 calling for a consultation on the implications of the Bill for data centre power usage;
Baroness Kidron’s Amendment 68 on the reliability of computer-based evidence, which is the subject of ongoing consultation;
Viscount Camrose’s Amendment 73 on the risks posed to data by systemic competitors and hostile actors; and,
Lord Clement-Jones’ Amendment 74 on whether the Bill’s provisions would have retrospective effect such that it would apply to data already being processed.
Access our comprehensive briefing on the Data (Use and Access) Bill, and our unofficial Data (Use and Access) Bill Keeling schedules showing a mark up of the changes that the Bill (as introduced) would make to the UK GDPR, Data Protection Act 2018 and Privacy and Electronic Communications Regulations 2003 (PECR) respectively.
Keep up to date with developments as the Data (Use and Access) Bill progresses through Parliament on our Data Protection Reform page in our Resources section.
Should you require support understanding how new data protection legislation and regulation will affect you or your organisation, please contact us.
Find out more about our data protection and data privacy services.