A Bill to regulate the use of automated and algorithmic tools in decision-making processes in the public sector; to require public authorities to complete an impact assessment of automated and algorithmic decision-making systems; to ensure the adoption of transparency standards for such systems; and for connected purposes.
House of Lords
18 September 2025
The bill would require public authorities to assess and publicly report on the automated and algorithmic decision‑making systems they develop or procure. It would mandate pre‑deployment Algorithmic Impact Assessments (AIAs) and post‑deployment Algorithmic Transparency Records (ATRs), alongside ongoing governance requirements such as independent scrutiny, bias testing, logging, audits, staff training and human oversight, with avenues for individuals to challenge decisions or seek redress. The bill extends to England and Wales (with exclusions such as national security and purely calculative tools) and would come into force six months after passage, implemented through regulations made by statutory instrument after consultation and backed by an independent dispute‑resolution service.
The bill began in the Lords, where it has completed all stages (1st reading in September 2024, 2nd reading in December 2024, order of commitment discharged in January 2025, 3rd reading in February 2025), and is now at its 1st reading in the Commons, with further stages to follow.
Generated 21 February 2026
1st reading (Lords): 9 Sept 2024
2nd reading (Lords): 13 Dec 2024
Order of commitment discharged: 20 Jan 2025
3rd reading (Lords): 7 Feb 2025
Third reading, the final chance for the Lords to change the bill, took place on 7 February 2025, and no amendments were made.
What happens next?
The bill now goes to the Commons for consideration.
Baroness Jones says the government is delivering transparency and public trust in public‑sector AI through the Algorithmic Transparency Recording Standard (ATRS), with 14 new records published and more to follow; the ATRS will be mandatory for departments and arm's‑length bodies (ALBs) for high‑impact, public‑facing tools. She also highlights GDPR safeguards (including Article 22) and reforms in the Data (Use and Access) Bill to enable responsible automated decision‑making, in line with the government's plans to support safe and ethical AI adoption in public services.
The Committee highlights two main issues. First, in the Public Ownership Bill, a broad repealing power (to undo sections 30A/30B) could be misused; the Committee asks ministers to clarify its scope and procedure, and prefers affirmative scrutiny unless the power is tightly limited. Second, in the Product Regulation and Metrology Bill, it criticises skeleton-style powers (clauses 1–3, 5–6 and 9) that would enable extensive regulation-by-regulation controls, potentially including criminal-offence provisions, without adequate justification or scrutiny; it calls for those delegations to be removed, or for a full rationale and stronger checks to be provided.
No recorded votes for this bill yet.