UK Regulator Seeks Input on AI Healthcare Rules

The Medicines and Healthcare products Regulatory Agency (MHRA) has launched a public consultation on how AI should be regulated in UK healthcare. The work is being driven by a new National Commission into the Regulation of AI in Healthcare, created to modernise rulemaking as AI tools move from research into everyday clinical use.

MHRA Launches Key Consultation

The MHRA is asking for views on a risk-proportionate approach that keeps patients safe while allowing responsible innovation. The Commission will examine where existing medical device and medicines rules work, and where additional safeguards are needed for systems that learn and change in real-world settings. Lawrence Tallon, MHRA CEO, has highlighted the need to align regulation with how AI will be used across clinical pathways.

Defining Safe AI and Building Trust

At the centre of the consultation are patient safety and public confidence. Professor Alastair Denniston, chair of the National Commission, has stressed the need to look not just at algorithm performance but also at deployment, monitoring, and roles and responsibilities across health systems. The review will consider transparency, validation in clinical settings, post-market surveillance, and how liability and governance should be allocated between developers, providers and regulators.

Broad Call for Participation

The MHRA invites input from patients and carers, clinicians, researchers, health service leaders, AI developers, industry bodies and the public. Responses will help shape guidance on certification, testing, data use, clinical oversight and reporting requirements. The deadline for submissions is 2 February 2026.

Conclusion

This consultation is a timely opportunity for stakeholders to influence a framework that balances safety, trust and practical adoption. UK stakeholders who contribute can help create rules that protect patients while supporting scalable, clinically effective AI across the health sector.