AI – Should I Use Artificial Intelligence as a Doctor?

Background

You may have used artificial intelligence (AI) tools yourself already, or observed colleagues doing so. Adoption of AI systems is rising amongst doctors, but it is largely ad hoc: clinicians are discovering and using third-party platforms independently of institutional oversight (1,2).

There are many ways that a doctor might think to use AI in their work:

  • Using Large Language Models (LLMs) to write a patient’s management plan.
  • Employing voice-to-text software to dictate ward rounds or consultations for faster documentation.
  • Inputting an ECG or chest x-ray into image-analysis tools.
  • Relying on AI-generated summaries at the top of search results to answer clinical questions.

Each of these applications of AI varies in the level of concern it raises about safety, confidentiality and clinical responsibility.

There is a lack of clear and comprehensive guidance for doctors on whether, when and how to use these tools. Despite widespread discussion of AI and its place in medicine’s future, policymakers, regulators and trusts have offered remarkably little direction on how individual doctors should use these technologies safely. This is complicated by statements from the Government and the NHS promoting AI as a way to address waiting lists and productivity challenges (3, 4). Meanwhile, the BMA has cautioned against some of these suggestions, such as ambient AI scribes, citing concerns about their safe use (5).

This leaves individual doctors navigating contradictory pressures: urged by government to embrace AI for productivity, warned by their professional body about safety concerns, yet offered minimal practical guidance. This article examines what limited guidance does exist and provides a framework for using AI safely.

GMC Guidance: Applying Existing Standards

As the body that regulates doctors and would likely be involved in cases of potential misuse of AI tools by doctors, the GMC’s guidance on the topic is unfortunately limited. Rather than providing specific frameworks for AI use, it relies on applying existing Good Medical Practice (GMP) standards to new technology.

Paragraph 89a of GMP can be applied to the risk of AI-generated errors in documentation (6):
“You must make sure any information you communicate as a medical professional is accurate, not false or misleading. This means you must take reasonable steps to check the information is accurate” 

Paragraph 71 addresses confidentiality issues, such as those that may arise from using third-party AI platforms (7):
“You must keep records that contain personal information about patients, colleagues or others securely, and in line with any data protection law requirements”

Its most focused resource on the topic to date, Artificial intelligence and innovative technologies, does provide some further guidance for doctors, but it is limited in specificity, and AI is only one of several topics discussed (8). The advice it contains can be summarised as:

  • The GMC expects doctors to use their own professional judgement to apply the principles in existing guidance to the use of AI tools.
  • Doctors are responsible for the decisions they take when using new technologies like AI.
  • There is an expectation that as technologies such as AI continue to be developed, doctors will take part in education and training to be able to use them in clinical practice.

Medical Indemnity Organisations: MDU and MPS

As the organisations that provide many UK doctors with their professional indemnity, the Medical Defence Union (MDU) and Medical Protection Society (MPS) are important sources of legal advice for medical professionals. Failing to follow their advice may leave you without indemnity coverage, meaning you could face legal costs and damages personally.

Both of these organisations have provided more concrete guidance on the use of AI tools by individual clinicians. The key takeaways have been outlined below.

MDU – Using AI safely and responsibly in primary care (9)
  • Hallucinations are common and, where they appear in your documentation through transcription or summarisation errors, they must be corrected at the time of recording.
  • The removal of a patient’s name is not sufficient to anonymise clinical information provided to an AI tool as the combination of medical information, demographic information and the stored information about you as a user can still make it possible to identify a patient.
  • You must document a patient’s consent to the use of transcription software for consultations.
  • You should avoid incorporating AI systems into your individual clinical practice outside of your employer’s protections, as it puts you at significant professional risk.
  • The use of AI without your employer’s approval may form a breach of data protection legislation and your workplace policies.

MPS – Common medicolegal dilemmas healthcare professionals are facing with the use of AI (10)
  • In clinical negligence cases where AI tools were used, MPS members can request assistance as they normally would, provided the potential negligence results from their own clinical judgement or actions.
  • However, indemnity would not normally be provided to members where the issue results from a failure of the AI software itself “for example, if the software has been incorrectly programmed or developed”.
  • Final responsibility for the diagnostic and management decisions that clinicians make remains with the clinicians themselves.
  • Regarding generative AI and hallucinations, it is the clinician’s responsibility to ensure the accuracy of information that they provide to patients and that it is from a reliable source.
  • Clinicians are advised to be cautious about the use of AI tools to complete documentation that declares patients fit to participate in certain activities, e.g. fit-to-fly letters.
  • Doctors using generative AI must be mindful not to work outside the limits of their competence. Any scenarios in which a doctor feels that their knowledge or skill-set is not sufficient should be addressed by seeking a human colleague with the relevant expertise.
  • Doctors should not be tempted to defer to the output of an AI system to avoid or reduce their own liability for negative outcomes. They should be confident to reject outputs that they believe to be incorrect.

Best Practice

From the institutional advice outlined above, a set of rules of thumb emerges. These allow us to consider how we might cautiously use AI in our work while avoiding the many pitfalls that come with the technology. The points below are derived from the sources discussed, but their relevance and appropriateness are liable to change with the publication of new guidance from those organisations and others, as well as with more focused rulings by individual trusts:

  • Many AI tools are probabilistic (i.e. they incorporate uncertainty and randomness) and are therefore at risk of unexpected mistakes, such as hallucinations in the case of generative language models. This is fundamentally different from the trusted sources doctors are used to (e.g. BNF, CKS) and from deterministic tools (i.e. those that produce the same output for a given input) such as MDCalc.
  • Compared to traditional digital software, it is extremely easy to unknowingly fall short of legal and professional expectations when using AI tools. Review whether your trust/employer has published rules on the use of AI systems and limit your clinical use to tools that comply with them. This helps avoid issues around data protection and liability.
  • Avoid including any patient information in third-party AI systems not approved by your organisation as even non-identifying information can qualify as a data protection breach.
  • Avoid using publicly available AI tools in your clinical tasks without the express permission/guidance from your employer as doing so can put you at professional and legal risk.
  • Where you are using approved tools for documentation, you are responsible for any resulting inaccuracies. Review all outputs and, where necessary, verify their accuracy with patients.
  • Assume that you hold final responsibility for actions taken on the advice, or with the assistance, of AI tools. This includes where a tool makes a mistake or hallucinates, as you are expected to correct it. Where you disagree with the conclusion of an AI system, you should feel confident rejecting it in favour of your own judgement.
  • The GMC expects doctors to apply existing guidance from Good Medical Practice to their decisions around the use of AI.
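To make the probabilistic/deterministic distinction in the list above concrete, here is a minimal, purely illustrative Python sketch. The score criteria, answer strings and weights are invented for illustration and have no clinical meaning:

```python
import random

def score_calculator(criteria):
    """Deterministic tool: the same inputs always produce the same
    output, like an MDCalc-style risk score."""
    return sum(points for present, points in criteria if present)

def generative_answer(options, weights, seed=None):
    """Probabilistic tool: the output is sampled from a distribution,
    so repeated runs on the same prompt can differ -- the property
    that makes hallucinations possible."""
    rng = random.Random(seed)
    return rng.choices(options, weights=weights, k=1)[0]

criteria = [(True, 3), (False, 1), (True, 1)]
print(score_calculator(criteria))  # always 4, run after run

options = ["correct-sounding answer", "confident but wrong answer"]
# Even with a heavy weight on the right answer, the wrong one
# is still sampled occasionally.
print(generative_answer(options, weights=[0.9, 0.1]))
```

The point of the sketch is the behaviour, not the numbers: the calculator can be validated once and trusted thereafter, whereas every output of the sampled function needs checking.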

References

  1. Nuffield Trust (2025). How are GPs using AI? Insights from the front line. [online] Available at: https://www.nuffieldtrust.org.uk/research/how-are-gps-using-ai-insights-from-the-front-line.
  2. Blease, C.R., Locher, C., Gaab, J., Hägglund, M. and Mandl, K.D. (2024). Generative artificial intelligence in primary care: an online survey of UK general practitioners. BMJ Health & Care Informatics, 31(1), e101102. doi: https://doi.org/10.1136/bmjhci-2024-101102.
  3. Department of Health and Social Care (2025). AI doctors’ assistant to speed up appointments a ‘gamechanger’. [online] GOV.UK. Available at: https://www.gov.uk/government/news/ai-doctors-assistant-to-speed-up-appointments-a-gamechanger.
  4. Department of Health and Social Care (2025). Major NHS AI trial delivers unprecedented time and cost savings. [online] GOV.UK. Available at: https://www.gov.uk/government/news/major-nhs-ai-trial-delivers-unprecedented-time-and-cost-savings.
  5. Lovell, T. (2025). BMA warns GPs on the ‘substantial’ risks of AI scribing tools. [online] Digital Health. Available at: https://www.digitalhealth.net/2025/06/bma-warns-gps-on-the-substantial-risks-of-ai-scribing-tools/ [Accessed 22 Jan. 2026].
  6. GMC (2024). Domain 4: Trust and professionalism. [online] Available at: https://www.gmc-uk.org/professional-standards/the-professional-standards/good-medical-practice/domain-4-trust-and-professionalism#cooperating-with-legal-and-regulatory-requirements-82D869617EE54ABBACAC4FE4459DF7D5.
  7. GMC (2024). Domain 3: Colleagues, culture and safety. [online] Available at: https://www.gmc-uk.org/professional-standards/the-professional-standards/good-medical-practice/domain-3-colleagues-culture-and-safety.
  8. GMC (2024). Artificial intelligence and innovative technologies. [online] Available at: https://www.gmc-uk.org/professional-standards/learning-materials/artificial-intelligence-and-innovative-technologies.
  9. MDU (2025). Using AI safely and responsibly in primary care. [online] Available at: https://www.themdu.com/guidance-and-advice/guides/using-ai-in-primary-care.
  10. MPS (2025). Common medicolegal dilemmas healthcare professionals are facing with the use of AI. [online] Available at: https://www.medicalprotection.org/uk/advice-centre-articles/common-medicolegal-dilemmas-healthcare-professionals-are-facing-with-the-use-of-ai.
