Macquarie University

Artificial intelligence in medicine: how do we determine legal liability when things go wrong?

Posted on 2023-02-22, authored by Paul Nolan

Medical negligence is one of the most complex areas of tort law. This complexity gives rise to difficulties when an injured patient seeks to prove that a treating clinician was negligent, whether by act or omission, and to demonstrate how any purported negligence has deleteriously affected them.

The introduction of novel medical technology, such as Artificial Intelligence (AI), into traditional clinical practice presents legal liability challenges that need to be squarely addressed by litigants and courts when something goes wrong. Some of the most promising applications of AI in medicine will lead to vexed liability questions. As AI in healthcare is in its relative infancy, there is a paucity of case law globally upon which to draw.

The thesis will analyse medical malpractice where AI is involved: what problems arise when applying the tort of negligence — such as establishing the essential elements of breach of duty of care and causation — and how these can be addressed. In order to address this question, the thesis will: 1) identify the general problems that ‘black box’ AI causes in the healthcare sector; 2) identify the problems that will arise in establishing breach and causation due to the ‘black box’ nature of AI, with reference to the Civil Liability Act 2002 (NSW) and common law, through two hypothetical examples; and 3) consider selected legal solutions to the problems caused by ‘black box’ AI, namely: a) a res ipsa loquitur argument, that is, the principle that negligence may be inferred if certain criteria are satisfied; b) invoking section 5D(2) of the Civil Liability Act 2002 (NSW) (‘an exceptional case’); and c) a ‘no fault’ insurance scheme.

What will ultimately be demonstrated is that problems will arise in establishing breach of duty and causation due to the ‘black box’ effect. Those issues, however, are not insurmountable, and the legal solutions selected may assist patients and clinicians alike. Given the infancy of this area of law, these solutions are yet to be tested by a court and will face challenges. Further topics, such as product liability, expert evidence, and peer expert opinion, are also identified as warranting further research.


Macquarie University HDR scholarship


Table of Contents

Chapter I Introduction -- Chapter II Artificial intelligence and the black box -- Chapter III Liability for harm in scenarios involving medical AI -- Chapter IV Possible solutions -- Chapter V Conclusion

Awarding Institution

Macquarie University

Degree Type

Thesis MRes

Department, Centre or School

Macquarie Law School

Year of Award


Principal Supervisor

Rita Matulionyte


Copyright: The Author




91 pages
