
AI Scribes Pose Liability Risks
Before your medical practice decides to use AI scribe tools, understand the liability risks and our recommended practical risk mitigation strategies.
Artificial intelligence (AI) scribes and similar AI administrative support tools are being hailed as the way to save clinicians from “pajama time,” the practice of completing documentation and other administrative tasks outside work hours. Using ambient listening technology, AI scribes can “sit in” on encounters and other interactions with patients and, when prompted, generate drafts of clinical notes.
Some AI products are billed as all-in-one communications software that manages phone, text, and other patient-clinician or patient-health system communications. This technology can generate and analyze entire transcripts of phone or in-office conversations.
Researchers say such tools offer benefits such as:
- reduced clinician burnout due to decreased administrative burden and
- improved patient experience/engagement because clinicians focus on patients instead of charting.
In these days of staffing challenges and cumbersome regulatory requirements, practices across the U.S. are increasingly turning to these tools to save time.
Although these products may offer time-saving features, they also pose medical professional liability risks that are not always obvious to clinicians and practices. Risks arise in several main areas, including:
- Data privacy;
- Data security and retention; and
- Medical record inaccuracy.
Whether you are already using AI-powered administrative tools, currently shopping for products, or just starting to think about it, the first step toward managing liability exposure is understanding the various risks posed by this technology. In this article, we identify some potential risks and recommend practical risk mitigation strategies.
Privacy and Security Risks of AI Scribes
Recording Risks - Wiretapping
A pending lawsuit in the dental setting illustrates the risks of software that records patient interactions. Heartland, a dental support organization (DSO) that contracts with dentists nationwide to provide support services like billing, staffing, and marketing, faces a lawsuit filed by patients who claim it violated federal wiretapping laws[1] by using an AI tool to allow a third-party phone vendor to “eavesdrop” on and record incoming calls.
Plaintiffs’ Complaint includes the following allegations:
- Defendant Heartland is a DSO that contracts with over 2,800 dentists nationwide.[2]
- Heartland contracts with RingCentral, a cloud-based phone service provider, to supply phone and call center services for the dental offices Heartland serves.[3]
- Heartland used an AI product offered by RingCentral that listens to and analyzes patient phone calls in real time. It creates a transcript of the conversation, summarizes calls, conducts sentiment analysis to determine a caller’s emotional tone, and generates keywords and phrases from the call.[4]
- “Patients calling a local dental office are not informed that an unknown third-party (in this case, the provider of the telephone service, RingCentral) is listening in on the calls and analyzing them using artificial intelligence without the patients’ knowledge and consent.”[5]
- The contract between RingCentral and Heartland also allows RingCentral to use these recordings (containing patient data) to improve its AI product. Patients did not consent to RingCentral using the content of their conversations to develop or improve its products and services.[6]
Plaintiffs assert that, based on these facts, RingCentral and Heartland violated federal wiretapping laws because they intentionally used a “device” (RingCentral’s software) that allowed a non-party to the conversation (RingCentral) to eavesdrop on or “intercept” the patients’ wire, oral, or electronic communications with Heartland.[7]
Plaintiffs also claim that because the recorded conversations contained “individually identifiable health information” (IIHI), disclosure of this information to RingCentral, an unannounced third party, without patient consent violated HIPAA.[8]
Recording Risks - State Laws
In addition to federal wiretapping statutes, there are state-specific laws governing recording. Clinicians and their practices should consult legal counsel to understand and comply with the laws in their state.
Some states like Arizona[9] and Utah[10] are “one-party consent” states, meaning recording is allowed if at least one party to the conversation knows of and consents to the recording (usually the person doing the recording). Other states like Montana prohibit the use of a “hidden electronic or mechanical device” to record a conversation unless all parties to the conversation are told that recording is occurring.[11]
Regardless of state, if you are using ambient listening tools, MICA’s Risk Team recommends notifying patients that AI is “listening” and briefly explaining how the tools work. Consider posting notices in common areas and exam rooms. To mitigate risk, obtain and document individual patient consent to the use of these tools.
Data Privacy Risks – HIPAA and State Data Privacy Laws
The HIPAA Privacy Rule protects patients against improper use or disclosure of their individually identifiable health information (IIHI). In addition to HIPAA, certain state privacy laws may restrict clinicians’ use or disclosure of sensitive data.
Clinicians who use AI software may run afoul of these laws if they fail to implement necessary data use restrictions. For example, HIPAA generally prohibits use or disclosure of protected health information without patient consent for any purpose other than treatment, payment, or health care operations. As a result:
- You likely need a HIPAA-compliant business associate agreement with your AI software vendor.
- Agreements permitting AI software vendors to use patient data (contained in recordings, for example) to “improve” or “train” their products potentially violate HIPAA unless patients have consented.
Data Security
The HIPAA Security Rule requires clinicians and medical practices to implement “reasonable and appropriate” technical, administrative, and physical safeguards to protect patient data against impermissible disclosures. Platforms that host AI-powered tools contain a wealth of patient data in the form of draft clinical notes and recorded patient-clinician encounters. They often link directly to the EHR through an application programming interface (API), making them extremely attractive to cybercriminals.
To reduce the risk of data breaches and ensure HIPAA compliance, complete an updated HIPAA Security Risk Assessment when you incorporate such software into your practice, and modify your Security Risk Management Plan accordingly. For more on this topic, see MICA’s HIPAA Security Risk Assessment Checklist and Requirements.
Data Retention and Sharing Considerations
Clinicians and practices using AI-powered administrative support products that record conversations should understand whether the vendor retains these recordings and, if so, for how long. Storing or sharing some or all recordings raises several issues:
- For example, if your phone system records all calls, how are confidential conversations (such as clinicians’ conversations with defense attorneys) protected from inadvertent disclosure that could destroy the privilege?
- How is retention of recordings managed? You may prefer that no record exist of some conversations in case a lawsuit arises, yet you may also be required to retain recordings for regulatory purposes or under a litigation hold.
- Does the vendor permit the clinician to access stored recordings, particularly for use in billing disputes or the defense of lawsuits or board complaints? If so, is the retention period based on the applicable statute of limitations?
Data Privacy and Security - Risk Management Tips
Medical practices should scrutinize contracts with AI software vendors and consider retaining legal counsel to review the contracts before signing. Pay close attention to vendors’ terms of service and privacy policies. Negotiate out unacceptable/risky provisions or consider using a different vendor.
Develop and implement an AI policy that governs your practice’s use of tools like AI scribes. Consider including the following provisions in your policy to minimize data privacy and security risks:
- Requirement for HIPAA-compliant business associate agreements with vendors of AI-powered administrative tools;
- Method for notifying patients and other third parties that the practice uses AI-powered tools that “listen to” communications;
- Processes for obtaining and managing consent to the use of these tools;
- Requirement that the practice complete an updated HIPAA Security Risk Assessment when the practice implements new software; and
- Policy prohibiting clinicians and employees from using their own AI-powered software on personal phones and devices during patient encounters.
Beware of AI Hallucinations: Medical Record Accuracy
Long before AI scribes existed, dictation, transcription, and speech recognition software were causing problems with medical record accuracy. Sometimes the transcriptionist made a typing error. Other times, the software or the transcriptionist misunderstood what the dictating clinician said. These days, when AI makes an error, we call it “hallucination.”
Whatever the cause, it is always the clinician’s duty during the proofreading/electronic signature stage to catch and correct errors before they become part of the permanent medical record. Failure to do so can jeopardize future patient care or lead to adverse outcomes when the mistakes are clinically significant. For example, consider the risk posed by a pre-operative, informed consent, or H&P note that erroneously references the upcoming surgery as left-sided instead of right-sided.
In addition, documentation that is unreadable due to grammatical or spelling errors, or sentences that are just plain nonsensical, puts clinicians in a bad light with board investigators or jurors who later read it. Documentation plays an important role in the defensibility of any malpractice case or board complaint, and documentation errors weaken a clinician’s defense. Some cases with good medicine are settled because of poor documentation. Jurors and investigators may believe that documentation rife with errors indicates a lack of attention to detail, which may influence their decisions regarding whether the clinician met the standard of care.
Risk Management Tips:
- If you are using AI products to generate clinical documentation, it is essential to devote sufficient uninterrupted time to proofreading the final product for clinical accuracy, completeness, and readability.
- Consider too that licensing board disciplinary decisions frequently fault physicians for failure to document their thought process or rationale. AI scribes may not include this information unless you dictate it during the encounter, so make sure to assess whether the documentation includes this essential element.
Reap the Benefits, Mitigate Risks
AI may improve efficiency and patient engagement, offering benefits to clinicians, medical practices, and patients. To get there, clinicians must take the time to understand how the products they use work and the types of professional liability and patient safety risks those products pose. Implementing risk mitigation processes can help clinicians ensure that the risks don’t cancel out the rewards.
[1] 18 U.S.C. § 2510 et seq.
[2] Lisota v. Heartland Dental LLC et al., Case No. 25-cv-7518, U.S. District Court for the Northern District of Illinois, Eastern Division, Complaint at para. 1.
[3] Id. at paras. 2-3.
[4] Id. at para. 2.
[5] Id. at para. 3.
[6] Id. at paras. 27-29.
[7] Id. at paras. 43-51.
[8] Id. at para. 49.
[9] A.R.S. § 13-3012(9).
[10] UT Code § 77-23a-4.
[11] MT Code § 45-8-213.