We often receive inquiries about the legality of recording interactions between providers and patients. Most of these questions concern a patient or a patient’s family member secretly recording an encounter with the provider and how to prevent it. However, as Artificial Intelligence (AI) documentation software becomes more popular in healthcare, questions about the potential risks of its use are increasing as well. This article addresses, as succinctly as possible, the primary issues raised by both types of recordings.
Let’s begin with the scenario of the patient or patient’s family member recording the encounter. The legality of a surreptitious recording depends on the laws of the state where the recording occurred. Specifically, it is dependent on whether the state is a “one-party consent” state or a “two-party consent” state.
Most states are one-party consent states. As the name implies, only one party to the conversation must consent to the recording in these jurisdictions. If the patient (or family member) is a party to the conversation, they may record it without having to tell the other party (the provider). Since it is legal to make the recording, it likely would be admissible in court. A two-party consent state, on the other hand, requires that both parties to the conversation expressly consent to the recording.
A recording of the encounter can benefit the patient or caregiver by aiding recollection of the discussed treatment plan, thereby possibly improving adherence. However, a recording also carries risks in the event of an unfavorable outcome and/or claim. For example, if a provider makes a misstatement, error, or unprofessional comment, it will be captured and will likely become “Exhibit A” at a subsequent malpractice trial against the provider.
Another concern is that recordings can be manipulated or edited and thus rendered inaccurate. If the patient or family member makes the recording, only they have access to this “record of the encounter,” and the physician/provider has no copy of their own with which to disprove any alteration or edit. There are also potential privacy issues if the patient’s recording secretly captures the protected health information of other patients.
If the provider or practice decides not to permit audio or video recordings, surreptitious or otherwise, it should adopt a formal written policy stating that position. Patients and visitors should be notified that audio, video, and/or digital recordings of any type are prohibited on the premises. Notification can include posting a notice on the practice website, adding signage in the reception/waiting area and other prominent areas of the practice, and obtaining a written acknowledgement, placed in the initial patient paperwork, that patients understand the policy. As an alternative to recording, patients and caregivers should be encouraged to take notes on the treatment plan and provider instructions. Be aware that even if all of these steps are taken, they may not deter the patient or family from making a recording or prevent such a recording from being admissible in court. In other words, whether these efforts are sufficient will not be known until a court has ruled on them in an individual case.
The best practices for avoiding adverse consequences from a surreptitious recording are to: always speak and act in a professional manner; remain calm and composed; avoid inappropriate statements or “jokes”; and ensure that your verbal statements are consistent with your EHR documentation.
The second scenario we will address in this article is the use of AI or similar ambient listening technologies that perform or assist in the performance of clinical documentation. While AI is in use in other areas of healthcare, we are limiting our discussion to its use in recording patient encounters for purposes of documentation.
As the U.S. Food and Drug Administration (FDA) observed in late 2024, for a generative AI-enabled product “that may be meant to summarize a patient’s interaction with a health care professional, the possibility of that product hallucinating (‘false or misleading outputs’) can present the difference between summarizing a health care professional’s discussion with a patient and avoiding a new diagnosis that was not raised during the interaction.”[1]
It is the duty of the physician or provider to document the encounter accurately, completely, and in a timely manner. We are frequently asked, “What about a disclaimer stating that the medical record was prepared using AI and, therefore, may contain errors?” Courts frown upon disclaimers or statements intended to limit liability for professional negligence and generally do not uphold them. We do not recommend the use of such disclaimers. If you would like more information, please contact our Risk Education department or an SVMIC Claims Attorney.
Another consideration is that it might be inappropriate to record some encounters such as those involving domestic violence, drug use, or illegal activity.
Additionally, the law requires that medical documentation be preserved for a specified period of time. This raises the question of whether the jurisdiction where the provider practices requires retention of both the documentation transcribed into the EHR and the recording itself.
Finally, there may be issues relating to privacy and/or HIPAA. Patients may want to know who can access the recording or AI-generated summary and how identifiable they are in it. Although most AI vendors advertise their programs as HIPAA compliant, that claim should still be evaluated before use.
Because the use of AI ambient listening software has only recently become widespread in healthcare, there are currently few laws, rules, or regulations to provide guidance on these or other potential issues. We recommend that any time a provider records a patient encounter, especially with AI ambient listening, the patient be notified and written consent obtained.
If you would like to discuss this topic in greater detail, please contact our Risk Education Department.
[1] Executive summary for the Digital Health Advisory Committee Meeting: total product lifecycle considerations for generative AI-enabled devices, November 20-21, 2024. US Food and Drug Administration; 2024. Accessed January 9, 2025. http://www.fda.gov/media/182871/download
The contents of The Sentinel are intended for educational/informational purposes only and do not constitute legal advice. Policyholders are urged to consult with their personal attorney for legal advice, as specific legal requirements may vary from state to state and/or change over time.