Baylor Scott & White Adopts AI Ambient Listening Amid Privacy Concerns
Baylor Scott & White Health, Texas's largest not-for-profit healthcare system, has implemented ambient listening technology in its facilities, sparking unease among patients and families about potential privacy risks. Several LTC News readers have brought this program to our attention, expressing their concern or surprise about its implementation. The technology uses artificial intelligence (AI) to record and transcribe conversations between physicians and patients, automatically generating clinical documentation.
While promoted as a tool to reduce administrative burdens and enhance care, critics argue that such recording practices could compromise sensitive patient information and raise serious questions about data security and confidentiality.
The initiative comes as healthcare systems nationwide grapple with increasing demands on clinicians' time and the need to improve patient experiences. By automating the documentation process, ambient listening is intended to free physicians to focus more on direct patient care, potentially leading to improved outcomes and satisfaction.
Julie Smith, a spokesperson for Baylor Scott & White Health, says the technology allows physicians to give patients more face-to-face attention during appointments.
We are one of many health systems across the country piloting the use of ambient listening tools, which are designed to provide patients with more face-to-face attention during clinic visits.
Through the technology, conversations are summarized in the electronic health record, reviewed for accuracy by the provider, and fall under the same privacy protections as other patient information.
We have so far received positive feedback from patients and providers. Signage about use of the technology is posted in all rooms in which it is utilized.
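In practical terms, the workflow Smith describes follows a simple pattern: the visit is transcribed, a draft note is generated, and nothing is filed to the chart until the provider signs off. The sketch below is purely illustrative; the data structure, function names, and approval gate are assumptions made for the sake of example, not details confirmed by Baylor Scott & White Health or any specific vendor.

```python
# Illustrative sketch only. The names below are hypothetical and do not
# describe any vendor's product or Baylor Scott & White Health's actual system.
from dataclasses import dataclass


@dataclass
class DraftNote:
    transcript: str         # speech-to-text output of the visit
    summary: str            # AI-generated draft of the clinical note
    approved: bool = False  # set only after the provider reviews the draft


def file_to_ehr(note: DraftNote) -> str:
    """File a note in the electronic health record only after provider review."""
    if not note.approved:
        raise PermissionError("Draft notes require provider sign-off before filing.")
    return f"Filed note ({len(note.summary)} characters) under standard privacy protections."


# Example: the provider reviews and approves the draft before it is filed.
draft = DraftNote(
    transcript="Patient reports improved sleep since the last visit...",
    summary="Follow-up visit. Sleep improved; continue current plan.",
)
draft.approved = True  # the review step the spokesperson describes
print(file_to_ehr(draft))
```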
Potential Benefits and Privacy Concerns
Proponents of ambient listening technology highlight several potential benefits, particularly for patients with complex medical histories. In these cases, detailed and accurate documentation is crucial for ensuring continuity of care and effective treatment planning.
A staff member at Baylor Scott & White Health, who spoke on the condition of anonymity because they were not authorized to discuss internal policies, said the system has several benefits.
When a patient with several significant health issues is discussing their concerns with their doctor, it's essential to capture all the nuances of that conversation. This technology can help ensure that no critical details are missed, leading to more informed decision-making.
Moreover, ambient listening could potentially improve patient-physician communication by allowing doctors to maintain eye contact and engage more fully in the conversation rather than spending time typing on a computer. This could lead to increased patient trust and satisfaction.
However, the use of ambient listening technology also raises privacy concerns. Opponents of the program worry about the potential for off-the-cuff remarks to be taken out of context and included in medical records. They also express concerns about the security of patient data and the potential for unauthorized access.
A person familiar with the implementation of the ambient listening system, who requested anonymity due to the sensitivity of the topic, said the technology has raised questions among staff about patient consent and data security.
It's crucial that healthcare systems using this technology have robust safeguards in place to protect patient privacy. I hope this has been considered. Patients need to be fully informed about how their conversations are being recorded, stored, and used.
Baylor Scott & White Health acknowledges these concerns and emphasizes its commitment to patient privacy, reiterating that conversation summaries are reviewed for accuracy by the provider and fall under the same privacy protections as other patient information.
Sources have told LTC News that Baylor Scott & White Health has implemented strict security measures to protect patient data and is committed to transparency in its use of the technology.
Addressing the Challenges of Implementation
In addition to privacy concerns, other challenges are associated with implementing ambient listening technology. One challenge is ensuring the accuracy of the AI transcription. While the technology has made significant strides in recent years, it is not yet perfect. There is a risk that the AI may misinterpret or mis-transcribe certain words or phrases, leading to inaccuracies in the medical record.
Another source familiar with the program at Baylor Scott & White Health, who requested anonymity, told LTC News that it's important for physicians to carefully review the AI-generated summaries to ensure their accuracy.
This technology is a tool to assist physicians, not replace them.
Another challenge is integrating the technology into existing electronic health record systems. This requires careful planning and coordination to ensure a seamless workflow for clinicians.
Despite these challenges, Baylor Scott & White Health remains optimistic about the potential of ambient listening technology to improve patient care and reduce administrative burdens. The health system plans to continue piloting the technology and gathering feedback from patients and providers to inform its future use.
Medical Records Issues
One of the primary concerns surrounding the use of AI ambient listening in healthcare settings is the potential for misinterpretation and misrepresentation of patient-physician interactions. Natural human conversation is filled with nuances, including jokes, sarcasm, and off-the-cuff remarks. AI, while increasingly sophisticated, may struggle to accurately interpret these nuances, potentially leading to comments being taken out of context and inappropriately included in medical records.
For example, a patient might jokingly say to their doctor, "I'm so old, I'm forgetting my keys and where I am, heck, I'm just falling apart." While this is clearly a lighthearted comment, the AI could transcribe it as a statement of serious physical decline and memory concerns. Another doctor, unfamiliar with the patient, might then view this person differently because of how the AI interpreted the remark.
This problem isn't entirely new. Even without AI, doctors can sometimes misinterpret a patient's comment, taking a joke or lighthearted remark as a serious symptom or concern. This can lead to inaccurate information being recorded in the medical record. Such misinformation can have significant consequences, especially when a patient may see a doctor unfamiliar with them, perhaps in an emergency situation.
These issues also impact medical underwriting for insurance products like Long-Term Care Insurance and life insurance. Insurers rely on accurate medical records to assess risk and determine premiums. If a medical record contains misinterpreted information, it could lead to an unfair assessment of an individual's health and potentially result in higher premiums or even denial of coverage.
Moreover, the inclusion of such out-of-context remarks in medical records could have unintended consequences for patients. A misinterpreted comment could lead to misunderstandings, biases, or even discrimination.
To mitigate these risks, experts say healthcare systems using ambient listening technology must implement robust safeguards. These safeguards should include:
- Human review of AI-generated transcripts: Physicians should carefully review the transcripts generated by the AI to ensure accuracy and context (a simple illustration of this review step follows the list).
- Clear guidelines for what should be included in medical records: There should be clear guidelines for what types of information should be included in medical records, and off-the-cuff remarks or jokes should generally be excluded.
- Patient education and consent: Patients should be fully informed about how their conversations are being recorded and used, and they should have the opportunity to review and correct any inaccuracies in the transcripts.
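To make the first safeguard concrete, the sketch below shows one simple way a review step might flag conversational asides, like the joking remark in the earlier example, for a clinician's attention before anything is saved to the record. It is a minimal sketch under assumed rules; real systems would use far more sophisticated methods, and nothing here reflects how Baylor Scott & White Health's pilot actually works.

```python
# Minimal illustration of a human-review safeguard. The flagging rules are
# assumptions made for demonstration, not any deployed system's actual logic.
CASUAL_MARKERS = ("just kidding", "heck", "falling apart", "haha")


def flag_for_review(sentences: list[str]) -> list[tuple[str, bool]]:
    """Mark sentences that read like off-the-cuff remarks so a clinician can
    decide whether they belong in the medical record."""
    flagged = []
    for sentence in sentences:
        is_casual = any(marker in sentence.lower() for marker in CASUAL_MARKERS)
        flagged.append((sentence, is_casual))
    return flagged


transcript = [
    "Blood pressure has been stable since we adjusted the medication.",
    "I'm so old, I'm forgetting my keys and where I am, heck, I'm just falling apart.",
]
for sentence, needs_review in flag_for_review(transcript):
    status = "REVIEW BEFORE FILING" if needs_review else "ok"
    print(f"[{status}] {sentence}")
```

The point of the example is the gate itself: flagged sentences reach the chart only if a human decides they belong there.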
By addressing these concerns and implementing appropriate safeguards, healthcare systems can minimize the risk of misinterpretation and ensure that ambient listening technology is used responsibly and ethically.
Older Adults
The use of AI ambient listening technology presents unique considerations when applied to older patients and those in long-term care facilities. This population often experiences a range of cognitive and physical changes that can impact communication and interaction. For instance, older adults may struggle to articulate their thoughts or experience memory lapses during conversations.
These factors raise concerns about the accuracy and reliability of AI-generated transcripts in these settings. The AI may struggle with atypical speech patterns or misinterpret fragmented sentences and unclear pronunciation, potentially leading to inaccurate or incomplete medical records.
Tech's Goal to Improve Patient Care
As healthcare systems continue to explore innovative ways to improve patient care and efficiency, many experts see ambient listening technology as a promising development. However, they caution that it must be implemented responsibly, with careful consideration of privacy concerns and potential challenges.
By prioritizing patient privacy, ensuring accuracy, and addressing implementation challenges, healthcare systems can harness the power of AI to enhance patient care and create a more efficient and effective healthcare system.
If you have a news tip to share with LTC News, email us at newsroom@ltcnews.com