As AI shows up in doctors' offices, most patients are giving permission, even as experts advise caution


Artificial intelligence has been used “behind the scenes” in health care for decades, but with the growing popularity of new technologies such as ChatGPT, it’s now playing a bigger role in patient care — including during routine doctor’s visits.

Physicians may rely on AI to record conversations, manage documentation and create personalized treatment plans. That raises the question of whether they must first get patients’ permission to use the technology during appointments.

“While regulations may vary by jurisdiction, obtaining informed consent for using AI is often considered best practice and aligns with the principles of medical ethics,” Dr. Harvey Castro, a Dallas, Texas-based board-certified emergency medicine physician and national speaker on artificial intelligence in health care, told Fox News Digital. 

“It ensures transparency and respects patient autonomy,” he added. 

“Regulatory bodies and health care institutions may provide specific guidelines.”

As more physicians rely on AI to record conversations, manage documentation and create personalized treatment plans, should they get patients’ permission to do so? The debate is under way. (Augmedix)

Augmedix, a medical technology company in San Francisco, offers solutions that allow doctors to capture documentation using ambient AI technology.

“We repurpose the conversation that occurs between a doctor and a patient, and use that as the basis for creating a medical note, which is required for every patient visit,” CEO Manny Krakaris said in an interview with Fox News Digital.

Manual documentation by physicians, on the other hand, can consume up to a third of their day, Krakaris said. 

“That’s a lot of wasted time spent on administrative tasks, which could be applied to spending more time with their patients and interacting with them on a very human level,” Krakaris said.

The AI technology can also help reduce physician burnout, Krakaris noted, as it can allow doctors to reduce their workload and spend more time with their families.

Manny Krakaris is CEO of Augmedix, a San Francisco medical technology company that offers solutions for doctors to create documentation using ambient AI technology. He said AI tech can help reduce physician burnout. (Manny Krakaris/Augmedix)

Surveys of Augmedix’s clients show that patients generally feel a greater sense of satisfaction when the doctor pays full attention and listens to them during the visit, rather than being distracted by a computer screen.

In terms of HIPAA compliance with AI-generated documentation, things can get a little murky.

“HIPAA does not specifically require patient consent for the use of AI — artificial intelligence wasn’t even a term when HIPAA was created, so it has some catching up to do,” Krakaris said.

AI-generated documentation is permissible under HIPAA, he said, “as long as the intent is to use it to contribute to generalized knowledge — and that’s typically how this is used.”

There are also individual state laws that govern patient privacy, Krakaris noted, and physicians must adhere to those whether they’re using AI or not.

Among Augmedix’s clients, obtaining AI consent is generally part of the patient intake process, Krakaris said.

“It will vary from one enterprise to another in terms of how that is done,” he said. 

When physicians do their own documentation manually, it can consume up to a third of their day, Krakaris pointed out.  (iStock)

Some practices require only verbal consent to use ambient technology to help generate the medical note, while others require written consent.

Overall, most patients are open to the use of AI in the doctor’s office, with the typical opt-in rate across all of Augmedix’s customers averaging about 99%, the company said. 

“So there hasn’t been any kind of widespread hesitation on the part of patients to use AI or to take advantage of this technology,” Krakaris said.

Some patients, however, may have concerns about privacy, data security or the impersonal nature of AI, Castro pointed out. 

“Ethical considerations, mistrust of technology or cultural beliefs may also deter consent,” he told Fox News Digital. 

“It’s essential to address these concerns with empathy and integrity, and I always ensure that patients understand that I do not violate HIPAA laws,” he added. 

Dr. Harvey Castro is a Dallas, Texas-based board-certified emergency medicine physician and national speaker on artificial intelligence in health care. He said people of different ages often respond to technology differently, and he stressed the need for “patient-centered care.” (Dr. Harvey Castro)

People in different age groups often respond to technology differently, Castro said.

“Education is vital to addressing concerns about AI,” he said. “It’s realistic and ethically responsible for physicians or health care staff to provide clear explanations and education about AI’s role in care.”

“This fosters trust and empowers patients to make informed decisions.”

In Krakaris’ view, physicians are best positioned to explain the technology, since its use happens during the encounter with patients.

“That’s the perfect time to do it — at the point of care,” he said.

Each of Augmedix’s clients also gets a one-page laminated description that fully explains what the AI does and how patient data is protected.

“After [patients] have a chance to review that, they’re asked whether they opt in or not,” Krakaris said.

There are risks associated with “blindly relying” on large language models to summarize the doctor-patient experience, Krakaris said, especially given the current shortage of health care providers.

Based on surveys from Augmedix’s clients, patients generally feel a greater sense of satisfaction when the doctor pays full attention and listens to them during the visit. (iStock)

“The large language models are prone to errors — it’s been widely documented,” he said. 

“And so you need to provide guardrails to ensure that those errors are removed from the final medical note,” he added.

In Krakaris’ view, that guardrail is human judgment. 

“A human expert needs to apply their expertise to that final product,” he said. “The technology isn’t nearly good enough today to be able to do that.”

As the integration of AI in health care continues, Castro stressed the need for a commitment to “ethics, integrity and patient-centered care.”

“Emphasizing transparency, informed consent and education will ensure that AI can enhance, not replace, the human touch in medicine.”
