Don’t Replace Your Pharmacist (or Any Other HCP) With ChatGPT

It seems almost like magic, doesn’t it? A computer program or app like ChatGPT can answer all of your questions and even tell you where it found the information. You could ask for the fastest way to get from Montreal to Los Angeles. You could request some chocolate cake recipes. If there’s a mistake somewhere, it’s not likely to affect your life too much. But asking ChatGPT or any other artificial intelligence (AI) program or app about your health could have very serious consequences. Too much baking powder in your cake isn’t nearly as serious as two drugs interacting with each other in ways they shouldn’t.

So, can you use ChatGPT? Should you? It depends on what you’re looking for. And if you are looking for something related to your health, you must verify everything. Or as Karin says, “Trust but verify!”

Looking for health info using ChatGPT
Photo by Beth Macdonald on Unsplash

ChatGPT fails when asked questions about drugs

Since ChatGPT came on the scene, researchers have been testing it for accuracy. Earlier this month, researchers from Long Island University presented a study that evaluated how accurate the free version of ChatGPT was when asked questions about certain medications.

The result? Almost 75% of the time, ChatGPT’s responses were wrong. And if that’s not bad enough, the AI program tried to back up its answers with references that don’t exist. According to a press release about the study, “When asked to cite references, the artificial intelligence program also generated fake citations to support some responses.”

In the AI world, this kind of fabrication is called a “hallucination.” But the average person who reads a response with references won’t usually double-check them – they’ll usually believe them because they look and sound authentic.

The questions were not complicated

One of the questions was about Paxlovid. It’s an antiviral medication given to people who have COVID-19 and are at high risk of developing serious illness and complications. There are several reasons why someone may be at higher risk for severe COVID, such as having high blood pressure (hypertension). One common medication used to treat high blood pressure is verapamil.

When the researchers asked ChatGPT if there were any problems with taking verapamil and Paxlovid together, the answer was no – no problem. But that was wrong. “In reality, these medications have the potential to interact with one another, and combined use may result in excessive lowering of blood pressure…Without knowledge of this interaction, a patient may suffer from an unwanted and preventable side effect,” the researchers wrote. Low blood pressure can result in fainting/passing out. 

My guess is that most people who use ChatGPT casually use the free version. We wanted to see whether a paid version would make a difference (I have 3.5). I asked it: “Are there any drug interactions with Paxlovid?”

This was the response I received:

I don’t have real-time information, and my knowledge is based on information available up to January 2022. At that time, Paxlovid (nirmatrelvir/ritonavir) was an investigational antiviral medication used to treat COVID-19. It’s crucial to note that drug interactions can be complex and may change over time as new information becomes available.

For the most accurate and up-to-date information on drug interactions with Paxlovid or any other medication, it’s essential to consult with a healthcare professional or pharmacist. They can provide personalized advice based on your medical history, current medications, and specific health conditions.

If you have questions or concerns about a particular medication, including Paxlovid, contact your healthcare provider for the most reliable and current information.

I think this is promising: it didn’t tell us any falsehoods, and it acknowledged that this is a question for the professionals.

Other studies targeting ChatGPT

Other studies have also looked at AI accuracy in answering medically related questions. One, published in August, looked at ChatGPT 3.5’s responses to questions about cancer treatments. As with the medication questions, there were many incorrect answers and made-up references.

In yet another study that asked broader questions, the authors wrote, “This study found that more than two-thirds of the references provided by ChatGPT 3 to a diverse set of medical questions were fabricated, although most seemed deceptively real. Moreover, domain experts identified major factual errors in a quarter of the responses. These findings are alarming, given that trustworthiness is a pillar of scientific communication.”

So how do you get medical information?

We need information to help ourselves, but we can’t always ask the professionals. It can be hard enough to get an appointment for something serious, let alone for something you may be able to handle at home. But you need to know where to look and how to evaluate whether the information is accurate.

I wrote about this in a blog post earlier this year, “Is Dr. Google Reliable?” In that post, you’ll find information on what types of sites to look at and how to evaluate whether a site is trying to be helpful or trying to sell you something. But your best bet is a real person whenever possible. Have a medication question? Your pharmacist is your best resource. I’d even say that pharmacists are the most accessible front-line healthcare professionals. Don’t rely wholly on ChatGPT or any other form of AI. You can ask about or look up your symptoms, but don’t rely on the computer’s feedback for a diagnosis and treatment. Be open with your healthcare provider and mention that you looked up your symptoms. Don’t present the AI’s answer as your diagnosis; share what you learned instead.

Every person’s body and medical history is different. You deserve answers that help you, not answers that can hurt you.

 

Disclaimer

The information in this blog is provided as an information and educational resource only. It is not to be used or relied upon for diagnostic or treatment purposes.

The blog does not represent or guarantee that its information is applicable to a specific patient’s care or treatment. The educational content in this blog is not to be interpreted as medical advice from any of the authors or contributors. It is not to be used as a substitute for treatment or advice from a practicing physician or other healthcare professional.