AI Gives Medical Advice — Survey Says...

Published by Chris Riley on February 10, 2024

Many of us rely on the internet for information. Whether we want to know how to keep succulents alive, the official Scrabble rules, or what some mysterious rash is, the web is our go-to source.

However, are we willing to put the same level of trust in artificial intelligence (AI) to give out health-related information? If we do entrust our medical and health needs to AI, can we be sure that the diagnosis and protocols dispensed are accurate?

To determine if people trust AI for medical and health advice, we surveyed 1,015 participants of various generational and racial backgrounds.

Our findings reveal what percentage of them prefer virtual assistance to in-person appointments, what concerns they had about each, and which symptoms they felt warranted contacting a medical professional.

Additionally, we used GPT-3, a large language model developed by OpenAI, to generate different types of medical and health advice, then asked medical professionals to rate the output. Read on to find out whether there is a strong future for AI-generated medical and health services.

Paging the AI-Doctor

Sixty-five percent of our survey respondents attended a virtual medical appointment within the past year.

This is likely because virtual health services allow patients to seek medical and health advice without seeing a professional in person, which became much more challenging in 2020 due to the COVID-19 pandemic.

However, 62% of people preferred in-person appointments, while 21% favored virtual. Gen Xers (26%) and millennials (21%) showed a higher preference for virtual appointments than other generations.

When asked about their preferences for consulting AI for medical and health advice, Gen Z respondents (29%) were the most interested.

Still, most participants reported moderate interest, which bodes well, since experts predict that telemedicine may have a future beyond the pandemic.

Worries When Seeking Medical Help

Although medical professionals are trained to assist patients, their authoritative position often elicits pre-appointment anxiety.

Some even worry that doctors will mistreat or ignore their needs because of racial bias.

Whether it is identity- or position-based, such stress creates an environment where patients may not feel comfortable asking questions or accepting medical professionals’ advice.

According to our survey, participants were most concerned that they would be unable to ask follow-up questions during in-person visits with a doctor.

Fifty-one percent of Black or African American respondents in our study shared this concern, compared to 35% of Asian or Pacific Islander, 30% of Hispanic or Latino, and 25% of white or Caucasian respondents.

Perhaps respondents felt this way because most doctors spend just 17–24 minutes with each patient, giving people very little one-on-one time to understand their diagnosis and protocols — let alone build a relationship.

Unfortunately, AI does not seem to be the solution to this problem: Not being able to ask follow-up questions was also the primary concern when people were asked about receiving medical advice from AI. Sixty-five percent of African American respondents expressed this concern, followed by 61% of Asian respondents.

Other top concerns for both in-person and AI visits were being given inaccurate information, receiving impersonal information (more of a worry with AI), and a lack of confidentiality.

Using the Internet for Self-Diagnosis

Some recent commentary encourages people to self-diagnose using the internet, arguing that the more it is done, the more accurate self-diagnosis becomes. However, our findings show that about one-third of people do not always check the authority of their sources, which could be problematic.

Overall, 70% of survey participants reported that they research their symptoms online when they have a health concern. Only 36% said they were “very” confident in the medical information they find online; the majority were moderately confident.

Even so, this high rate of online self-diagnosis may not put human doctors out of business quite yet. If exposed to COVID-19, most participants said they would contact a medical professional first, before seeking assistance from Dr. Web. For all other medical woes, however, the overwhelming majority would turn to an online search first.

Dr. GPT-3’s Reliability

As part of our research, we prompted GPT-3 with 13 symptoms, and it generated medical and health advice based on those ailments. From there, we brought in medical professionals, such as physicians, nurses, and general practitioners, and asked them to rate GPT-3’s advice for accuracy, comprehensiveness, effectiveness, and dangerousness, each on a 5-point scale with 5 being the highest rating.
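
For illustration, below is a minimal sketch of how a single symptom prompt might have been sent to GPT-3 using OpenAI’s legacy Completions endpoint (the openai Python package before version 1.0). The prompt template, engine choice, and sampling parameters are assumptions made for this example; the study used one identical prompt per symptom, but the exact wording is not published here.

    import openai  # legacy openai-python (pre-1.0) interface

    openai.api_key = "YOUR_API_KEY"  # placeholder: supply your own key

    def advice_for(symptom):
        # Hypothetical prompt template; the study's exact wording is not published.
        prompt = f"What should I do if I have {symptom}?"
        response = openai.Completion.create(
            engine="davinci",    # the original GPT-3 base model
            prompt=prompt,
            max_tokens=150,      # keep the generated advice reasonably short
            temperature=0.7,
        )
        return response["choices"][0]["text"].strip()

    # Example: generate advice for one of the 13 symptoms in the study.
    print(advice_for("chills and/or fever"))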

When it came to prescribing advice for the chills and/or fever, AI said:

“If the temperature goes above 104°F (40°C), you need to be seen immediately. This is a medical emergency because there is danger of brain damage if the temperature stays high for more than twenty-four hours.”

When scoring this advice, health professionals found it incorrect and misleading. But the most dangerous and least effective advice actually concerned COVID-19, which received one of the highest harmfulness ratings on the 1-to-5 scale. When asked the best treatment for COVID-19, AI responded:

“See a doctor immediately. Make sure you tell your doctor about any medication you are taking, and especially if you take any other erectile dysfunction drugs such as Viagra, Cialis, Levitra or Stendra. These drugs work in a similar way to Covid-19 symptoms but may cause serious side effects when taken with Covid-19 symptoms.”

The first half of the advice directly contradicted medical professionals, and the second half of the advice was completely incorrect.

According to health professionals, AI’s advice for gas or acid reflux was the most accurate. In response to gas or acid reflux, AI prescribed:

“Keep the head of your bed elevated at least six inches (15 cm) by using a wedge or other elevating device. Do not eat spicy foods or large meals late in the evening, as these will aggravate symptoms.”

One of AI’s notably strange responses was for stomachaches, which earned it poor reviews from medical professionals. For treating a stomachache, AI recommended:

“Take an antacid, do not eat for a few hours, drink lots of water and go to bed. In a world where everyone is so quick to give medical advice, it’s worth noting that sometimes the best medicine is to not say anything at all.”

The oddly phrased response was unhelpful advice that probably should not be given to anyone experiencing pain or symptoms of illness.

Dr. GPT-3 was most accurate (4.04), effective (3.73), and comprehensive (3.52) when doling out advice about gas and acid reflux. The advice given for this condition was also rated least dangerous. After gas and acid reflux, GPT-3’s next most accurate (3.90) and effective (3.64) advice was for itchy eyes and throat, and its next most comprehensive information (3.51) was for acne.

Overall, according to medical professionals, Dr. GPT-3 was the least reliable when it gave medical advice for COVID-19 symptoms. Advice for this condition rated the highest for danger (3.28) and the lowest for accuracy (1.92). Consequently, it is a good thing that participants in our survey claimed they would consult a medical professional before the internet after possible COVID-19 exposure.
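
For readers curious how figures like these are derived, here is a minimal sketch of computing mean scores per symptom and rating dimension from 1-to-5 ratings. The column names and sample values are hypothetical, since the underlying rating data are not published with this article; as noted in the methodology, the reported figures are simple means with no statistical testing.

    import pandas as pd

    # Hypothetical long-format ratings: one row per professional, symptom, and dimension.
    ratings = pd.DataFrame({
        "symptom":   ["gas or acid reflux", "gas or acid reflux", "COVID-19", "COVID-19"],
        "dimension": ["accuracy", "danger", "accuracy", "danger"],
        "score":     [4, 1, 2, 3],
    })

    # Mean score for each symptom/dimension pair (means only, no statistical testing),
    # mirroring figures such as 4.04 accuracy for gas or acid reflux.
    means = ratings.groupby(["symptom", "dimension"])["score"].mean().unstack()
    print(means.round(2))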

Real Pros React to GPT-3 Medical Advice

After we fed the GPT-3 symptoms and generated medical and health advice, we ran its output by actual medical professionals to determine accuracy, comprehensiveness, and effectiveness. While GPT-3 performed well in some areas, it seems AI technology may need a few more years of medical school before it can practice medicine.

Below is what some of the medical professionals in our study had to say about the medical advice GPT-3 provided. As you can see, the advice GPT-3 shared when prompted with COVID-19 symptoms was “baseless” and “dangerous,” but there were upsides to the advice it gave for symptoms such as abdominal pain and itchy throat and eyes.

Is There a Future for AI Medical Assisting?

There is a future in using AI technology to diagnose and care for patients.

In some cases, telemedicine removes barriers for medical professionals and those they care for. However, it seems that GPT-3 remains in the early stages of doling out accurate, effective, and comprehensive medical advice.

Still, most people are willing to put a moderate level of trust in AI technology when it comes to receiving medical advice, which suggests there is a growing market for AI medical assistance.

Methodology and Limitations

For our consumer survey, we collected 1,015 responses from Americans via Amazon Mechanical Turk.

Forty-nine percent of our participants identified as men, about 51% identified as women, and less than 1% identified as nonbinary or nonconforming.

Participants ranged in age from 19 to 80 with a mean of 40 and a standard deviation of 12.1. Those who failed an attention-check question were disqualified.

The sample size for Asians and Pacific Islanders was 103 people; for Blacks and African Americans, 99 people; and for Hispanics and Latinos, 60 people. It is possible that with more of these participants, we could have gained more accurate insight into these populations.

The data we are presenting rely on self-report.

There are many issues with self-reported data. These issues include, but are not limited to, the following: selective memory, telescoping, attribution, and exaggeration.

No statistical testing or weighting was performed, so the claims listed above are based on means alone. As such, this content is purely exploratory, and future research should approach this topic in a more rigorous way.

For our survey of 77 health care professionals, 32% reported having completed some time at college or held an associate degree, 31% held a bachelor’s degree, and 37% held a master’s or doctorate degree.

Occupations included a range of job titles from general practitioners and clinicians to home health aides and registered nurses.

GPT-3 was prompted to produce output for each symptom using identical prompts.

This output was lightly edited for length and repetition, but it was not edited for content, fact-checked, or corrected for grammar. The findings in this article are limited by small sample sizes and are for exploratory purposes only; future research on the capabilities of AI should approach this topic in a more rigorous way.

Fair Use Statement

Would you like to debate the topic of using AI for medical purposes with friends and family? We encourage you to share the results of this study for any noncommercial use. Please be sure to link back to this page so that readers have full access to our methodology and findings and so contributors are credited for their work.


How we built this article:

  • Content Process
At USARx.com, we are committed to delivering trustworthy, accessible, and precise information, enabling you to manage your health effectively. Our rigorous Editorial Process guarantees that we provide the highest quality information possible.

Developed by the USARx.com team, our Editorial Process serves as the foundation of all our endeavors. What exactly does this process entail? How do we ensure that our publications meet our exacting standards?

Every piece of content we produce is meticulously crafted and edited based on the four core pillars of our editorial philosophy: (1) building and sustaining trust; (2) upholding the highest journalistic standards; (3) prioritizing accuracy, empathy, and inclusivity; and (4) continuously monitoring and updating our content. These principles ensure that you consistently receive timely, evidence-based information.


    1. Building and Sustaining Trust: Navigating health information can often be overwhelming and confusing. At USARx.com, we aim to transform that experience by providing content that is not only trustworthy and accurate but also clear, understandable, and actionable. Our content addresses overall well-being and assists you in making crucial connections between health practices and lifestyle choices, a concept we refer to as “whole person health.” We cover a diverse array of topics and perspectives openly and objectively.


    2. Upholding the Highest Journalistic Standards: You rely on USARx.com for precise and factual health information, placing a significant responsibility on us to maintain high journalistic standards. Our content is unbiased, balanced, timely, actionable, and grounded in research. We select our contributors carefully, ensure they are trained in research best practices, and regularly review their work to provide ongoing feedback and coaching.


    3. Prioritizing Accuracy, Empathy, and Inclusivity: Our commitment is to make sure all content we publish is easy to understand and accessible to everyone. We use a proprietary style guide based on the Associated Press Stylebook to ensure clarity, empathy, inclusivity, and practical application in our writing. We strive to use language that is warm and inviting yet bold and progressive, promoting inclusivity and compassion without bias.


    4. Continuous Monitoring and Updating of Content: Health information evolves quickly with new research findings, changes in medical standards, and shifts in terminology. At USARx.com, we are dedicated to continuously updating our content to reflect the most current and accurate information. Our Medical Integrity team plays a crucial role in keeping our content up to date with the latest standards of care and medical practices.


Our content carries various dates indicating when it was written, medically reviewed, fact-checked, and last updated, reflecting our commitment to accuracy and reliability. We encourage feedback and take immediate action to correct any inaccuracies or outdated information, ensuring our content remains relevant and valuable to our readers.

At USARx.com, you are our primary focus. Our Editorial Process is designed with your health journey in mind, aiming to be your trusted ally in achieving and maintaining optimal health. We invite your feedback to continually enhance the quality and relevance of our content.

We are committed to providing our readers with only trusted resources and science-based studies with regard to medication and health information.

Disclaimer: This general information is not intended to diagnose any medical condition or to replace your healthcare professional. If you suspect medical problems or need medical help or advice, please talk with your healthcare professional.
