ChatGPT for Health & Science Questions: Reliable or Not?

1 month ago (edited)

Do you trust ChatGPT or other AI tools for health and science questions? I've played around with it a lot, including uploading lab results. ChatGPT gives super plausible answers, but it seems like a slippery slope if you rely on it entirely. I've noticed that I'm no longer using Google as a first step. What's your take?

· 1 month ago (edited)

Great question, @simon

Same here, I usually start my research with ChatGPT or Perplexity. They’re super convenient for getting a quick overview.

When I want to go deeper, I switch to sources like examine.com, scientific papers, or peer-reviewed studies.

@heiko-bartlog actually explored this in detail in his excellent piece on how AI helped him uncover a micronutrient deficiency:

https://newzapiens.com/magazine/from-bloodwork-to-dna-to-ai-and-back-my-journey-to-solve-a-micronutrient-mystery

Curious to hear from you, @sandra-hagen and @jill-heitmann, do you use ChatGPT for scientific research or when digging into health topics? Would love to know how you approach it.

· 1 month ago

@karol I use both, but for scientific questions I usually check ALL the references (at least for the details that are relevant to me).

And indeed, in about a third of cases the references are cited incorrectly, sometimes even pointing in the opposite direction (!), or they contain no information about the topic at all.

So, my personal recommendation would be to be very careful with the LLMs and always be willing to do your own research...

· 1 month ago

If you use a good model (e.g. o3 for medical questions) and look at the sources ChatGPT is citing, I think it's perfectly fine to mostly rely on these models. Obviously it's better to check with your doctor as well (if you have a good one), but in general the responses seem to be properly grounded in science.

· 1 month ago

As with many other use cases, AI is perfectly fine at getting 80% of the job done, but good luck getting it to fill in the missing 20%.

I've tried ChatGPT for various use cases, like optimising supplement stacks, analysing lab tests, and creating a treatment plan, but it cited wrong sources way too many times. I now default to Perplexity when I'm looking for facts and to ChatGPT for brainstorming or getting a different angle on a problem. Can't speak for Claude or Gemini.

· 4 weeks ago

I wouldn't trust ChatGPT with medical research, especially anything involving medications. As a health writer, I only use the tool for general research, then turn to published expert citations for authenticity and fact-checking.

· 1 week ago

I don't completely trust ChatGPT for nutrition-related information. As a dietitian and nutritionist, I see a lot of misinformation out there that should not be taken as truth. It's best to verify information against scientific articles with sound methodology and against reliable websites, which are few and far between.

· 1 day ago

I use AI a lot — but with strict guidelines. I feed it trusted sources, a clear supplement selection framework, and client data (DNA, Omega Balance, cortisol, labs, intake forms) to create tailored, citation-backed guides. I review and check everything — and if something feels off, I dig deeper.

This way, clients get results they can feel right away while building long-term health. It cuts down on misinformation, avoids one-size-fits-all advice, and stops the guessing game for good.