
General Discussion


highplainsdem

(60,102 posts)
Thu Jan 8, 2026, 01:17 PM

OpenAI launches ChatGPT Health, encouraging users to connect their medical records (The Verge) + Bluesky reactions

https://www.theverge.com/ai-artificial-intelligence/857640/openai-launches-chatgpt-health-connect-medical-records

OpenAI launches ChatGPT Health, encouraging users to connect their medical records
But it’s ‘not intended for diagnosis or treatment.’

by Hayden Field
Jan 7, 2026, 1:00 PM CST


OpenAI has been dropping hints this week about AI’s role as a “healthcare ally” — and today, the company is announcing a product to go along with that idea: ChatGPT Health.

ChatGPT Health is a sandboxed tab within ChatGPT that's designed for users to ask their health-related questions in what OpenAI describes as a more secure and personalized environment, with a chat history and memory feature separate from the rest of ChatGPT. The company is encouraging users to connect their personal medical records and wellness apps, such as Apple Health, Peloton, MyFitnessPal, Weight Watchers, and Function, "to get more personalized, grounded responses to their questions." It suggests connecting medical records so that ChatGPT can analyze lab results, visit summaries, and clinical history; MyFitnessPal and Weight Watchers for food guidance; Apple Health for health and fitness data, including "movement, sleep, and activity patterns"; and Function for insights into lab tests.

On the medical records front, OpenAI says it’s partnered with b.well, which will provide back-end integration for users to upload their medical records, since the company works with about 2.2 million providers. For now, ChatGPT Health requires users to sign up for a waitlist to request access, as it’s starting with a beta group of early users, but the product will roll out gradually to all users regardless of subscription tier.

The company makes sure to mention in the blog post that ChatGPT Health is “not intended for diagnosis or treatment,” but it can’t fully control how people end up using AI when they leave the chat. By the company’s own admission, in underserved rural communities, users send nearly 600,000 healthcare-related messages weekly, on average, and seven in 10 healthcare conversations in ChatGPT “happen outside of normal clinic hours.” In August, physicians published a report on a case of a man being hospitalized for weeks with an 18th-century medical condition after taking ChatGPT’s alleged advice to replace salt in his diet with sodium bromide. Google’s AI Overview made headlines for weeks after its launch over dangerous advice, such as putting glue on pizza, and a recent investigation by The Guardian found that dangerous health advice has continued, with false advice for liver function tests, women’s cancer tests, and recommended diets for those with pancreatic cancer.

-snip



Ran across this late last night while catching up with Bluesky posts. First saw it mentioned by science fiction writer John Scalzi, commenting on The Verge's Bluesky post about it. Scalzi's comment: "Today in Oh Hell No"

Some comments from other people in Scalzi's thread:

Oh, look, it's the long-awaited GOP health care plan. Give up your privacy and hope that your ailment is the most frequent one matching your symptoms, then get told to use whatever quack medicine paid the highest fee to OpenAI.


Today in "desperately searching for a way to monetize this boondoggle"


Aside from the idea of 'we are taking all of your data', if you have direct personal communication from an seemly-authoritative human-sounding chatbot, people are going to take terrible advice from this whether or not the company line is 'not intended for diagnosis or treatment'.


It's the modern version of "I checked WebMD and apparently I have cancer" only this version will probably suggest you drink clorox.




EDITING to add some of the Bluesky comments on The Verge's post about this:

if it's not intended for diagnosis or treatment, it's intended to sell you things and sell healthcare product and pharma companies the opportunities to sell you things. for the umpteenth time, no x a million billion


I work in IS for a good sized health system. We've been reminded often never, ever, ever put patient data into ChatGPT or any other AI chatbot cause bad things will happen. My eye is twitching about private patient data being used to train LLM's.


We don't think the "you should kill yourself" bot needs sensitive medical data


This is the dumbest AI related crap I have seen this week, and there has been a LOT of dumb AI related crap this week already.


they aren't your actual medical provider and therefore are not bound by HIPAA in any way


this should reduce the nation's health costs by reducing the number of people who need health