Will Users Share More Personal Health Data With ChatGPT Health?
With healthcare among the most common topics people ask ChatGPT about, OpenAI is introducing ChatGPT Health, which lets users connect medical records and wellness apps such as Apple Health, Function, and MyFitnessPal. The idea is that ChatGPT will have context such as recent test results to help users prepare for appointments with their doctor, get advice on their diet and workout routine, or even weigh the tradeoffs of different insurance options based on their healthcare patterns.
The company stressed that ChatGPT Health is not intended for diagnosis or treatment. Instead, it is designed to help users navigate everyday questions and understand patterns over time—not just moments of illness.
Obviously, users will be concerned about how their data is used. OpenAI says that “to keep your health information protected and secure, Health operates as a separate space with enhanced privacy to protect sensitive data. Conversations in Health are not used to train our foundation models. If you start a health-related conversation in ChatGPT, we’ll suggest moving into Health for these additional protections.”
In a statement, Andrew Crawford, senior counsel with the Center for Democracy & Technology’s Data and Privacy Project, stressed the lack of regulatory oversight of how data is used by companies not bound by HIPAA.
"New AI health tools offer the promise of empowering patients and promoting better health outcomes, but health data is some of the most sensitive information people can share and it must be protected,” Crawford said.
"The U.S. doesn’t have a general-purpose privacy law, and HIPAA only protects data held by certain people like healthcare providers and insurance companies. AI companies, along with the developers and companies behind health and apps, are typically not covered by HIPAA. The recent announcement by OpenAI introducing ChatGPT Health means that a number of companies not bound by HIPAA’s privacy protections will be collecting, sharing, and using peoples’ health data. And since it's up to each company to set the rules for how health data is collected, used, shared, and stored, inadequate data protections and policies can put sensitive health information in real danger.
"While OpenAI says that it won’t use information shared with ChatGPT Health in other chats, AI companies are leaning hard into personalization as a value proposition. Especially as OpenAI moves to explore advertising as a business model, it’s crucial that separation between this sort of health data and memories that ChatGPT captures from other conversations is airtight.”
In a statement, Emarketer healthcare analyst Rajiv Leventhal explained why OpenAI may have an advantage over previous attempts by large tech companies to get into this space.
“The launch of ChatGPT Health is significant since it marks OpenAI’s shift from positioning ChatGPT as a resourceful health information tool to one that encourages users to upload sensitive medical record data. Other tech giants (most notably Apple with Health Records) have developed products that link consumer health apps to personal data, but efforts have largely stalled due to fragmented access to patient records and privacy concerns. ChatGPT Health could be different because it has a built-in user base of 800 million global weekly active users, 25% of whom submit a prompt about healthcare each week. Lots of these folks use AI for quick answers to their health questions on symptoms and medical conditions—but whether they’re willing to share more personal health data will determine ChatGPT Health’s success.”
About the Author

David Raths
David Raths is a Contributing Senior Editor for Healthcare Innovation, focusing on clinical informatics, learning health systems and value-based care transformation. He has been interviewing health system CIOs and CMIOs since 2006.
Follow him on Twitter @DavidRaths
