ChatGPT Health: Secure Medical Conversations with Encrypted Data Protection

Key Takeaways

  • OpenAI has launched ChatGPT Health, a dedicated space for users to discuss health-related topics with the chatbot
  • The feature allows users to connect medical records and wellness apps to receive personalized advice and insights
  • ChatGPT Health is available for users with ChatGPT Free, Go, Plus, and Pro plans outside of the European Economic Area, Switzerland, and the U.K.
  • The feature has enhanced privacy and security features, including purpose-built encryption and isolation to protect sensitive health data
  • OpenAI emphasizes that ChatGPT Health is designed to support medical care, not replace it or be used as a substitute for diagnosis or treatment

Introduction to ChatGPT Health
OpenAI, a leading artificial intelligence company, has announced the launch of ChatGPT Health, a dedicated space that enables users to have conversations with the chatbot about their health. This new feature allows users to securely connect their medical records and wellness apps, including Apple Health, Function, MyFitnessPal, Weight Watchers, AllTrails, Instacart, and Peloton, to receive tailored responses, lab test insights, nutrition advice, personalized meal ideas, and suggested workout classes. As OpenAI stated, "ChatGPT Health builds on the strong privacy, security, and data controls across ChatGPT with additional, layered protections designed specifically for health — including purpose-built encryption and isolation to keep health conversations protected and compartmentalized."

Privacy and Security Features
The company has emphasized the importance of privacy and security in the development of ChatGPT Health. The feature operates in a silo with enhanced privacy protections and its own memory, safeguarding sensitive data with "purpose-built" encryption and isolation. Conversations in Health are not used to train OpenAI’s foundation models, and users who attempt to have a health-related conversation elsewhere in ChatGPT are prompted to switch over to Health for the additional protections. As OpenAI highlighted, "Health information and memories are not used to contextualize non-Health chats," and "conversations outside of Health cannot access files, conversations, or memories created within Health." Furthermore, apps can only connect to users’ health data with their explicit permission, even if they’re already connected to ChatGPT for conversations outside of Health.

Evaluation and Clinical Standards
OpenAI has evaluated the model that powers Health against clinical standards using HealthBench, a benchmark the company introduced in May 2025. As OpenAI explained, "This evaluation-driven approach helps ensure the model performs well on the tasks people actually need help with, including explaining lab results in accessible language, preparing questions for an appointment, interpreting data from wearables and wellness apps, and summarizing care instructions." The company’s commitment to clinical standards and evaluation is a crucial aspect of ChatGPT Health, as it aims to provide accurate and reliable information to users.

Limitations and Controversies
While ChatGPT Health has the potential to change the way people access health information, it is essential to note that it is not a substitute for medical care. OpenAI emphasizes that the tool is designed to support medical care, not replace it or serve as a substitute for diagnosis or treatment. However, the company’s announcement comes at a time when AI-powered health tools are facing scrutiny. An investigation by The Guardian found that Google AI Overviews were providing false and misleading health information. OpenAI and Character.AI also face several lawsuits claiming their chatbots drove users who confided in them to suicide and harmful delusions. A report published by SFGate earlier this week detailed how a 19-year-old died of a drug overdose after trusting ChatGPT for medical advice. These incidents highlight the need for caution and responsible development of AI-powered health tools.

Conclusion
ChatGPT Health is a significant development in the field of AI-powered health tools. With its enhanced privacy and security features, evaluation against clinical standards, and commitment to supporting medical care, it has the potential to provide accurate and reliable information to users. However, it is crucial to remember that ChatGPT Health is not a substitute for medical care, and users should always consult a healthcare professional for diagnosis and treatment. As OpenAI continues to develop and improve ChatGPT Health, responsible development, evaluation, and transparency will be essential to ensure the tool benefits users while minimizing potential risks.

https://thehackernews.com/2026/01/openai-launches-chatgpt-health-with.html