In November 2023, OpenAI launched the long-awaited custom versions of ChatGPT. Custom GPTs are versions of ChatGPT that can be tailored to serve specific purposes by combining specialist instructions, extra knowledge, and task-specific skills.
One of the most exciting applications of custom GPTs is in healthcare. But while there's no doubt that artificial intelligence (AI) can bring significant benefits to patients and healthcare providers, there are also serious concerns about the safety of sensitive patient data.
As we navigate this new frontier of custom GPTs in healthcare, we need to examine the potential benefits and risks associated with these technologies, particularly regarding patient data security and privacy.
Benefits and Risks of ChatGPT in Healthcare
ChatGPT and similar GPTs have shown remarkable potential in various healthcare applications. Let's explore the key aspects of their capabilities and the challenges they present.
Admin
AI is an invaluable tool to automate certain tasks, such as scheduling appointments, managing patient data entry, and processing insurance claims—freeing up time that medical practitioners can devote to patient-facing work.
To better understand the impact of ChatGPT on medical admin, we asked Ved Raj, IT Consultant at ValueCoders, to share his thoughts.
"The integration of ChatGPT in medical communication is revolutionizing patient interactions in healthcare. This powerful AI tool allows patients to have natural conversations with an intelligent assistant 24/7.”
However, Raj warns of risks associated with AI's growing role in healthcare, particularly the potential for data loss and its ethical implications as AI becomes more integrated into medicine:
"Data privacy and accuracy remain pressing challenges, and there are major ethical implications of having an AI make medical suggestions."
Rural Healthcare
In rural areas, where healthcare resources are often scarce, AI technologies like ChatGPT have the potential to make a big impact. These tools can help bridge the gap in access to medical expertise and enhance the overall efficiency and quality of care in underserved communities.
Jed R. Hansen, Executive Director at the Nebraska Rural Health Association, elaborates on the potential of AI in rural healthcare:
"AI promises a fresh set of technologies in clinical care, revenue cycle optimization, and clinical decision augmentation that offer solutions to workforce shortages in clinical care and operations."
“Care teams that are able to readily assess and adopt appropriate AI applications will be poised to lead care in their respective communities into the foreseeable future.”
However, a 2024 research report on AI in rural healthcare development highlighted a significant challenge: the limited patient data available in rural facilities for AI training. This scarcity of diverse data can lead to AI systems that are potentially less accurate or biased when applied in rural healthcare settings.
Addiction Treatment
AI technologies like ChatGPT offer both opportunities and challenges in the field of addiction treatment. These tools can potentially streamline administrative tasks and enhance patient monitoring, but they also raise concerns about privacy and the need for human empathy in treatment.
Nick Mercadante, Founder & CEO of PursueCare, shares his perspective on AI in addiction treatment:
"Generative AI and similar tools help practices like PursueCare that treat substance use disorder by automating administrative tasks such as scheduling, certain aspects of documentation, and billing so that our staff can better focus on patient care."
However, Mercadante also notes concerns about GPT’s limitations and risks:
“AI chatbots cannot provide empathy or understand the human emotion of an individual patient, which is important in therapy for addiction. We are also concerned about bias and inequality, as AI can inherit biases from the data it is trained on, [and] about misuse of AI-driven tools that inadvertently share private health information without the patient’s consent.”
Navigating the Challenges of ChatGPT and Patient Data Privacy
While the integration of GPTs in healthcare offers significant benefits, it also raises important concerns about patient data privacy and security.
Patient Uptake
What’s the point of AI if people don’t want it? Understanding patient attitudes towards AI in healthcare is step one in its successful implementation. Sara Heath, Managing Editor at Xtelligent Healthcare Marketing, shares data on patient attitudes toward GPTs in healthcare:
"The most recent survey data suggests that American patients are split as to whether they are comfortable with the use of artificial intelligence in their healthcare experience.”
“Despite 100% of US patients' healthcare journey involving AI either directly or indirectly, only 38% say they trust it.”
She goes on to acknowledge that patients feel they have insufficient information and that there is more they need to learn about the technology.
Vulnerability of Data
There's one clear thread running through our discussion of AI in healthcare: the critical importance of protecting patient data. The integrity and security of patient data are foundational to effective healthcare delivery.
While the risks are real, Paul Ferguson, AI Consultant and Founder of Clearlead AI Consulting, argues that much of the responsibility lies in how the tools are used:
"GPTs don't inherently increase the vulnerability of patient data. The risk comes from improper use of these tools.”
“Any responsible healthcare organization should never use public GPTs like ChatGPT to process sensitive information."
HIPAA Compliance
When it comes to AI in healthcare, HIPAA compliance isn't just a buzzword; it's a make-or-break factor for any use of ChatGPT or other GPTs. It's important to note that ChatGPT in its public form is not HIPAA compliant.
Healthcare organizations must implement private, controlled versions of such technologies to ensure compliance with data protection regulations.
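For illustration, here is a minimal sketch of what a private, controlled setup can look like: requests are routed to a model hosted inside the organization's own network or private cloud rather than to the public ChatGPT service. The endpoint URL, environment variables, and model name below are hypothetical, and the sketch assumes the private deployment exposes an OpenAI-compatible API (as self-hosted servers such as vLLM commonly do).

```python
import os
from openai import OpenAI  # standard OpenAI-compatible client library

# Hypothetical private endpoint inside the organization's own network or VPC.
# Nothing here talks to the public ChatGPT service.
client = OpenAI(
    base_url=os.environ["PRIVATE_LLM_ENDPOINT"],  # e.g. "https://llm.internal.example-health.org/v1" (hypothetical)
    api_key=os.environ["PRIVATE_LLM_API_KEY"],    # issued and rotated by the organization, not a personal key
)

def summarize_note(deidentified_note: str) -> str:
    """Send an already de-identified clinical note to the privately hosted model."""
    response = client.chat.completions.create(
        model="local-clinical-model",  # hypothetical model name served by the private deployment
        messages=[
            {"role": "system", "content": "Summarize the following de-identified clinical note."},
            {"role": "user", "content": deidentified_note},
        ],
    )
    return response.choices[0].message.content
```

The same call pointed at the public ChatGPT API would be exactly the kind of use Ferguson cautions against above.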
GPT Security: Mitigating Risks in Healthcare Systems
As we integrate GPTs into healthcare, it's clear that the potential benefits can only be realized with a strong commitment to data security and patient privacy.
"It's crucial to understand that while there are potential risks associated with GPTs in healthcare, there are well-understood solutions to address these problems,” Ferguson says.
According to him, to ensure compliance and maintain data security, healthcare providers need:
- Secure, on-premises or private cloud LLM solutions
- Regular privacy impact assessments
- Transparent data processing practices
- Data anonymization before any interaction with AI systems (see the sketch after this list)
- Continuous staff training on data handling and AI use
- Regular security audits and updates
- Clear protocols for AI system use and data access
- Staying up to date with the latest regulations
- Documentation of AI system use and decision-making processes
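To make the anonymization item above concrete, here is a minimal sketch of stripping obvious identifiers from text before it ever reaches an AI system. The patterns and the redact_phi helper are illustrative assumptions, not a complete de-identification pipeline; production systems typically combine dedicated de-identification tooling with human review.

```python
import re

# Illustrative patterns only; real de-identification needs far broader coverage
# (names, addresses, medical record numbers, etc.) plus review.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def redact_phi(text: str) -> str:
    """Replace matches of known identifier patterns with labeled placeholders."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

note = "Patient DOB 04/12/1987, phone 555-867-5309, reports improved sleep."
print(redact_phi(note))
# -> "Patient DOB [DOB REDACTED], phone [PHONE REDACTED], reports improved sleep."
```

Keeping this step in front of every AI interaction means the model only ever sees placeholder tokens, which also supports the private hosting, audit, and documentation items above.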
Ferguson also points out that the integration of GPTs in healthcare is subject to rapidly evolving regulatory frameworks:
"The EU AI Act (which came into effect on August 1, 2024) imposes stringent restrictions on certain AI applications, including but not limited to Generative AI.”
“Healthcare providers integrating GPTs must navigate these new regulations carefully."
Embracing the Future of Healthcare with Confidence
Healthcare providers must implement robust security measures to protect sensitive patient information. One piece of this security strategy is a Managed Endpoint Detection and Response (EDR) solution designed specifically for healthcare environments.
Managed EDR for healthcare provides constant monitoring and response capabilities, ensuring that patient data remains protected against evolving threats. This proactive approach not only safeguards sensitive information but also builds trust with patients—paving the way for a more efficient, accurate, and innovative healthcare future.
Take the first step towards secure AI integration in your healthcare organization: book a demo to see how Huntress can help you.