ChatGPT & HIPAA: A Quick Guide on What You Need to Know

April 26, 2023
Artificial Intelligence

If you haven’t heard about ChatGPT over the last few months, you might still be Googling everything! ChatGPT launched in November 2022 and has taken the internet by storm. Developed by OpenAI, it uses artificial intelligence (AI) to hold human-like conversations and answer just about anything you ask it.

So we haven’t seen it make you dinner just yet. Still, it has successfully written working computer code, passed a series of different exams, and written entire feature articles. (Wow, I feel like a doting parent!) AI language models are transforming how we approach everyday tasks and major projects, and the healthcare industry has even jumped on board the ChatGPT train.

ChatGPT has assisted with appointment scheduling, treatment planning, patient education, medical coding, and more! While this all sounds exciting and has the potential to improve patient care, protecting your patients’ data when using these types of tools is imperative and should be approached with caution.

So what are some of the red flags to be aware of when it comes to HIPAA compliance?

• At this time, OpenAI does not sign a Business Associate Agreement (BAA). Therefore, ChatGPT is not HIPAA compliant.

HIPAA regulations require that covered entities share protected health information (PHI) only with vendors who have signed a BAA. This ensures that PHI is protected and that all parties comply with HIPAA laws and regulations. Before implementing any AI technology that processes or accesses PHI, covered entities must enter into a business associate agreement with the vendor of that technology.

• Protect PHI when using the chat platform.

OpenAI warns against inputting confidential information into the platform. As with many technology platforms, ChatGPT collects information and reviews conversations to improve its systems and services. In other words, there is no telling where that data is being stored, and therefore it cannot be protected. Because this platform is not HIPAA compliant, it’s super important to remember not to input any identifiable patient information. When working with PHI, de-identifying or anonymizing data before it ever reaches the platform is key to minimizing the risk of a data breach.
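For teams experimenting with ChatGPT programmatically, here is a minimal Python sketch of what scrubbing obvious identifiers before any text leaves your environment could look like. The patterns and the scrub_phi helper are illustrative assumptions only, not a complete de-identification method; real de-identification must follow HIPAA’s Safe Harbor or Expert Determination standards.

import re

# Hypothetical, simplified patterns -- NOT a complete HIPAA de-identification method.
PHI_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def scrub_phi(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens before text leaves your environment."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label} REMOVED]", text)
    return text

prompt = "Summarize the visit on 04/12/2023 for MRN 449301, callback 727-555-0142."
print(scrub_phi(prompt))
# -> Summarize the visit on [DATE REMOVED] for [MRN REMOVED], callback [PHONE REMOVED].

Even with scrubbing like this, the safest approach is still to keep identifiable patient information out of these tools altogether.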

• Establish access controls and monitor chat logs.

To minimize risk, access to chat logs should be restricted to those who need it as part of their job function. Don’t forget to document in writing which employees can access chat logs, and be sure to revoke access when it is no longer needed. It’s also highly recommended that these chat logs be monitored and audited to ensure they do not contain any PHI.
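To make that monitoring concrete, a small audit script could periodically scan exported chat logs and flag entries that appear to contain PHI for a compliance reviewer to check. The file name chat_logs.jsonl and the "user" and "text" fields below are assumptions for illustration; adapt them to however your organization actually exports and stores these logs.

import json
import re

# Hypothetical PHI-like patterns; in practice, reuse the same list as your scrubbing step.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def audit_chat_log(path: str) -> list:
    """Flag log entries that appear to contain PHI so a compliance reviewer can follow up."""
    findings = []
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            entry = json.loads(line)  # assumed format: one JSON object per line
            text = entry.get("text", "")
            hits = [label for label, pattern in PHI_PATTERNS.items() if pattern.search(text)]
            if hits:
                findings.append({"line": line_no, "user": entry.get("user"), "matched": hits})
    return findings

if __name__ == "__main__":
    for finding in audit_chat_log("chat_logs.jsonl"):  # hypothetical export file name
        print(f"Line {finding['line']} (user {finding['user']}): possible {', '.join(finding['matched'])}")

Flagged entries still need human review; pattern matching will miss free-text identifiers such as names, so treat a script like this as one safeguard among several, not a guarantee.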

• Establish policies and procedures and train employees.

When implementing a new technology, such as ChatGPT, that potentially accesses PHI, policies and procedures must be put in place to ensure that all appropriate safeguards support the use of that technology. Training employees on how to properly use the new technology is also super important. Training should cover security best practices, the importance of data privacy, and the steps for reporting an incident if one occurs.

• Create and implement an incident response policy.

As with any security risk, having an incident response policy is super important to help mitigate risk in the case of a breach. This plan should include procedures for identifying and mitigating the incident, notifying affected individuals, and investigating the cause to prevent future incidents.

By proactively prioritizing patient privacy and security, healthcare organizations can benefit greatly from ChatGPT and other AI language models. Streamlining administrative work and improving patient outcomes sounds like a win-win. But it’s critical that you carefully balance the increased efficiency against the elevated risks to patient data privacy. This is new for everyone, so think carefully before making drastic changes to your business just because of something ChatGPT can do. Your patients still want human experiences, and that is something ChatGPT can’t take away from you and your staff!

How can you stay up to date on the latest compliance trends and news? Contact our compliance experts at Abyde today for guidance on this ever-changing technical landscape and see how we can help you be successful in the years ahead!

To book a demo with one of our Abyde specialists, click here or call us at (800) 594-0883.