April 10, 2024

It’s hard not to marvel at the updates in technology. Maybe it’s not exactly what we expected from The Jetsons, but it’s pretty close, especially with the recent push of Artificial Intelligence over the past two years. Artificial Intelligence, more commonly known as AI, is technology that simulates human behavior and capabilities. AI has become much more accessible to the public and has transformed how we work. One of the most common AI platforms is ChatGPT, a generative AI tool that can write almost anything in seconds, and it definitely helps in the medical field. For example, ChatGPT can help with scheduling appointments, treatment plan assistance, patient education, and medical coding.

But here’s the thing: with all this amazing AI tech floating around, we’ve got to make sure it’s used in compliance with HIPAA. We put together everything you need to know about using ChatGPT in a HIPAA-compliant way here!

While more AI tools are revolutionizing healthcare, this raises a crucial question: how do we stay HIPAA compliant? Well, look no further! We’re blasting off into the future and giving you everything you need to know when it comes to AI in healthcare.

AI Companies + BAAs = BFFs

These new healthcare AI companies fall under Business Associates (BAs) if they have access to your patients’ Protected Health Information (PHI). With every BA, it’s required to have a Business Associate Agreement (BAA). BAAs are documents that establish the working relationship between a Covered Entity (CE) and a Business Associate, describing each party’s responsibilities when it comes to protecting patients’ sensitive information. By signing this agreement, a BA takes on shared responsibility for protecting PHI. However, not all AI companies are willing to jump on the BAA bandwagon. For instance, OpenAI currently does not sign BAAs for ChatGPT, so sharing ePHI with it would not be HIPAA compliant.
However, some tech giants are willing to sign BAAs for their AI platforms. For instance, Google has made strides in healthcare AI tools and has a process for entering into a BAA for certain services.

Give it a Double Take

While AI can level up your practice, keep a watchful eye on the information AI is producing. We are still in the infancy stage of AI in healthcare, and it’s bound to make mistakes. Here’s your fun fact for the day: did you know that when AI makes a mistake, it’s called a hallucination? Much like when we see things that aren’t there, the AI platform is ‘seeing’ patterns in information incorrectly, resulting in an inaccurate result. So, when using AI, always give its output the once-over to make sure it’s on the right track.

What does the future of compliance look like?

Well, we know for sure that more legislation is coming regarding Artificial Intelligence. With the rise of new technologies in healthcare, like online tracking, the Office for Civil Rights (OCR) will release new guidance. Artificial Intelligence is already on the government’s radar, with the Biden Administration unveiling an Executive Order on AI. Additionally, major healthcare organizations have committed to handling AI technology carefully, harnessing its potential while managing risks.

What can I do?

It’s a great, big, beautiful tomorrow when it comes to the future of healthcare technology. We’re all along for the ride on the Carousel of Progress (Disney fans, anyone?). Staying on top of the latest compliance updates is key to remaining compliant. That’s where Abyde can help. We make compliance easy, making it the easiest part of running your practice or business. As technology continues to improve, so should your compliance program. We turn the old binder in your practice or business into cloud-based software, making everything you need for compliance easily accessible.
To learn more about current compliance legislation, email us at info@abyde.com and schedule a consultation here for Covered Entities, and here for Business Associates.
ChatGPT & HIPAA: A Quick Guide on What You Need to Know
April 26, 2023

If you haven’t heard about ChatGPT over the last few months, you might still be Googling everything! ChatGPT launched in November 2022 and has taken the internet by storm. Developed by OpenAI using artificial intelligence (AI) technology, it can hold human-like conversations while giving you all the details of whatever you ask it. Now, we haven’t seen it make you dinner just yet. Still, it has successfully written computer programs, passed a series of different exams, and written entire feature-length articles. (Wow, I feel like a doting parent!)

AI language models are transforming how we approach everyday tasks and complete major projects, and the healthcare industry has even jumped on board the ChatGPT train. ChatGPT has assisted with scheduling appointments, treatment plan assistance, patient education, medical coding, and more! While this all sounds exciting and has the potential to improve patient care, protecting your patients’ data when using these kinds of tools is imperative and should be approached with caution.

So what are some of the red flags to be aware of when it comes to HIPAA compliance?

• At this time, OpenAI does not sign a Business Associate Agreement, so ChatGPT is not HIPAA compliant. HIPAA regulations require that covered entities only share PHI with vendors who have signed a BAA. This ensures that PHI is protected and that all parties comply with HIPAA laws and regulations. Before implementing any AI technology that processes or accesses PHI, covered entities must enter into a business associate agreement with the vendor of that technology.

• Protect PHI when using the chat platform. OpenAI warns against inputting confidential information into the platform. As with many technology platforms, ChatGPT collects information and reviews conversations to improve its systems and services. In other words, there is no telling where that data is being stored, and therefore it cannot be protected.
Because this platform is not HIPAA compliant, it’s super important to remember not to input any identifiable patient information. When working with PHI, de-identifying or anonymizing data is key to minimizing the risk of a data breach.

• Establish access controls and monitor chat logs. To minimize risk, access to chat logs should be restricted to those who need it as part of their job function. Don’t forget to keep written documentation of which employees can access chat logs, and be sure to revoke access when necessary. It’s also highly recommended that these chat logs be monitored and audited to ensure they do not contain any PHI.

• Establish policies and procedures and train employees. When implementing a new technology, such as ChatGPT, that potentially accesses PHI, policies and procedures must be put in place to ensure that all appropriate safeguards support the use of the new technology. Training employees on properly using the new technology is just as important. Training should cover security best practices, the importance of data privacy, and incident reporting steps if necessary.

• Create and implement an incident response policy. As with any security risk, having an incident response policy is essential to help mitigate risk in the case of a breach. This plan should include procedures for identifying and containing the incident, notifying affected individuals, and investigating the cause to prevent future incidents.

By proactively prioritizing patient privacy and security, healthcare organizations can greatly benefit from ChatGPT and other AI language models. Streamlining administrative work and improving patient outcomes? Sounds like a win-win. But it’s critical that you carefully balance increased efficiency against the elevated risks to patient data privacy. This is new for everyone, so consider holding off on drastic changes to your business just because of something ChatGPT can do.
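To make the de-identification tip above a little more concrete, here’s a minimal, hypothetical Python sketch of a scrubbing step that could run before any text leaves your practice for a chat platform. The patterns and the deidentify function are our own illustrative examples, not a complete HIPAA Safe Harbor implementation (Safe Harbor covers 18 categories of identifiers); a real practice would want a vetted de-identification tool and human review.

```python
import re

# Illustrative patterns only -- real de-identification needs far more
# coverage (names without titles, addresses, MRNs, and so on).
PATTERNS = {
    "[NAME]": re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.\s+[A-Z][a-z]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[DATE]": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def deidentify(text: str) -> str:
    """Replace common identifiers with placeholder tokens."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

note = "Dr. Smith saw the patient on 4/26/2023; call 555-123-4567."
print(deidentify(note))
# -> [NAME] saw the patient on [DATE]; call [PHONE].
```

The point of a step like this isn’t perfection; it’s making sure no raw identifiers are ever typed or pasted into a platform that hasn’t signed a BAA.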
Your patients still want human experiences, and that is something ChatGPT can’t take away from you and your staff!

How can you stay up to date on the latest compliance trends and news? Contact our compliance experts at Abyde today for guidance on this ever-changing technical landscape and see how we can help you be successful in the years ahead! To book a demo with one of our Abyde specialists, click here or call us at (800) 594-0883.