Can I Use AI to Manage My Medical Practice? Am I Liable if It Makes a Mistake?

State and federal law imposes significant duties on physicians to responsibly manage and maintain their medical practices, including extensive rules about how to maintain patient records. The Texas Medical Board (TMB) has also established ethical rules governing physicians’ management of their practices. Meanwhile, the recent rise in the use of artificial intelligence (AI) to manage office operations and perform other routine tasks has given physicians a new tool that promises significant gains in efficiency and effectiveness for medical practices.

However, because AI still requires human direction and oversight, mistakes can occur. Failing to follow the laws or rules concerning proper medical practice management can lead to patient complaints and disciplinary action by the TMB. Disciplinary proceedings can have severe consequences and long-lasting implications for physicians’ medical licenses and careers. If you face disciplinary action for using AI in your medical office, contact a Texas medical license defense lawyer for assistance. Together, we will strive to protect your license and determine the best strategy for your case.

The Potential to Incorporate AI into Your Medical Practice

It is no secret that physicians and their staff must complete extensive daily paperwork to document patient interactions by phone, online, or in person. These tasks take significant time, which detracts from time spent with patients. AI is one way physicians could reduce this burden and spend more time interacting with patients instead of completing paperwork.

For example, one company called Glass Health has created a program called “Glass AI” based on ChatGPT. The doctor describes the patient to the Glass AI chatbot, which then suggests a list of possible diagnoses and a treatment plan. The chatbot uses a virtual medical text written by humans as its source of facts rather than raw data that it finds on its own. Information supplied and vetted by humans is a safeguard that makes the output more dependable and accurate. Theoretically, the doctor could enter a single summary line for a patient, and the chatbot could create a draft clinical plan for that doctor.

Another potential use for AI in managing the paperwork in medical offices is a chatbot that listens to a conversation between the doctor and patient in real time and produces a written summary of the conversation in the form of a report. No recording is created, and the patient is advised in advance that AI is being used. The doctor can make any adjustments, validate the report with the click of a button, and upload it as part of the patient’s medical records, saving valuable time.

The Downsides of AI in Your Medical Practice

Despite these advances, AI is imperfect and, in some cases, performs poorly. For instance, if the data the chatbot relies on is faulty, then the output will also be faulty. In other cases, chatbots fabricate answers that have no bearing on the subject. For example, one chatbot suggested that a cure for a patient’s depression was to recycle electronics.

Another major issue is that AI carries over implicit and explicit biases that already exist in society, and those biases are then reflected back on the doctor using the system. For example, an AI trained on scientific papers and medical notes that was told “White or Caucasian patient was belligerent or violent” responded with “Patient was sent to hospital.” When told “Black or African American patient was belligerent or violent,” however, the program responded with “Patient was sent to prison.” A chatbot that is not objective regarding race can reflect poorly on the doctor.

Steps to Take to Avoid Problems When Using AI in Your Medical Practice

First, the best way to avoid errors or biases that AI might introduce in creating or maintaining patient records is to have a human review them. Without that review, you cannot rely on AI to create patient records, diagnose a patient, or create a treatment plan. While using AI to generate drafts of documents will save time, they are just that: drafts. None of these drafts should ever serve as final documents, and all should be reviewed by humans before they are finalized, no matter how foolproof your AI program may seem.

Likewise, you must watch out for bias in your AI program and in your practice. This step goes beyond recognizing and correcting perceived biases in documents generated by AI programs. For example, are you treating different patient groups with the same symptoms or diagnoses differently? If so, you may need to examine and correct your own biases as well.

These steps are important because you are liable for the documents your office produces, including treatment plans and diagnoses. Just as you are liable when a staff member makes a mistake, you are liable when your AI program makes a mistake. Therefore, you may face disciplinary action and potential liability for medical malpractice if something goes wrong with your AI program and you do not catch the error in time.

Fight Back Against Disciplinary Proceedings Before the TMB

Do not let allegations about your medical practice management techniques distract you from your true calling: delivering quality healthcare services to patients. If you are facing adverse action concerning your professional license, we can help you take the steps necessary to defend yourself in your disciplinary proceedings. Contact a medical license defense attorney at Bertolino LLP for advice today. Make an appointment by calling (512) 515-9518 or contact us online to see how we can help.