There is some argument that General Artificial Intelligence (GAI) has already been woven into the fabric of healthcare administration, certainly in the United States’ fragmented provider systems.  Third-party vendors that provide back-office services to private and public hospitals and doctors’ offices have already created and implemented GAI platforms that catalog and maintain data and provide algorithm-driven financial reporting tools.  This technology is a first step toward implementation of Artificial Intelligence (AI) across the broader range of patient services being discussed today.

The World Health Organization (WHO) recently pronounced that AI holds tremendous promise for the “delivery of healthcare and medicine worldwide,”[i] cementing the mission-critical nature of AI’s utility.  Given the steadily increasing financial cost of healthcare to patients, managed care organizations and taxpayers,[ii] AI promises to create efficiencies and reduce costs for the consumer and the federal government, the largest payor of provider bills.[iii]

For purposes of this discussion, AI refers to the theory and development of computer systems capable of performing tasks that historically required human intelligence, such as recognizing speech, making decisions, and identifying patterns. AI is an umbrella term that encompasses a wide variety of technologies, including machine learning, deep learning, and natural language processing (NLP).[iv]

Importantly, the machine learning that has developed from the GAI hospitals currently use allows AI to make decisions and identify administrative patterns, reducing the time spent on charting, billing and, in light of recent regulatory changes, compliance checks.

For example, the Centers for Medicare and Medicaid Services (CMS) recently finalized enhanced hospital price transparency requirements for 2024. With this update, hospital price transparency mandates become stricter, reinforcing the regulations established in 2021. Hospitals must now disclose charge information using a more prescriptive template.[v] Once AI has learned that template, it can immediately track patient billing to confirm it aligns with CMS regulations for a treatment or procedure.

The CMS Medicare billing form CMS-1450, the 837 Institutional format and the accompanying policy manual are available on CMS.gov.[vi]  In theory, machine learning AI technology can read the manual and notes from the federal government and implement a compliant submission process, all while remaining HIPAA compliant.  Once that is done, the AI can learn to anticipate and evolve with changes issued by CMS.  The resulting reduction in administrative costs and time would have an immediate, measurable impact.
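The kind of automated check described above can be sketched in a few lines of code. This is a minimal illustration only, not a real CMS integration: the CSV layout, column names, billing codes, and tolerance threshold are all hypothetical, and a production system would work against the actual CMS machine-readable template with HIPAA-safe data handling.

```python
# Minimal sketch: flag billed charges that diverge from a hospital's
# published standard-charges file (hypothetical CSV layout; real CMS
# machine-readable templates differ).
import csv
import io


def load_standard_charges(csv_text):
    """Map billing code -> published gross charge (assumed columns)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["code"]: float(row["gross_charge"]) for row in reader}


def flag_discrepancies(claims, standard, tolerance=0.01):
    """Return claims whose billed amount strays from the published charge,
    or whose code is missing from the published template entirely."""
    flagged = []
    for claim in claims:
        published = standard.get(claim["code"])
        if published is None or abs(claim["billed"] - published) > tolerance * published:
            flagged.append(claim)
    return flagged


# Example usage with made-up codes and amounts.
charges_csv = "code,gross_charge\n99213,150.00\n99214,225.00\n"
standard = load_standard_charges(charges_csv)
claims = [
    {"code": "99213", "billed": 150.00},  # matches the published charge
    {"code": "99214", "billed": 450.00},  # double the published charge
    {"code": "99999", "billed": 80.00},   # code absent from the template
]
print(flag_discrepancies(claims, standard))
```

In this sketch, only the second and third claims are flagged for review; the point is simply that once the template is machine-readable, conformance checks become trivial to automate.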

Where the implementation of AI becomes more controversial is in the patient care space, particularly when laid out like this: computer vision AI, which is already used in self-driving cars and other technology, initially screens a patient from home and leverages machine learning with access to every medical textbook, research study and image on the internet.  A decision can then be made about the next step of patient care.  Even in an emergency, with HIPAA-compliant disclosures obtained, an ambulance could use the technology to prepare an ER or OR for more precise treatment: predictive resource optimization.

Here is the catch-22: what about a hallucination? In a hallucination, AI generates something that is not grounded in the underlying data at all; it essentially makes something up in an effort to answer a query or prompt.[vii]  There have already been public instances of lawyers being sanctioned for using AI and citing fake cases,[viii] which, while bad for their clients, does not involve treatment of a health condition.  However, the professional standard of care for both lawyers and medical professionals requires attentiveness and the same knowledge and skill as a reasonable practitioner.  There should be a baseline expectation that professionals using AI remain actively involved in the briefing or diagnostic process.  It should also be noted that legal and medical malpractice happens without the use of AI.

A July 2023 Johns Hopkins study found that an estimated 795,000 Americans become permanently disabled or die annually across care settings because dangerous diseases are misdiagnosed.[ix] The research points to 15 commonly misdiagnosed health conditions (including the big three: vascular events, infection and cancer) that are responsible for over half of the annual deaths and severe disabilities – including brain damage, blindness and limb amputations – related to diagnostic errors.[x]

Thus, it is hard not to imagine a reduction in misdiagnoses with machine learning that can access and analyze research, historical case studies and other professional experience within seconds. Even with this prospect, there is warranted concern surrounding patient privacy.  Once queries are entered into an AI system, the information they contain may no longer be under the provider’s control. Any healthcare system considering the implementation of AI must therefore have a written policy.

A written AI policy should spell out compliance with codified data privacy statutes (e.g., the California Consumer Privacy Act and the federal Children’s Online Privacy Protection Act) as well as HIPAA.  Because HIPAA has been in force for more than 20 years, the contours of patient privacy protection are already well established.  The utility of AI for administrative efficiencies and streamlined patient diagnoses should not create an undue privacy-protection burden for providers.

Another common concern is the potential for a disruptive cyber event.  Cyber events must be reported under HIPAA[xi] and, with the growth of zero trust security expected in 2023, healthcare organizations must adopt a more proactive and preventive approach to cybersecurity that assumes no trust for any user, device or network.[xii] Zero trust security involves implementing multiple layers of security controls and verification mechanisms, such as multifactor authentication (MFA), encryption, segmentation, micro-perimeters, identity and access management (IAM), endpoint detection and response (EDR), and continuous monitoring.[xiii]
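As a concrete illustration of one of those layers, MFA commonly relies on time-based one-time passwords (TOTP, standardized in RFC 6238). The following is a minimal sketch using only Python's standard library; the shared secret and timestamps are illustrative, and a production deployment would use a vetted library with secure secret storage.

```python
# Minimal TOTP (RFC 6238) sketch illustrating the MFA layer of a
# zero-trust stack. Illustrative only: real systems should use a
# vetted library and protect the shared secret.
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a shared secret."""
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


def verify(secret: bytes, submitted: str, now=None) -> bool:
    """Accept a submitted code only for the current 30-second window."""
    return hmac.compare_digest(totp(secret, now), submitted)


# A code generated for a given moment verifies within the same window.
secret = b"demo-shared-secret"
code = totp(secret, for_time=1_700_000_000)
print(verify(secret, code, now=1_700_000_000))  # True
```

The design point is that possession of the shared secret plus the current time yields a short-lived second factor, so a stolen password alone is not enough to authenticate.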

Given the existing recognition of the technology and compliance pain points healthcare already faces, and the steps being taken to actively address them, AI’s utility as a partner in efficient administration and patient interface does not seem like a negative.  Rather, it appears to be a productive path forward that may have tremendous positive economic and, most importantly, human health outcomes.

References:
[i] WHO issues first global report on Artificial Intelligence (AI) in health and six guiding principles for its design and use
[ii] Consultants from Mercer, Aon, and Willis Towers Watson predict employer healthcare costs will increase as much as 8.5% in 2024. This increase is fueled by medical inflation, growing demand for weight loss drugs, and greater access to gene therapies, plus the continued effects of deferred care during the pandemic.  https://the-alliance.org/unveiling-5-trends-shaping-healthcare-in-2024/#:~:text=Consultants%20from%20Mercer%2C%20Aon%2C%20and%20Willis%20Towers%20Watson,continued%20effects%20of%20deferred%20care%20during%20the%20pandemic.
[iii] Medicare: 46% of the typical hospital’s volume, Medicaid: 21% of the typical hospital’s volume. Fact Sheet: Hospital Costs Explained | AHA
[iv] What Is Artificial Intelligence? Definition, Uses, and Types | Coursera
[v] HHS Notice of Benefit and Payment Parameters for 2024 Final Rule | CMS
[vi] 100-02 | CMS
[vii] What Are AI Hallucinations (and What to Do About Them) (kapwing.com)
[viii] New York lawyers sanctioned for using fake ChatGPT cases in legal brief | Reuters
[ix] Burden of serious harms from diagnostic error in the USA | BMJ Quality & Safety
[x] Id.
[xi] Omnibus HIPAA Rulemaking | HHS.gov
[xii] 10 key trends and statistics in healthcare cybersecurity for 2023 (virtelligence.com)
[xiii] Id.

This piece was originally released on Law360.

Meet the Author

Sarah Abrams, Head of Claims

Baleen Specialty, a division of Bowhead Specialty

Sarah Abrams is the Head of Claims at Baleen Specialty, a division of Bowhead Specialty.  She built the Baleen Claims department, and in her previous role as Head of Professional Liability Claims at Bowhead Specialty she oversaw the professional liability claims department’s handling of Directors and Officers, Management Liability, and Errors and Omissions claims. Sarah practiced law in Chicago, representing carriers, before moving in-house.  She has authored numerous articles and is a regular speaker at insurance and legal industry events.

 
